Learning the (Data) — Data-Driven Management, HBR Edition

The months of December and January are usually full of reviews of significant events and achievements during the previous twelve months. Harvard Business Review eases the search for some of the best writing on data-driven transformation by occasionally collecting, through its OnPoint magazine, the best articles on a critical subject of interest to professionals into a single volume. It is worth making part of your permanent data management library.

The volume begins with a very concise article by Thomas C. Redman with the provocative title “Does Your Company Know What to Do with All Its Data?” He goes on to list seven takeaways for optimizing the use of existing data that include many of the themes I have written about in this blog: better decision-making, innovation, what he calls “informationalize products”, and other significant effects. Most importantly, he refers to the situation of information asymmetry and how it provides companies and organizations with a strategic advantage that directly affects the bottom line–whether in negotiations with peers, contractual relationships, or market advantages. Aside from the OnPoint article, he also has some important things to say about corporate data quality. Highly recommended, and a good reason to implement systems that assure the fidelity of internal information systems.

Edd Wilder-James also covers a theme that I have hammered home in a number of blog posts in the article “Breaking Down Data Silos.” The issue here is access to data and the manner in which it is captured and transformed into usable analytics. His recommended approach to an often daunting task is to follow the path of least resistance: look for the opportunities to break down silos and free up data so that advanced analytics can be applied. The article provides a necessary balm that counteracts the hype that often accompanies this topic.

Both of these articles are good entrees to the subject and perfectly positioned to prompt both thought and reflection on similar experiences. In my own day job I provide products that specifically address these business needs. Yet executives and management in all too many cases remain unaware of the economic advantages of data optimization, or of the manner in which continuing to support data silos limits their ability to effectively manage their organizations. There is no doubt that things are changing, and each day brings a new set of clients who are feeling their way in this new data-driven world, aware that the promises of almost effort-free goodness and light made by highly publicized data gurus do not match the reality of practitioners, who do the detail work of data normalization and rationalization. At the end it looks like magic, but effort must be expended up-front to get to that state. In this physical universe, under the Second Law of Thermodynamics, there are no free lunches–energy must be borrowed from elsewhere in order to perform work. We can minimize these efforts through learning and the application of new technology, but managers cannot pretend that they do not need to understand the data they intend to use to make business decisions.

All of the longer form articles are excellent, but I am particularly impressed with the Leandro DalleMule and Thomas H. Davenport article entitled “What’s Your Data Strategy?” from the May-June 2017 issue of HBR. When big data comes up at professional conferences and in visits to businesses, the topic often runs to the manner of handling the bulk of non-structured data. But as the article notes, less than half of an organization’s relevant structured data is actually used in decision-making. The most useful artifact, which I have permanently plastered at my workplace, is the graphic “The Elements of Data Strategy”, and I strongly recommend that any manager concerned with leveraging new technology to optimize data do the same. The graphic illuminates the defensive and offensive positions inherent in a cohesive data strategy, leading an organization to the state the authors describe: “In our experience, a more flexible and realistic approach to data and information architectures involves both a single source of truth (SSOT) and multiple versions of the truth (MVOTs). The SSOT works at the data level; MVOTs support the management of information.” Eliminating proprietary data silos, eliminating redundant data streams, and warehousing data so that it can be accessed by a number of analytical methods achieve the SSOT state that provides the basis for an environment supporting MVOTs.
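
As a minimal sketch of the SSOT/MVOT distinction (my own illustration, with made-up names and numbers, not an example from DalleMule and Davenport), a single governed record set can feed several purpose-built views without spawning new silos:

```python
# Hypothetical illustration of SSOT vs. MVOTs: one controlled record set feeds
# multiple purpose-built "versions of the truth" for different functions.

ssot_customers = [  # single source of truth: one de-duplicated record per customer
    {"id": 1, "name": "Acme Corp", "region": "EMEA", "annual_revenue": 1_200_000, "active": True},
    {"id": 2, "name": "Globex",    "region": "NA",   "annual_revenue":   450_000, "active": False},
    {"id": 3, "name": "Initech",   "region": "NA",   "annual_revenue":   980_000, "active": True},
]

# MVOT for finance: revenue by region, active accounts only
finance_view = {}
for c in ssot_customers:
    if c["active"]:
        finance_view[c["region"]] = finance_view.get(c["region"], 0) + c["annual_revenue"]

# MVOT for marketing: simple account tiering for campaign targeting
marketing_view = [
    {"name": c["name"], "tier": "key" if c["annual_revenue"] > 1_000_000 else "growth"}
    for c in ssot_customers
]

print(finance_view)    # {'EMEA': 1200000, 'NA': 980000}
print(marketing_view)
```

The point of the sketch is simply that the derived views never modify the governed records; they are regenerated from them, which is what keeps multiple versions of the truth from degenerating into multiple sources of it.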

The article “Why IT Fumbles Analytics” by Donald A. Marchand and Joe Peppard from 2013 still rings true today. As with the Wilder-James article cited above, the emphasis here is on the work necessary to ensure that new data and analytical capabilities succeed, but the focus shifts to “figuring out how to use the information (the new system) generates to make better decisions or gain deeper…insights into key aspects of the business.” The heart of managing the effort in providing this capability is to put into place a project organization, as well as systems and procedures, that will support the organizational transformation that occurs as a result of the explosion of new analytical capability.

The days of simply buying an off-the-shelf, silo-ed “tool” and automating a specific manual function are over, especially for organizations that wish to be effective and competitive–and more profitable–in today’s data and analytical environment. A more comprehensive and collaborative approach is necessary. As with the DalleMule and Davenport article, there is a very useful graphic that contrasts traditional IT project approaches with Analytics and Big Data (or perhaps “Bigger” Data) projects. Though the prescriptions in the article assume an earlier concept of Big Data optimization focused on non-structured data, making some of them overkill, an implementation plan is essential in supporting the kind of transformation that will occur, and managers act at their own risk if they fail to take this effect into account.

All of the other articles in this OnPoint issue are of value. The bottom line, as I have written in the past, is to keep the focus on solving business challenges rather than buying the new bright shiny object. At the same time, in today’s business environment the days when business decision-makers can afford to stay within their silo-ed comfort zones are ending very quickly, so they need to shift their attention to those solutions that address these new realities.

So why do this, apart from the fancy term “data optimization”? Because there is a direct return on investment in transforming organizations and systems into data-driven ones. At the end of the day the economics win out. Thus, our organizations must be prepared to support, and have a plan in place to address, the core effects of new data-analytics and Big Data technology:

a. The management and organizational transformation that takes place when deploying the new technology, requiring proactive socialization of the changing environment, the teaching of new skill sets, and new ways of working and doing business.

b. Supporting the transformation from a sub-optimized, silo-ed “tell me what I need to know” work environment to a learning environment driven by what the data indicates, supporting the skills cited above: intellectual curiosity, engaging domain expertise, and building cross-domain competencies.

c. A plan that teaches the organization how best to use the new capability through a practical, hands-on approach focused on addressing specific business challenges.

The Revolution Will Not Be Televised — The Sustainability Manifesto for Projects

While doing stuff and living life (which seems to take me away from writing) there were a good many interesting things written on project management.  The very insightful Dave Gordon at his blog, The Practicing IT Project Manager, provides a useful weekly list of the latest contributions to the literature that are of note.  If you haven’t checked it out please do so–I recommend it highly.

While I was away Dave posted an interesting link on the concept of sustainability in project management.  Along those lines three PM professionals have proposed a Sustainability Manifesto for Projects.  As Dave points out in his own post on the topic, it rests on three basic principles:

  • Benefits realization over metrics limited to time, scope, and cost
  • Value for many over value of money
  • The long-term impact of our projects over their immediate results

These are worthy goals and no one needs to have me rain on their parade.  I would like to see these ethical principles, which is what they really are, incorporated into how we all conduct ourselves in business.  But then there is reality–the “is” over the “ought.”

For example, Dave and I have had some correspondence regarding the nature of the marketplace in which we operate through this blog.  Some time ago I wrote a series of posts here, here, and here providing an analysis of the markets in which we operate both in macroeconomic and microeconomic terms.

This came in response to one of my colleagues making the counterfactual assertion that we operate in a “free market” based on the concept of “private enterprise.”  Apparently, such just-so stories are lies we have to tell ourselves to make the hypocrisy of daily life bearable.  But, to bring the point home, in talking about the concept of sustainability, what concrete measures will the authors of the manifesto bring to the table to counter the financialization of American business that has occurred over the past 35 years?

For example, the news lately has been replete with stories of companies moving plants from the United States to Mexico.  This despite rising and record corporate profits during a period of stagnating median working class incomes.  Free trade and globalization have been cited as the cause, but this involves more hand waving and the invocation of mantras than analysis.  There have also been the predictable invocations of the Ayn Randian cult and the pseudoscience* of Social Darwinism.  Those on the opposite side of the debate characterize things as a morality play, with the public good versus greed being the main issue.  All of these explanations miss their mark, some more than others.

An article setting aside a few myths, “Make elites compete: Why the 1% earn so much and what to do about it”, was recently published by Jonathan Rothwell at Brookings and came to me via Mark Thoma’s blog.  Rothwell looks at the relative gains of the market over the last 40 years and finds that corporate profits, while doing well, have not been the driver of inequality that Robert Reich and other economists would have them be.  Looking at another myth, one promulgated by Greg Mankiw, he finds that the reward for one’s labors is not related to any special intelligence or skill.  On the contrary, one’s entry into the 1% is actually related to which industry one chooses to enter, regardless of all other factors.  This disparity is known as a “pay premium”.  As expected, petroleum and coal products, financial instruments, financial institutions, and lawyers are at the top of the pay premium.  What is not, against all expectations of popular culture and popular economic writing, is the IT industry–hardware, software, etc.  Though they are the poster children of new technology, Bill Gates, Mark Zuckerberg, and others are the exception to the rule in an industry that is marked by a 90% failure rate.  Our most educated and talented people–those in science, engineering, the arts, and academia–are poorly paid, with negative pay premiums associated with their vocations.

The financialization of the economy is not a new or unnoticed phenomenon.  Kevin Phillips, in Wealth and Democracy, which was written in 2003, noted this trend.  There have been others.  What has not happened as a result is a national discussion on what to do about it, particularly in defining the term “sustainability”.

For those of us who have worked in the acquisition community, the practical impact of financialization and de-industrialization has made logistics challenging to say the least.  As a young contract negotiator and Navy Contracting Officer, I was challenged to support the fleet whenever any kind of fabrication or production was involved, especially for non-stocked machined spares of any significant complexity or size.  Oftentimes my search would find that the company that manufactured the items was out of business, its pieces sold off during Chapter 11, and what production work was still available for those items done seasonally and out of country.  My “out” at the time–during the height of the Cold War–was to take the technical specs, which were paid for and therefore owned by the government, to one of the Navy industrial activities for fabrication and production.  The skillset for such work was still fairly widespread, supported by the quality control provided by a fairly well-unionized and trade-based workforce–especially among machinists and other skilled workers.

Given the new and unique ways judges and lawyers have applied privatized IP law to items financed by the public, such opportunities to support our public institutions and infrastructure as I once had have been largely closed out.  Furthermore, the places to send such work, where it is still possible, have grown vanishingly few.  Perhaps digital printing will be the savior for manufacturing that it is touted to be.  What it will not do is stitch back the social fabric that has been ripped apart in communities hollowed out by the loss of their economic base, which, when replaced, comes with lowered expectations and quality of life–and often shortened lives.

In the end, though, such “fixes” benefit a shrinking few individuals at the expense of the democratic enterprise.  Capitalism did not exist when the country was formed, despite the assertion of polemicists seeking to link the economic system to our democratic government.  Smith did not write his pre-modern scientific tract until 1776, much of what it meant lay years off in the future, and its relevance, given what we’ve learned over the last 240 years about human nature and our world, is up for debate.  What was not part of such a discussion back then–and would not have been understood–was the concept of sustainability.  Sustainability in the study of healthy ecosystems usually involves the maintenance of great diversity and the flourishing of life that denotes health.  This is science.  Economics, despite Keynes and others, is still largely rooted in 18th and 19th century pseudoscience.

I know of no fix or commitment to a sustainability manifesto that includes global, environmental, and social sustainability that makes this possible, short of a major intellectual, social, or political movement willing to make a long-term commitment to incremental, achievable goals toward that ultimate end.  Otherwise it’s just the mental equivalent of camping out in Zuccotti Park.  The anger we note around us during this election year of 2016 (our year of discontent) is a natural human reaction to the end of an idea that has outlived its explanatory power and, therefore, its usefulness.  Which way shall we lurch?

The Sustainability Manifesto for Projects, then, is a modest proposal.  It may also simply be a sign of the times, albeit a rational one.  As such, it leaves open a lot of questions, and most of these questions cannot be addressed or determined by the people to whom it is targeted: project managers, who are usually simply employees of a larger enterprise.  People behave as they are treated, responding to the incentives and disincentives presented to them, which are oftentimes not completely apparent at the conscious level.  Thus, I’m not sure this manifesto hits its mark, or even the right one.

*This term is often misunderstood by non-scientists.  Pseudoscience means non-science, just as alternative medicine means non-medicine.  If any of the various hypotheses of pseudoscience are found true, given proper vetting and methodology, that proposition would simply be called science.  Just as alternative methods of treatment, if found effective and consistent, given proper controls, would simply be called medicine.

Three’s a Crowd — The Nash Equilibrium, Computer Science, and Economics (and what it means for Project Management theory)

Over the last couple of weeks my reading turned up an interesting article via Brad DeLong, who picked it up from Larry Hardesty at MIT News.  First a little background devoted to defining terms.  The Nash equilibrium is a part of Game Theory that measures how and why people make choices in social networks.  As defined in this Columbia University paper:

A game (in strategic or normal form) consists of the following three elements: a set of players, a set of actions (or pure-strategies) available to each player, and a payoff (or utility) function for each player. The payoff functions represent each player’s preferences over action profiles, where an action profile is simply a list of actions, one for each player. A pure-strategy Nash equilibrium is an action profile with the property that no single player can obtain a higher payoff by deviating unilaterally from this profile.
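
A minimal sketch of that definition, using the familiar Prisoner's Dilemma as the example (my own illustration, not drawn from the paper): enumerate the action profiles and keep those from which no player gains by deviating unilaterally.

```python
# Hypothetical two-player game (Prisoner's Dilemma) used to illustrate the definition above.
# An action profile is a Nash equilibrium if neither player can gain by deviating unilaterally.

actions = ["cooperate", "defect"]

# payoffs[(row_action, col_action)] = (row_player_payoff, col_player_payoff)
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

def is_nash(row, col):
    row_payoff, col_payoff = payoffs[(row, col)]
    # Can the row player do better by switching its own action?
    if any(payoffs[(alt, col)][0] > row_payoff for alt in actions):
        return False
    # Can the column player do better by switching its own action?
    if any(payoffs[(row, alt)][1] > col_payoff for alt in actions):
        return False
    return True

equilibria = [(r, c) for r in actions for c in actions if is_nash(r, c)]
print(equilibria)  # [('defect', 'defect')] -- the only pure-strategy Nash equilibrium
```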

John Von Neumann developed Game Theory to measure, in a mathematical model, the dynamics of conflict and cooperation between intelligent, rational decision-makers in a system.  All social systems can be measured by the application of Game Theory models.  But as with all mathematical modeling, there are limitations to what can be determined.  Unlike science, mathematics can only measure and model what we observe, but it can provide insights that would otherwise go unnoticed.  As such, Von Neumann’s work (along with that of Oskar Morgenstern and Leonid Kantorovich) in this area has become the cornerstone of mathematical economics.

When dealing with two players in a game, a number of models have been developed to explain the behavior that is observed.  For example, most familiar to us are zero-sum games and tit-for-tat games.  Many of us in business, diplomacy, the military profession, and old-fashioned office politics have come upon such strategies in day-to-day life.  From the article in MIT News that describes the latest work of Constantinos Daskalakis, an assistant professor in MIT’s Computer Science and Artificial Intelligence Laboratory:

In the real world, competitors in a market or drivers on a highway don’t (usually) calculate the Nash equilibria for their particular games and then adopt the resulting strategies. Rather, they tend to calculate the strategies that will maximize their own outcomes given the current state of play. But if one player shifts strategies, the other players will shift strategies in response, which will drive the first player to shift strategies again, and so on. This kind of feedback will eventually converge toward equilibrium:…The argument has some empirical support. Approximations of the Nash equilibrium for two-player poker have been calculated, and professional poker players tend to adhere to them — particularly if they’ve read any of the many books or articles on game theory’s implications for poker.

Anyone who has engaged in two-player games, from card games to chess, can intuitively understand this insight.  But in modeling behavior, when a third player is added to the mix, the mathematics of describing market or system behavior becomes “intractable.”  That is, all of the computing power in the world cannot calculate the Nash equilibrium.
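
Here is a small sketch of the feedback loop described in the quote above, assuming a simple 2x2 coordination game of my own invention: each player repeatedly plays a best response to the other's current action, and play settles into an equilibrium.

```python
# Best-response dynamics in a hypothetical 2x2 coordination game.
# Players update in turn (simultaneous updates can cycle in games like this one).

actions = [0, 1]
# payoff[(row_action, col_action)] = (row_player_payoff, col_player_payoff)
payoff = {(0, 0): (2, 2), (0, 1): (0, 0), (1, 0): (0, 0), (1, 1): (3, 3)}

def best_response(opponent_action, player):
    """Action that maximizes the player's payoff against the opponent's current action."""
    if player == 0:  # row player
        return max(actions, key=lambda a: payoff[(a, opponent_action)][0])
    return max(actions, key=lambda a: payoff[(opponent_action, a)][1])  # column player

row, col = 0, 1  # start out miscoordinated
for _ in range(10):
    row = best_response(col, 0)  # row player reacts to the current column action
    col = best_response(row, 1)  # column player reacts to the updated row action

print((row, col))  # (1, 1): play converges to the higher-payoff equilibrium
```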

Part of this issue is the age-old paradox, put in plain language, that everything that was hard to do for the first time in the past is easy to do and verify today.  This includes everything from flying aircraft to dealing with quantum physics.  In computing and modeling, the observation is that many problems that are hard to solve require far fewer resources to verify once a candidate solution is in hand.  This is the question at the heart of the P versus NP problem.
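
A small illustration of that search-versus-verification gap, using subset sum, a classic NP-complete problem (my own example): checking a proposed answer takes one pass over it, while finding one by brute force means examining up to 2^n subsets.

```python
from itertools import combinations

# Subset sum: given numbers and a target, is there a subset that sums to the target?
numbers = [14, 7, 29, 3, 18, 41, 6, 25, 11, 33]
target = 62

def verify(candidate, target):
    # Verification is cheap: one pass over the proposed subset.
    return sum(candidate) == target

def brute_force_search(numbers, target):
    # Search is expensive: in the worst case, every one of the 2^n subsets is examined.
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if verify(subset, target):
                return subset
    return None

solution = brute_force_search(numbers, target)
print(solution, verify(solution, target))  # e.g., (29, 33) True
```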

We deal with this distinction all the time when developing software applications and dealing with ever larger sets of data.  For example, I attended a meeting recently where a major concern among the audience was the question of scalability, especially in dealing with large sets of data.  In the past, “scalability” to the software publisher simply meant the ability of the application to be used by a large set of users via some form of distributed processing (client-server, shared services, desktop virtualization, or a browser-based deployment).  But with the introduction of KDD (knowledge discovery in databases), scalability now also addresses the ability of technologies to derive importance from the data itself, outside the confines of a hard-coded application.

The search for optimal polynomial-time algorithms to reduce the running time of computation-intensive problems forces the developer to find the solution (the proof of NP-completeness) in advance and then work toward the middle in developing the appropriate algorithm.  This should not be a surprise.  In breaking Enigma during World War II, Bletchley Park first identified regularities in the messages that the German high command was sending out.  This then allowed them to work backwards and forwards in calculating how the encryption could be broken.  The same applies to any set of mundane data, regardless of size, which, unlike Enigma traffic, is not trying hard to avoid being deciphered.  While we may be faced with a Repository of Babel, it is one that badly wants to be understood.

While intuitively the Nash equilibrium does exist, its mathematically intractable character has demanded that new languages and approaches to solving it be developed.  Daskalakis has proposed three routes:

  1. “One is to say, we know that there exist games that are hard, but maybe most of them are not hard.  In that case you can seek to identify classes of games that are easy, that are tractable.”
  2. Find mathematical models other than Nash equilibria to characterize markets — “models that describe transition states on the way to equilibrium, for example, or other types of equilibria that aren’t so hard to calculate.”
  3. Approximation of the Nash equilibrium: an approximation “where the players’ strategies are almost the best responses to their opponents’ strategies” might not be intractable, and “in those cases, the approximate equilibrium could turn out to describe the behavior of real-world systems.”

This is the basic engineering approach to any complex problem (and a familiar approach to anyone schooled in project management):  break the system down into smaller pieces to solve.

So what does all of this mean for the discipline of project management?  In modeling complex systems behavior for predictive purposes, our approach must correspondingly break down the elements of systems behavior into their constituent parts, but then integrate them in such a way as to derive significance.  The key to this lies in the availability of data and our ability to process it using methods that go beyond trending data for individual variables.


The Monster Mash — Zombie Ideas in Project and Information Management

Just completed a number of meetings and discussions among thought leaders in the area of complex project management this week, and I was struck by a number of zombie ideas in project management, especially related to information, that just won’t die.  The use of the term zombie idea is usually attributed to the Nobel economist Paul Krugman from his excellent and highly engaging (as well as brutally honest) posts at the New York Times, but for those not familiar, a zombie idea is “a proposition that has been thoroughly refuted by analysis and evidence, and should be dead — but won’t stay dead because it serves a political purpose, appeals to prejudices, or both.”

The point, to a techie–or anyone committed to intellectual honesty–is that these ideas are often posed in the form of question begging; that is, they advance invalid assumptions in the asking or the telling.  Most often they take the form of the assertive half of the same coin as “when did you stop beating your wife?”-type questions.  I’ve compiled a few of these for this post, and it is important to understand the purpose for doing so.  It is not to take individuals to task or to bash non-techies–who have a valid reason to ask basic questions based on what they’ve heard–but to challenge propositions put forth by people who should know better based on their technical expertise or experience.  Furthermore, knowing and understanding technology and its economics is essential today for anyone operating in the project management domain.

So here are a few zombies that seem to be most common:

a.  More data equals greater expense.  I dealt with this issue in more depth in a previous post, but it’s worth repeating here:  “When we inform Moore’s Law by Landauer’s Principle, that is, that the energy expended in each additional bit of computation becomes vanishingly small, it becomes clear that the difference in cost in transferring a MB of data as opposed to a KB of data is virtually TSTM (“too small to measure”).”  The real reason why we continue to deal with this assertion is both political in nature and rooted in social human interaction.  People hate oversight and they hate to be micromanaged, especially to the point of disrupting the work at hand.  We see behavior, especially in regulatory and contractual relationships, where the reporting entity plays the game of “hiding the button.”  This behavior is usually justified by pointing to examples of dysfunction, particularly on the part of the checker, where information submissions lead to the abuse of discretion in oversight and management.  Needless to say, while such abuse does occur, no one has yet pointed quantitatively to data (as opposed to anecdote) showing how often this happens.

I would hazard a guess that virtually anyone with some experience has had to work for a bad boss, where every detail and nuance is microscopically interrogated to the point where it becomes hard to make progress on the task at hand.  Such individuals, advanced under the Peter Principle, must, no doubt, be removed from such positions.  But this can happen in any organization, whether in private enterprise–especially in places where there is no oversight, no checks and balances, no means of appeal, and no accountability–or in government, and it is irrelevant to the assertion.  The expense item being described is bad management, not excess data.  Thus, such assertions are based on the antecedent assumption of bad management, which goes hand-in-hand with…

b. More information is the enemy of efficiency.  This is the other half of the economic argument that more data equals greater expense.  I should add that where the conflict has been engaged over these issues, some unjustifiable figure is usually given for the cost of the additional data, one certainly not supported by the high-tech economics cited above.  Another aspect of both of these perspectives comes from the conception among non-techies that more data and information require pre-digital levels of effort, especially when they picture the work that often went into human-readable reports.  This is really an argument in support of the assertion that it is time to shift the focus from fixed report formatting functionality in software based on limited data to complete data, which can be formatted and processed as necessary.  If the right and sufficient information is provided up-front, then the additional questions and interrogatories that demand supplemental data and information–with the attendant multiplication of data streams and data islands that truly do add cost and drive inefficiency–are at least significantly reduced, if not eliminated.

c.  Data size adds unmanageable complexity.  This was actually put forth by another software professional–and no doubt the non-techies in the room would have nodded their heads in agreement (particularly given a and b above) if opposing expert opinion hadn’t been offered.  Without putting too fine a point on it, a techie saying this in an open forum is equivalent to whining that your job is too hard.  This will get you ridiculed at development forums, where you will be viewed as an insufferable dilettante.  Digitized technology for well over 40 years has been operating under the phenomenon of Moore’s Law.  Under this law, computational and media storage capability doubles roughly every two years under the original definition, though in practice that cycle has run somewhere between 12 and 24 months.  Thus, what was considered big data in, say, 1997, when NASA first coined the term, is not considered big data today.  No doubt, what is considered big data this year will not be considered big data two years from now.  Thus, the term itself is relative and may very well become archaic.  The manner in which data is managed–its rationalization and normalization–is important in successfully translating disparate data sources, but the assertion that big is scary is simply fear mongering because you don’t have the goods.

d.  Big data requires more expensive and sophisticated approaches.  This flows from item c above and is often self-serving.  Scare stories abound, often using big numbers that sound scary.  All data that has a common use across domains has to be rationalized at some point if it comes from disparate sources, and there are a number of efficient software techniques for accomplishing this.  Furthermore, support for agnostic APIs and common industry standards, such as the UN/CEFACT XML, takes much of the rationalization and normalization work out of a manual process.  Yet I have consistently seen suboptimized methods being put forth that essentially require an army of data scientists and coders to engage in brute force data mining–a methodology that has been around for almost 30 years, except that now it carries with it the moniker of big data.  Needless to say, this approach is probably the most expensive and slowest out there.  But then, the motivation for its use by IT shops is usually rooted in rice bowl and resource politics.  This is flimflam–an attempt to revive an old zombie under a new name.  When faced with such assertions, see Moore’s Law and keep on looking for the right answer.  It’s out there.

e.  Performance management and assessment is an unnecessary “regulatory” expense.  This one keeps coming up as part of a broader political agenda beyond just project management.  I’ve discussed in detail the issues of materiality and prescriptiveness in regulatory regimes here and here, and have addressed the obvious legitimacy of organizations establishing such regimes in fiduciary, contractual, and governmental environments.

My usual response to the assertion of expense is to simply point to the unregulated derivatives market largely responsible for the financial collapse, and the resulting deep economic recession that followed once the housing bubble burst.  (And, aside from the cost of human suffering and joblessness, the expenses related to TARP.)  So much for how well the deregulation of banking went.  Even after the Band-Aid of Dodd-Frank, the situation probably requires a bit more vigor, and should include the ratings agencies as well as the real estate market.  But here is the fact of the matter: such expenses cannot be monetized as additive because “regulatory” expenses usually represent an assessment of the day-to-day documentation, systems, and procedures required when performing normal business operations and due diligence in management.  I attended an excellent presentation last week where the speaker, tasked with finding unnecessary regulatory expenses, admitted as much.

Thus, what we are really talking about is an expense that is an essential prerequisite to entry into a particular vertical, especially where monopsony exists as a result of government action.  Moral hazard, then, is defined by the inherent risk assumed by contract type, and should be assessed on those terms.  Given that the current trend is to raise thresholds, the question is going to be–in the government sphere–whether public opinion will be as forgiving in a situation where moral hazard assumes $100M in risk when things head south, as they regularly do in project management.  The way to reduce that moral hazard is through sufficiency of submitted data.  Thus, we return to my points in a and b above.

f.  Effective project assessment can be performed using high level data.  It appears that this view has its origins in both self-interest and a type of anti-intellectualism/anti-empiricism.

In the former case, the bias is usually based on the limitations of either individuals or the selected technology in providing sufficient information.  In the latter case, the argument results in a tautology that reinforces the fallacy that absence of evidence proves evidence of absence.  Here is how I have heard the justification for this assertion: identifying emerging trends in a project does not require that either trending or lower level data be assessed.  The projects in question are very high dollar value, complex projects.

Yes, I have represented this view correctly.  Aside from questions of competency, I think the fallacy here is self-evident.  Study after study (sadly not all online, but performed within OSD at PARCA and IDA over the last three years) has demonstrated that high level data averages out and masks indicators of risk manifestation, which could have been detected by looking at data at the appropriate level: the intersection of work and assigned resources.  In plain language, this requires integration of the cost and schedule systems, with risk first being noted through consecutive schedule performance slips.  When combined with technical performance measures, and effective identification of qualitative and quantitative risk tied to schedule activities, the early warning comes two to three months (and sometimes more) before the risk is reflected in the cost measurement systems.  You’re not going to do this with an Excel spreadsheet.  But, for reference, see my post Excel is not a Project Management Solution.
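
As a minimal sketch of that kind of early warning (my own hypothetical data and threshold, not the OSD/PARCA methodology), flag any schedule activity whose forecast finish date slips in consecutive reporting periods:

```python
# Hypothetical early-warning check: flag activities whose forecast finish dates slip
# in consecutive status periods. Data and threshold are illustrative only.

from datetime import date

# Forecast finish dates reported for each activity, one per monthly status period.
forecasts = {
    "integrate_guidance_unit": [date(2017, 3, 1), date(2017, 3, 15), date(2017, 4, 2), date(2017, 4, 20)],
    "software_build_3":        [date(2017, 5, 1), date(2017, 5, 1),  date(2017, 4, 28), date(2017, 5, 1)],
}

CONSECUTIVE_SLIPS_THRESHOLD = 2  # two slips in a row triggers a risk flag

def consecutive_slips(dates):
    """Return the longest run of back-to-back periods in which the forecast slipped."""
    longest = run = 0
    for earlier, later in zip(dates, dates[1:]):
        run = run + 1 if later > earlier else 0
        longest = max(longest, run)
    return longest

for activity, dates in forecasts.items():
    if consecutive_slips(dates) >= CONSECUTIVE_SLIPS_THRESHOLD:
        print(f"RISK FLAG: {activity} slipped {consecutive_slips(dates)} periods in a row")
```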

It’s time to kill the zombies with facts–and to behead them once and for all.

A Little Bit Moore — Moore’s Law and Public Sector Economics

Back in the saddle, and I just have to find the time to put some thoughts down.  In dealing with high tech and data issues, one of the most frequent counterfactuals that I have been running into lately regards the “cost” associated with data, especially data submissions.  If one approaches this issue using standard, pre-high-tech economic theory, then the positive correlation between data volume and cost applies.

But we live in a different world now folks.

I find myself increasingly having to explain that, given that most data concerning a business already exists in some form other than human-readable form, the classical economic correlation no longer applies.  This assertion is usually met with a blank stare.

Thankfully there is a concept, proven correct over time, that supports this assertion.  It started out as an observation but now has a great deal of credibility attached to it.  It is Moore’s Law.  For a pretty complete overview the Wikipedia entry works fine.  When we inform Moore’s Law by Landauer’s Principle, that is, that the energy expended in each additional bit of computation becomes vanishingly small, it becomes clear that the difference in cost in transferring a MB of data as opposed to a KB of data is virtually TSTM (“too small to measure”).
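
A back-of-the-envelope illustration of that point, using a deliberately generous and entirely assumed commodity price for data transfer: the difference between submitting a kilobyte and a megabyte is a rounding error.

```python
# Rough, hypothetical arithmetic: the incremental cost of moving more data is negligible.
# The per-gigabyte price below is an assumed round number, used only for scale.

COST_PER_GB = 0.10           # dollars per gigabyte transferred (assumption)
BYTES_PER_GB = 1_000_000_000

def transfer_cost(num_bytes):
    return num_bytes / BYTES_PER_GB * COST_PER_GB

kb_submission = transfer_cost(1_000)       # a 1 KB summary report
mb_submission = transfer_cost(1_000_000)   # a 1 MB full data submission

print(f"1 KB: ${kb_submission:.10f}")      # $0.0000001000
print(f"1 MB: ${mb_submission:.10f}")      # $0.0001000000
print(f"difference: ${mb_submission - kb_submission:.7f}")  # about a hundredth of a cent
```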

The challenge that we have had to face in leveraging data lies not in the cost associated with the data itself, but in two other factors: human intervention and software application limitations.  Both of these can be ameliorated, and incentivized toward taking full advantage of the greater computational and media storage capabilities that appear every 12 to 24 months, through the application of public sector economics.

This is an area that is ripe for contributions to the literature, if only because most of freshwater classical economics has been blind to it for ideological reasons.  Our emphasis on free market fundamentalism has caused otherwise rational people to artificially draw a solid line between the effects of public policy and expenditure and those of private market activity.  But it is not that neat and easy.  Each affects and influences the other.

The proof of this assertion can be found in practical form on K Street in Washington, D.C., for anyone interested.  On the macroeconomic front, practical experience since the 2007-08 crash has been a live course in the effectiveness of neo-Keynesian modeling.  Furthermore, in this blog I have documented technologies that were developed in the public sector, transitioned to the public domain, and went on to create significant private industry market verticals, including the IT sector as it currently exists.

The area in which I operate most often, the Aerospace & Defense market vertical focused on project management, is where this connection is most strongly seen and felt.  For this reason there have been a number of initiatives going back to the 1960s to incentivize this sector to strengthen the interaction between public and private resources in order to leverage technological, process, and other developments and optimize performance.

Full disclosure on my part: I have been an active participant in some of these successive initiatives over the years: first, during the round of Acquisition Reform under Defense Secretary Caspar Weinberger in the 1980s, and again in the late 1990s under Vice President Al Gore’s “reinventing government”, so I come to this subject with a bit of history and practical experience.  I am also currently engaged on some issues related to Undersecretary of Defense for Acquisition Frank Kendall’s Better Buying Power initiative.

The danger in highly structured public sector industries is that “good enough” often rules the day.  Incentives exist to play it safe and to avoid risk whenever possible.  This creates an atmosphere of complacency.  From the IT perspective, such concepts as Moore’s Law, which has been around for quite a while, tend to be seen as suspect.  Thus, economic incentives must be established through policymakers to replicate the pressure for performance and excellence found in more competitive markets.

The way of doing this is to provide economic incentives that punish complacency and that force participants to encompass the latest developments in technology.  Keep in mind that we’re talking about an 18 to 24 month cycle.  The complaint, of course, is that this is too rapid a change.  Then again, how many Boeing 707s do you still find flying for commercial airlines these days?  So the ground rules must be established to encourage a race to the top instead of a race to the status quo.

Welcome to the Hotel (Euro) — You Can Vote "Oxi" Any Time You Like but You Can Never Leave

The recent events in Greece and their ramifications for the European project have been the subject of a good many blogs and news articles lately.  From an economic perspective the most noteworthy are those by Paul Krugman, Brad DeLong, Dean Baker, Yanis Varoufakis, who was on the ground as Greece’s Finance Minister, and Joseph Stiglitz, among others.

If one were to read the news in the manner of the U.S. press through its sources of record–The New York Times, Wall Street Journal, and Washington Post, not to mention the major news networks with CNN thrown in (avoiding avowedly polemical sources like Fox, MSNBC, and the Huffington Post)–one would think that the Greek issue was caused by a profligate country that borrowed a bit too much and allowed its people to live beyond their means.  Nothing could be further from the truth.

The bottom line is that Greece and the EU decided to bail out the banks and investors who crossed the line in investing in junk paper by using public funds.  Sound familiar?  Think a Eurozone TARP.  But in the case of the EU, the banks and bad paper investment houses–the inmates in this scenario–run the asylum.  With the constant drumbeat from our own oligarchs we have become, as a people, brainwashed to think that investors and creditors have a right to their money.  Our own more draconian bankruptcy laws, imposed by the last financial industry-tainted Congress, institutionalized many of these attitudes in law.  But this has not always been the case, and it is not part of our legal or economic traditions.  It is certainly not anything like what Adam Smith had in mind.

The operational term in this case is “moral hazard.”  The moneyed interests and their armies of lawyers have deftly tried to invert the meaning of the concept, but as Steve Waldman clearly explains in his excellent interfluidity blog, “moral hazard” is a concept that overwhelmingly falls on investors and creditors.  It means, quite simply, that you as an investor are responsible for where you put your money at risk–and that risk includes it being completely lost.  There are no guarantees.  This was the original rationale of Glass-Steagall: it was accepted that regular working people don’t have the information, time, or resources to place their funds, which are meant for savings, under moral hazard.  The same goes for things like the Social Security Trust Fund.  Play with your own “play” money if you have it, but guaranteed deposits and retirement pensions are off-limits since they are backed by the sovereign currency.  Seed corn is not open to being manipulated by cheap paper–that is, it wasn’t until Glass-Steagall was repealed.

The European condition is a bit more complicated only because the EU has never had a consistent separation between its financial elites and civic institutions, given the differences in national traditions and political systems.  But one should be able to clearly see the connection between what is happening in Europe within the EU and in other places around the world: the attack on democratic legitimacy by oligarchs and economic elites.

As Joe Stiglitz points out in the post cited above, Greece–emerging from years of autocratic rule and third world-like conditions–was doing quite well economically until the financial bubble burst across the developed western world.  Many of the banks that invested in hyper-inflated Greek real estate and other development projects were situated in other countries.  The EU under the Euro project is a currency union, and under the Maastricht Treaty that formed this union there were some economic harmonization rules required, mostly pushed by Germany and France, but there is no lender of last resort, no central banking authority equivalent to our Federal Reserve, no centralized budget authority, nor political or cultural ties that would keep old ethnic or nationalist conflicts from flaring up in a crisis.  As Waldman explains, what these other countries did–in particular Germany–was to bail out the banks and investment houses, making the debt on these bad investments public obligations.  This sleight of hand politicized what otherwise should simply have been written off as bad investments.  If the Germans wanted to have their own TARP they should have done so.  But it was so much easier to push the consequences onto the Greeks, given their weaker position in the EU.

Jared Bernstein, in his Washington Post article following the Greek “no” vote, quoted an unnamed German economist asserting: “How do you think the people of Manhattan would like bailing out Texas?”  As Krugman rejoined upon reading the article, Manhattan (and other areas of the country) do that all the time as a matter of course.  It was done during the Savings & Loan crisis, largely a Texas affair, back in the late 1980s.  Anyone who looks at the net balance of federal tax payments and expenditures by state can see that the southeastern states–in particular those that made up the old Confederacy, including Texas–get more in federal benefits than they pay in.  To Americans this is not a big deal–and my use of the term American to identify my countrymen is at the heart of the question.  I don’t know anyone who in reality is a Floridian.  Only buffoons and idiots identify themselves as Texans over their identity as Americans.

Here we tend to put resources where they are needed, hence the United States of America.  More than two hundred years involving waves of immigrants, over one hundred and fifty years of increasing population mobility, and two major world wars and a cold one–two of these existential in nature–during the 20th century, not to mention 9-11, have militated against the old regionalism.  It is not surprising that an assertion displaying such deep ignorance of our own system and society would come from a German economist.  I mean this as no mean insult.

When I was on active duty as a young U.S. Naval officer I met a Scottish couple in Spain who worked at the U.K. embassy there.  They were amazed by my nonchalance in moving my family from California to a home base in Virginia as part of my assignment.  “Do you now identify yourself as a Virginian?” they asked.  When I explained that–no–it was all part of the U.S., they explained that they would always identify themselves as Scots, and that within Scotland people associated themselves with a particular village or region.  This was almost 30 years ago, and I am told that such attitudes are changing, but it does point to a weakness in the “European” project, especially given that in the most recent U.K. parliamentary elections the Scottish National Party was overwhelmingly elected to the House of Commons.

Given my own expertise in history and political science, my concern is directed more to the consequences of Greece capitulating to the EU’s economically and politically disastrous demands.  Just ten days ago 60% of the Greek people voted against the conditions imposed by the EU, yet their parliament just approved a package that is punitive by any reasonable and objective measure.  Even the IMF has belatedly–and in a largely dishonest manner which I can only explain as some type of EU face-saving approach–clearly stated that the conditions imposed are unsustainable.

The role of Germany is certainly one of the main issues in this crisis.  Given the way they handled the bad paper of their bankers, Merkel and her party have backed themselves into a corner.  So they have done what all desperate politicians do–they have demonized the Greeks.  This is all for mercenary purposes, of course, and without consideration for the long term consequences for both the Greek people and the EU.  What they have done is expose the contradictory fault lines in the entire “European” project.  German Finance Minister Schäuble, by attempting to co-opt the Greek threat of a Euro exit by making such terms seem disastrous, has inadvertently made such an exit virtually inevitable.  Greece, not wanting to be left out of “Europe,” has just voted against its own best interests, its government never really having had a strategy for a “Grexit” because it assumed that its negotiating partners were both rational and well-meaning.  The government very well may fall as a result.

For what the Greek crisis has shown is that the European project under the Euro is neither completely democratic nor truly “European.”  The elites in Brussels certainly felt that they had no obligation to consider the Greek referendum on the bailout terms.  To them only the banks, the oligarchs, and the political survival of the parties in the main assemblies of the nations that support them matter.  The “democratic deficit” of the EU, in the words of the late historian Tony Judt, and the damage that it can cause, is now on full display.  It is not yet clear what will happen, given the contradictory impulses at work: countries wanting to stay within the single market that “Europe” affords them, the cultural and psychological pull of being part of the project, and the punishing loss of national autonomy and democratic legitimacy as the price that must be paid (aside from the economic depression and poverty imposed on the Greeks as they follow the conditions dictated to them).

One final note:  I can’t help but be impressed by the ideological arguments being used as a matter of course for “helping” the Greek people in the long run.  As John Maynard Keynes noted, in the long run we are all dead.  The tremendous amount of suffering imposed by the EU on the Greek people for their own long-term good sounds much like the fabrications of the old Communists of the former Eastern Bloc countries, who inflicted all sorts of horrors on their own populations for the “long term” good of reaching the perfect socialist state.  Now such arguments are deployed in favor of the perfect capitalist state.  It is “reform” turned on its head, like “moral hazard.”


Gotta Serve Somebody — The Proper Balance of Duties in Business–and Project–Management

While traveling over the last couple of weeks I was struck by this article in the Wall Street Journal entitled “Pharmaceutical Companies Buy Rivals’ Drugs, Then Jack Up the Prices.”  The reporter stated in a somewhat matter-of-fact manner that the reason for this behavior was the need to maximize stockholder value.  Aside from the article’s poorly vetted excuse-mongering about fewer opportunities for development and limitations on payments under healthcare–and the fact that U.S. drug prices tend to be significantly higher than those of generics found overseas–the assumption regarding maximizing stockholder value is misplaced.

As Steven Pearlstein pointed out in this Wonkblog piece back in 2013, maximizing shareholder value is not among the fiduciary duties of the company or its CEO.  Such a view is supported by neither tradition nor law.  A corporation, as any Business 101 student will tell you, is an artificial person established for the purposes defined in its charter.  This charter is issued by the sovereign–in our case the sovereign consists of the citizens of the states that issue corporate charters through their representative governments.

As Pearlstein rightly points out:  “The fiduciary duty, in fact, is owed simply to the corporation, which is owned by no one, just as you and I are owned by no one — we are all “persons” in the eyes of the law. Shareholders, however, have a contractual claim to the “residual value” of the corporation once all its other obligations have been satisfied — and even then directors are given wide latitude to make whatever use of that residual value they choose, as long they’re not stealing it for themselves.”

The obligations of a company run not only to shareholders but also to customers, suppliers, employees, and others with whom the corporation establishes contractual relations.  What Pearlstein’s article also shows is that companies that put customer and employee satisfaction first are those that maintain and grow market share for a sustained period of time.  This should be no surprise.  Short term management yields either short term results or total failure.  This instability, driven by the shift toward managing to the market for stockholder value, was reflected in the 2007-09 Great Recession, when companies with little ability to weather the storm found themselves on the rocks and shoals, with angry customers and employees left in the lurch.

My question for my colleagues is always this:  are you in it for the long or short term?  If I get the latter answer then I say good luck to you, but we steer this ship on a different course.

The ramifications for project management probably make this conflict between the push for stockholder primacy and other co-equal interests even more salient.  To the project manager, the primacy of the project is defined by its contractual line items, which are designed to meet customer requirements and expectations.  Stockholders don’t deliver products and services.  Those who wish to extract resources before satisfying customer requirements and contractual commitments–and before paying the employees and subcontractors engaged in those efforts–are, whether they know it or not, at best engaging in bad business practices and at worst committing fraud.

Once again, this ship steers a different course.  Staying customer-focused is what has won out, and will continue to win out, at the end of the day.