Three’s a Crowd — The Nash Equilibrium, Computer Science, and Economics (and what it means for Project Management theory)

Over the last couple of weeks of reading I picked up on an interesting article via Brad DeLong’s blog, which in turn picked it up from Larry Hardesty at MIT News.  First, a little background devoted to defining terms.  The Nash Equilibrium is a part of Game Theory that measures how and why people make choices in social networks.  As defined in this Columbia University paper:

A game (in strategic or normal form) consists of the following three elements: a set of players, a set of actions (or pure-strategies) available to each player, and a payoff (or utility) function for each player. The payoff functions represent each player’s preferences over action profiles, where an action profile is simply a list of actions, one for each player. A pure-strategy Nash equilibrium is an action profile with the property that no single player can obtain a higher payoff by deviating unilaterally from this profile.

John Von Neumann developed Game Theory to measure, in a mathematical model, the dynamics of conflict and cooperation between intelligent, rational decision-makers in a system.  All social systems can be measured by the application of Game Theory models.  But as with all mathematical modeling, there are limitations to what can be determined.  Mathematics can only measure and model what we observe, yet it can provide insights that would otherwise go unnoticed.  As such, Von Neumann’s work (along with that of Oskar Morgenstern and Leonid Kantorovich) in this area has become the cornerstone of mathematical economics.

When dealing with two players in a game, a number of models have been developed to explain the behavior that is observed.  Most familiar to us, for example, are zero-sum games and tit-for-tat strategies.  Many of us in business, diplomacy, and the military profession, or simply engaging in old-fashioned office politics, have come upon such strategies in day-to-day life.  From the article in MIT News describing the latest work of Constantinos Daskalakis, an assistant professor in MIT’s Computer Science and Artificial Intelligence Laboratory:

In the real world, competitors in a market or drivers on a highway don’t (usually) calculate the Nash equilibria for their particular games and then adopt the resulting strategies. Rather, they tend to calculate the strategies that will maximize their own outcomes given the current state of play. But if one player shifts strategies, the other players will shift strategies in response, which will drive the first player to shift strategies again, and so on. This kind of feedback will eventually converge toward equilibrium:…The argument has some empirical support. Approximations of the Nash equilibrium for two-player poker have been calculated, and professional poker players tend to adhere to them — particularly if they’ve read any of the many books or articles on game theory’s implications for poker.

Anyone who has engaged in two-player games, from card games to chess, can intuitively understand this insight.  But in modeling behavior, when a third player is added to the mix, the mathematics describing market or system behavior becomes “intractable.”  That is, all of the computing power in the world cannot practically calculate the Nash equilibrium.
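To make the two-player case concrete, here is a minimal sketch in Python (the payoff matrices are illustrative and my own, not drawn from any of the sources above) that finds pure-strategy Nash equilibria by brute-force enumeration, checking every action profile for a profitable unilateral deviation:

```python
# Minimal sketch: pure-strategy Nash equilibria of a small two-player game,
# found by checking every action profile for profitable unilateral deviations.
# Payoff matrices are illustrative only.

import itertools

# payoffs[player][row][col] = payoff to `player` when player 0 plays `row`
# and player 1 plays `col` (a simple coordination-style game).
payoffs = [
    [[3, 0],   # player 0's payoffs
     [0, 2]],
    [[3, 0],   # player 1's payoffs
     [0, 2]],
]

def is_nash(row, col):
    """True if neither player gains by deviating unilaterally from (row, col)."""
    if any(payoffs[0][r][col] > payoffs[0][row][col] for r in range(2)):
        return False  # player 0 would rather switch rows
    if any(payoffs[1][row][c] > payoffs[1][row][col] for c in range(2)):
        return False  # player 1 would rather switch columns
    return True

equilibria = [(r, c) for r, c in itertools.product(range(2), range(2)) if is_nash(r, c)]
print(equilibria)  # [(0, 0), (1, 1)] -- both coordination outcomes are equilibria
```

With two players and two strategies apiece the exhaustive check is trivial; admit mixed strategies and a third player, and the search space explodes, which is the intractability Daskalakis is pointing to.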

Part of this issue is the age-old paradox, put in plain language, that everything that was hard to do for the first time in the past is easy to do and verify today.  This includes everything from flying aircraft to dealing with quantum physics.  In computing and modeling, the analogous issue is that many problems that appear to require enormous resources to solve can have a proposed solution verified with far fewer resources.  This is known as the P versus NP problem.
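A standard way to see the verify-versus-solve asymmetry (my example, not the article’s) is a problem like subset-sum: checking a proposed answer takes a single pass over it, while finding one by brute force can mean examining up to 2^n candidate subsets.

```python
# Sketch of the verify-vs-solve asymmetry using subset-sum.
# Verifying a claimed solution is fast; finding one by brute force is not.

from itertools import combinations

def verify(numbers, target, candidate):
    """Cheap check: is the proposed subset drawn from `numbers` and does it hit `target`?"""
    return set(candidate) <= set(numbers) and sum(candidate) == target

def solve(numbers, target):
    """Brute force: tries every subset -- up to 2^n candidates."""
    for size in range(len(numbers) + 1):
        for subset in combinations(numbers, size):
            if sum(subset) == target:
                return subset
    return None

nums = [3, 34, 4, 12, 5, 2]
print(solve(nums, 9))          # (4, 5) -- found only after searching many subsets
print(verify(nums, 9, (4, 5))) # True -- checked in one pass
```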

We deal with this class of problem all the time when developing software applications and handling ever larger sets of data.  For example, I attended a meeting recently where a major concern among the audience was the question of scalability, especially in dealing with large sets of data.  In the past, “scalability” to the software publisher simply meant the ability of the application to be used by a large set of users via some form of distributed processing (client-server, shared services, desktop virtualization, or a browser-based deployment).  But with the introduction of KDD (knowledge discovery in databases), scalability now also addresses the ability of technologies to derive significance from the data itself, outside of the confines of a hard-coded application.

The search for optimal polynomial-time algorithms to reduce the running time of these time-intensive problems forces the developer to characterize the solution (or the proof of NP-completeness) in advance and then work toward the middle in developing the appropriate algorithm.  This should not be a surprise.  In breaking Enigma during World War II, Bletchley Park first identified regularities in the messages that the German high command was sending out.  This then allowed them to work backwards and forwards in calculating how the encryption could be broken.  The same applies to any set of mundane data, regardless of size, that is not actively trying to avoid being deciphered.  While we may be faced with a Repository of Babel, it is one that badly wants to be understood.

While intuitively the Nash equilibrium does exist, its mathematically intractable character has demanded that new languages and approaches to solving it be developed.  In the case of Daskalakis, he has proposed three routes.  These are:

  1. “One is to say, we know that there exist games that are hard, but maybe most of them are not hard.  In that case you can seek to identify classes of games that are easy, that are tractable.”
  2. Find mathematical models other than Nash equilibria to characterize markets — “models that describe transition states on the way to equilibrium, for example, or other types of equilibria that aren’t so hard to calculate.”
  3. Approximation of the Nash equilibrium: even if the exact equilibrium is hard to calculate, an approximation “where the players’ strategies are almost the best responses to their opponents’ strategies” might not be.  “In those cases, the approximate equilibrium could turn out to describe the behavior of real-world systems.”  (A toy version of such an approximate-equilibrium check is sketched below.)
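As an illustration of that third route, here is a self-contained sketch (the payoffs and the epsilon tolerance are made up) of an epsilon-approximate equilibrium check: an action profile passes if no player can gain more than epsilon by deviating unilaterally.

```python
# Sketch: epsilon-approximate equilibrium check for a small two-player game.
# Payoff matrices and epsilon are illustrative only.

payoffs = [
    [[3.0, 0.0],   # player 0's payoffs by (row, col)
     [0.2, 2.0]],
    [[3.0, 0.1],   # player 1's payoffs by (row, col)
     [0.0, 2.0]],
]

def regret(player, row, col):
    """How much the player could gain from a best unilateral deviation."""
    if player == 0:
        best = max(payoffs[0][r][col] for r in range(2))
        return best - payoffs[0][row][col]
    best = max(payoffs[1][row][c] for c in range(2))
    return best - payoffs[1][row][col]

def is_approx_nash(row, col, epsilon=0.25):
    return max(regret(0, row, col), regret(1, row, col)) <= epsilon

print([(r, c) for r in range(2) for c in range(2) if is_approx_nash(r, c)])
# [(0, 0), (1, 1)] -- the exact equilibria pass; with a looser epsilon,
# near-equilibria would pass as well
```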

This is the basic engineering approach to any complex problem (and a familiar approach to anyone schooled in project management):  break the system down into smaller pieces to solve.

So what does all of this mean for the discipline of project management?  In modeling complex systems behavior for predictive purposes, our approach must correspondingly break down the elements of systems behavior into their constituent parts, but then integrate them in such a way as to derive significance.  The key to this lies in the availability of data and our ability to process it using methods that go beyond trending individual variables.


Walk This Way — DoD IG Reviews DCMA Contracting Officer Business Systems Deficiencies

The sufficiency and effectiveness of business systems is an essential element of the project management ecosystem.  Far beyond performance measurement of the actual effort, the sufficiency of the business systems that support the effort is essential to its success.  If the systems in place do not properly track and record the transactions behind the work being performed, the credibility of the data is called into question.  Furthermore, support and logistical systems, such as procurement, supply, and material management, contribute in a very real way to work accomplishment.  If that spare part isn’t in-house on time, the work stops.

In catching up on reading this month, I found that the DoD Inspector General issued a report on October 1 showing that, across 21 audits demonstrating business system deficiencies, issues with contracting officer timeliness in meeting DFARS deadlines at various milestones existed in every case.  For example, in 17 of those cases Contracting Officers did not issue final determination letters within 30 days of the report, as required by the DFARS.  In eight cases required withholds were not assessed.
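To give a sense of how trivially such timeliness could be tracked with even modest automation, here is a hypothetical sketch (the case identifiers, field names, and dates are invented, not drawn from the IG report) that flags final determination letters falling outside a 30-day window:

```python
# Hypothetical sketch: flag final determination letters issued outside the
# 30-day window described above.  Case data and field names are invented.

from datetime import date, timedelta

WINDOW = timedelta(days=30)

cases = [
    {"case": "A-001", "report_received": date(2014, 6, 2),  "final_determination": date(2014, 6, 25)},
    {"case": "A-002", "report_received": date(2014, 6, 2),  "final_determination": date(2014, 8, 15)},
    {"case": "A-003", "report_received": date(2014, 7, 10), "final_determination": None},  # still open
]

for c in cases:
    issued = c["final_determination"]
    late = issued is None or (issued - c["report_received"]) > WINDOW
    print(c["case"], "LATE" if late else "on time")
```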

For those of you who are unfamiliar with the six business systems assessed under DoD contractor project management, they consist of accounting, estimating, material management, purchasing, earned value management, and government property.  The greater the credibility and fidelity of these systems, the greater the level of confidence the government can have in the data it receives on the execution of public funds under these contracts.

To a certain extent the deadlines under the DFARS are so tightly scheduled that they fail to take into account normal delays in operations.  Heaven forbid that the Contracting Officer be on leave when the audit is received, or be engaged in other detailed negotiations.  In recent years the contracting specialty within the government, like government in general, has been seriously understaffed, underfunded, and unsupported.  Given that the best and the brightest often leave government service for greener pastures in the private sector, what is left are inexperienced and overworked (though mostly dedicated) personnel who do not have the skills or the time to engage in systems thinking when approaching noted deficiencies in these systems.

This pressure for staff reduction, even in areas that have been decimated by austerity politics, is significant.  In the report I could not help but shake my head when an Excel spreadsheet was identified as the “Contractor Business System Determination Timeline Tracking Tool.”  This reminds me of my initial assignment as a young Navy officer–my first as a contract negotiator–where I also performed collateral duties building simple automated tools.  (This led to my later being assigned as the program manager of the first Navy contract and purchase order management system.)  That very first system I built, however, tracked contract milestone deadlines.  It was done in VisiCalc, and the year was 1984.

That a major procurement agency of the U.S. Department of Defense is still using a simple and ineffective spreadsheet tracking “tool” more than 30 years after my own experience is both depressing and alarming.  There is a long and winding history behind why the agency finds itself in this condition, but some additional training, which was the agency’s response to the IG, is not going to solve the problem.  In fact, such an approach is so ineffective that it’s not even a Band-Aid.  It’s the bureaucratic equivalent of answering the mail.

The reason why it won’t solve the problem is that there is no magic wand to get those additional contract negotiators and contracting officers in place.  The large intern programs that recruited young people from colleges to grow talent and provide a promising career track are long gone.  The interdisciplinary and cross-domain expertise required in today’s world, reflecting the new realities of procuring products and services, is not in the works.  In places where it is being attempted, outmoded personnel classification systems based on older concepts of the division of labor stand in the way.

The list of systemic causes could go on, but in the end it’s not in the DCMA response because no one cares, and if they do care, they can’t do anything about it.  It’s not as if “BEST TALENT LEAVES DUE TO PUBLIC HOSTILITY TO PUBLIC SERVICE”  was a headline of any significance.  The Post under Bezos is not going to run that one anytime soon, though we’ve been living under it since 1981.  The old “thank you for your service” line for veterans has become a joke.  Those who use this line might as well say what that really means, which is: “I’m glad it was you and not me.”

The only realistic way to augment an organization in this state, and to break the cycle, is to automate the system–and to do it in such a way as to tie the entire system together.  When I run into my consulting friends and colleagues and they repeat the mantra “software doesn’t matter, it’s all based on systems,” I can only shake my head.  I have learned to be more tactful.

In today’s world software matters.  Try doing today what we used to do with slide rules, scientific calculators, and process charts, absent software.  Compare organizations that use the old division-of-labor, “best of breed” tool concept against those that have integrated their systems and use data across domains effectively.  Now tell me again why “software doesn’t matter.”  Not only does it matter, but “software” isn’t all the same.  Some “software” consists of individual apps that do one thing.  Some “software” is designed to address enterprise challenges.  And some “software” is designed not only to address enterprise challenges, but also to maximize the value of enterprise data.

In the case of procurement and business systems assessment, the only path forward for the agency will be to apply data-driven measures to the underlying systems and tie those assessments into a systemic solution that includes the contracting officers, negotiators, administrators, contracting officer representatives, the auditors, analysts, and management.  One can see, just in writing one line, how much more complex are the requirements for the automated panacea to replace “Contractor Business System Determination Timeline Tracking Tool.”  Is there any question why the “tool” is ineffective?

If this were the 1990s–though the practice still persists–we would sit down, perform systems analysis, outline the system and subsystem solutions, and then, through various stages of project management, design the software system to reflect the actual system in place as if organizational change did not exist.  This is the process that has a 90% failure rate across government and industry.  The level of denial of this figure is so great that I run into IT managers and CIOs every day who do not know it or, if they do, believe that it will not apply to them–and these are brilliant people.  It is selection bias and optimism, with a little (or a lot) of narcissism, run amok.  The physics and math on this are so well documented that you might as well take your organization’s money and go to Vegas with it.  Your local bookie could give you better odds.

The key is risk handling (not the weasel word “management,” not “mitigation” since some risks must simply be accepted, and certainly not the unrealistic term “avoidance”), and the deployment of technology that provides at least a partial solution to the entire problem, augmented by incremental changes to incorporate each system into the overall solution.  The lack of transparency in software technologies (a point made in DeLong and Froomkin’s seminal paper on what they called “The Next Economy,” which holds true today) requires a process whereby the market is surveyed, vendors go through a series of assessments and demonstration tests, and the selected technology then passes through stage gates: proof-of-concept, pilot, and, eventually, deployment.  Success at each level is rewarded by proceeding to the next step.
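A minimal sketch of that stage-gate progression, with gate names of my own choosing, might look like this:

```python
# Minimal sketch of a stage-gate progression; stage names are my own.

STAGES = ["market survey", "assessment/demo", "proof-of-concept", "pilot", "deployment"]

def advance(current_stage, passed):
    """Move to the next gate only if the current stage's exit criteria were met."""
    i = STAGES.index(current_stage)
    if not passed or i == len(STAGES) - 1:
        return current_stage  # stay put (or remain deployed)
    return STAGES[i + 1]

stage = STAGES[0]
for gate_result in [True, True, True, False]:  # hypothetical gate outcomes
    stage = advance(stage, gate_result)
print(stage)  # 'pilot' -- the final gate into deployment was not passed
```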

Thus, ideally the process introduces the specific functionality required by the organization on top of the underlying functionality through Agile processes, where releasable versions of the solution are delivered at the end of each sprint.  One need not be an Agile Cultist to do this.  In my previous post I referred to Neil Killick’s simple checklist for whether you are engaged in Agile.  It is the best and most succinct distillation of both the process and the value inherent in Agile that I have found to date, with all of the “woo-woo” taken out.  For an agency as Byzantine as DCMA, this is really the only realistic and effective approach.

DCMA is an essential agency in DoD acquisition management, but it cannot do what it once did under a more favorable funding environment.  To be frank, it didn’t do its job all that well even when more favorable conditions were in place, though things were better.  But this is also a factor in why it finds itself in its current state.  It was punished for its transgressions, perhaps too much.  Several waves of personnel cuts, staff reductions, and domain and corporate knowledge loss, on top of the general trend, have created an agency in a condition of siege.  As with any organization under siege, backbiting and careerism among the few who remain are rewarded.  Iconoclasts and thought leaders stay for a while before being driven away.  They are seen as being too risky.

This does not create a condition for an agency ready to accept or quickly execute change through new technology.  What it does do is allow portions of the agency to engage in cargo cult change management.  That is, it has the appearance of change but keeps self-interest comfortable and change in its place.  Over time–several years–with the few remaining resources committed to this process, they will work the “change.”  Eventually, they may even get something tangible, though suboptimized to conform to rice bowls; preferably after management has their retirement plans secured.

Still, the reality is that DCMA must be made to do its job, because doing so is in the best interests of the U.S. Department of Defense.  The panacea will not be found through “collaboration” with industry, which consists of the very companies DCMA is tasked with overseeing and regulating.  We all know how well deregulation and collaboration have worked in the financial derivatives, banking, mortgage, and stock markets.  Nor will it come from organic efforts within an understaffed and under-resourced agency that will be unable to leverage the best and latest technology solutions under the unforgiving math of organic IT failure rates.  Nor will it come from the long-outmoded approach of deploying suboptimized “tools” to address a particular problem.  The proper solution is to leverage effective COTS solutions that facilitate the challenge of systems integration and systems thinking.


Family Affair — Part III — Private Monopsony, Monopoly, and the Disaccumulation of Capital

It’s always good to be ahead of the power curve.  I see that the eminent Paul Krugman had an editorial in the New York Times about the very issues that I’ve dealt with in this blog, his example in this case being Amazon.  This is just one of many articles that have been written about monopsony power as a result of the Hachette controversy.  In The New Republic, Franklin Foer also addresses this issue at length in the article “Amazon Must Be Stopped.”  In my last post on this topic I discussed public monopsony, an area in which I have a great deal of expertise.  But those of us in the information world who are not Microsoft, Oracle, Google, or one of the other giants also live in the world of private monopsony.

For those of you late to these musings (or who skipped the last ones), this line of inquiry began when my colleague Mark Phillips made the statement at a recent conference that, while economic prospects for the average citizen are bad, the best system that can be devised is one based on free market competition, misquoting Churchill.  The underlying premise of the statement, of course, is that this is the system we currently inhabit, and that it is the most efficient way to distribute resources.  There is also usually an ideological component involved: some variation of free market fundamentalism and the concept that the free market is somehow separate from, and superior to, the government issuing the currency under which the system operates.

My counter to the assertions found in that compound statement is to show that the economic system we inhabit is not a perfectly competitive one, and that there are large swaths of the market that are dysfunctional and that have given rise to monopoly, oligopoly, and monopsony power.  In addition, the ideological belief–which is very recent–that private economic activity arose almost spontaneously, with government being a separate component that can only be an imposition, is also false, given that nation-states and unions of nation-states (as in the case of the European Union) are the issuers of sovereign currency, and so choose through their institutions the amount of freedom, regulation, and competition that their economies foster.  Thus, the economic system that we inhabit is the result of political action and public policy.

The effects of the distortions of monopoly and oligopoly power in the so-called private sector are all around us.  But when one peels back the onion, we can see clearly the interrelationships between the private and public sectors.

For example, patent monopolies in the pharmaceutical industry allow prices to be set, not based on the marginal value of the drug that a competitive market would establish, but based on the impulse for profit maximization.  A recent example in the press–critiqued by economist Dean Baker–has concerned the hepatitis-C drug Sovaldi, which goes for $84,000 a treatment, compared to markets in which the drug has not been granted a patent monopoly, where the price is about $900 a treatment.  Monopoly power, in Baker’s words, imposes a 10,000 percent tariff on those who must live under that system.  This was one of the defects in the system that I wrote about in my blog posts regarding tournaments and games of failure, though in pharmaceuticals the comparison seems to be more in line with gambling and lotteries.  Investors, who often provide funds based on the slimmest thread of a good idea and talent, are willing to put great sums of money at risk in order to strike it rich and realize many times their initial investment.  The distorting incentives of this system are well documented: companies tend to focus on those medications and drugs with the greatest potential financial rate of return guaranteed by the patent monopoly system*, drug trials downplay the risks and side-effects of the medications, and the price of medications is placed at so high a level as to put them out of reach of all but the richest members of society, since few private drug insurance plans will authorize such treatments given the cost–at least not without a Herculean effort on the part of individual patients.
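As a rough arithmetic check on Baker’s characterization, using the two prices cited above: $84,000 / $900 ≈ 93, i.e., a markup of roughly (84,000 − 900) / 900 ≈ 9,200 percent, which is the ballpark behind the “10,000 percent tariff” figure.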

We can also see monopoly power at work firsthand in the ongoing lawsuits between Apple and Samsung over the smartphone market.  For many years (until very recently) the U.S. patent office took a permissive stance, allowing technology firms to essentially patent the look and feel of a technology, as well as features that could be developed and delivered by any number of means.  The legal system, probably more technologically challenged than other areas of society, has been inconsistent in determining how to deal with these claims.  The fact finder in many cases has been a jury unfamiliar with the nuances of the technology.  One need not stretch to find practical analogies for these decisions.  If applied to automobiles, for example, the many cases that have enforced these patent monopolies would have restricted windshield wipers to the first company that delivered the feature.  Oil filters, fuel filters, fuel injection, and so on would all have been restricted to one maker.

The stakes are high not only for these two giant technology companies but also for consumers.  They have already used their respective monopoly power, established by their sovereign governments, to pretty effectively ensure that the barriers to entry in the smartphone market are quite high.  Now they are unleashing these same forces on one another.  In the end, the manufacturing costs of the iPhone 6–which is produced by slave labor under the capitalist variant of Leninist China–are certainly much lower than the $500 and more that they demand (along with the anti-competitive practice of requiring a cellular agreement with one of their approved partners).  The tariff that consumers pay over and above the actual cost of production and maintenance of smartphones is significant.  This is not remedied by the oft-heard advice to “simply not buy a smartphone,” since that shifts responsibility for the public policy that allows this practice to flourish onto individuals, who are comparatively powerless against the organized power of the lobbyists who influenced public representatives to make these laws and institute the policy.

The fights over IP and patents (as well as net neutrality) are important for the future of technological innovation.  Given the monopsony power of companies that also exert monopoly power in particular industries, manufacturers are at risk of being squeezed where prices are artificially reduced through the asymmetrical relationship between large buyers and relatively small sellers.  Central planning, regardless of whether it is exerted by a government or a large corporation, is dysfunctional.  When those same corporations seek not only to exert monopoly and monopsony power, but also to control information and technology, they seek to control all aspects of an economic activity, not unlike the trusts of the era of the Robber Barons.  Amazon and Walmart are but two of the poster children of this situation.

The saving grace of late has been technological “disruption,” but this term has been misused to also apply to rent-seeking behavior.  I am not referring only to the kind of public policy rent-seeking that Amazon achieves when it avoids paying the local taxes that apply to its competitors, or that Walmart achieves when it shifts the consequences of its substandard pay and abusive employee policies onto local, state, and federal public assistance agencies.  I am also referring to the latest controversies regarding AirBnB, Lyft, and Uber, which use loopholes in dealing with technology to sidestep health and safety laws in order to gain entry into a market.

Technological disruption, instead, is a specific phenomenon, based on the principle that the organic barriers to entry in a market are significantly reduced by the introduction of technology.  The issue over control of and access to information and innovation is specifically targeted at this phenomenon.  Large companies aggressively work to keep out new entrants and to hinder innovations except those they can control, conspiring against the public good.

The reason why these battles are lining up resides in the modern phenomenon known as the disaccumulation of capital, which was first identified by social scientist Martin J. Sklar.  What this means is that the accumulation of capital, which is the time it takes to reproduce the existing material conditions of civilization, began declining in the 1920s.  As James Livingston points out in the linked article in The Nation, “economic growth no longer required net additions either to the capital stock or the labor force….for the first time in history, human beings could increase the output of goods without increasing the essential inputs of capital and labor—they were released from the iron grip of economic necessity.”

For most of the history of civilization, the initial struggle of economics has been the ability of social organization to provide sufficient food, clothing, shelter, and medical care to people.  The conflict between competing systems has centered on their ability to achieve these purposes most efficiently without sacrificing individual liberty, autonomy, and dignity.  The technical solution to these goals has largely been achieved, but the efficient distribution of these essential elements of human existence has not been solved.  With the introduction of more efficient methods of information processing as well as production (digital printing is just the start), we are at the point where the process of requiring less capital in the aggregate to produce the necessities and other artifacts of civilization is accelerating exponentially.

Concepts like full employment will increasingly become meaningless, because the relationship of labor input to production that we came to expect in the recent past has changed within our own lifetimes.  Very small companies, particularly in technology, can have–and have had–a large impact.  In more than one market, even technology companies are re-learning the lesson of the “mythical man-month.”  Thus, the challenge of our time is to rethink the choices we have made and are making in terms of incentives and distribution so as to maximize human flourishing.  But I will leave that larger question to another blog post.

For the purposes of this post, focused on technology and project management, these developments call for a new microeconomics.  The seminal paper that identified this need early on was Brad DeLong and Michael Froomkin’s 1997 “The Next Economy.”  While some of the real-life examples they give provide, from our perspective today, a stroll down digital memory lane, their main conclusions about how information differs from physical goods remain relevant.  These are:

a.  Information is non-rivalrous.  One person consuming information does not preclude someone else from consuming that same information; once produced, it can be economically reproduced to operate in other environments at little to no marginal cost.  What they are talking about here is application software and the labor involved in producing a version of it.

b.  Information without exterior barriers is non-excludable.  That is, once information is known, it is almost impossible to prevent others from knowing it.  For example, Einstein was the first to work out the mathematics of relativity, but now every undergraduate physics student is expected to fully understand the theory.

c.  Information is not transparent.  That is, in order to determine whether a piece of software will achieve its intended purpose, effort and resources must often be invested to learn it and, frequently, to apply it, if initially only in a pilot program.

The attack coming from monopsony power is directed at the first characteristic of information; the attack coming from monopoly power is currently directed at the second.  Both undermine competition and innovation.  The first does so by denying small technology companies the ability to capitalize sufficiently to develop the infrastructure necessary to become sustainable, which oftentimes reduces a market to one dominant supplier.  The second does so by restricting the application of new technologies and of lessons learned from the past.  Information asymmetry is the problem for the third characteristic, since bad actors are oftentimes economically rewarded at the expense of high-quality performers, as first identified in the automobile industry in George Akerlof’s paper “The Market for Lemons” (paywall).

The strategy of some entrepreneurs in small companies, in reaction to these pressures, has been either to sell out and be absorbed by the giants, or to sell out to private equity firms that “add value” by combining companies in lieu of organic growth, loading them down with debt from unsustainable structuring, and selling off the new entity or its parts.  The track record for the sustainability of the applications involved in these transactions (and for the satisfaction of their customers) is a poor one.

One of the few places where competition still survives is among small to medium-sized technology companies.  For these companies (and the project managers in them) to survive independently requires not only an understanding of the principles elucidated by DeLong and Froomkin, but also an understanding of how information, like other technological innovations yet in ways unique to it, improves efficiency and productivity while reducing the inputs of labor and capital.

The key is in understanding how to articulate value, how to identify opportunities for disruption, and to understand the nature of the markets in which one operates.  One’s behavior will be different if the market is diverse and vibrant, with many prospective buyers and diverse needs, as opposed to one dominated by one or a few buyers.  In the end it comes down to understanding the pain of the customer and having the agility and flexibility to solve that pain in areas where larger companies are weak or complacent.


*Where is that Ebola vaccine–which mainly would have benefited the citizens of poor African countries and our own members of the health services and armed forces–that would have averted public panic today?

Take Me Out to the Ballgame — Tournaments and Games of Failure

“Baseball teaches us, or has taught most of us, how to deal with failure. We learn at a very young age that failure is the norm in baseball and, precisely because we have failed, we hold in high regard those who fail less often – those who hit safely in one out of three chances and become star players. I also find it fascinating that baseball, alone in sport, considers errors to be part of the game, part of its rigorous truth.” — Fay Vincent, former Commissioner of Baseball (1989-1992)

“Baseball is a game of inches.”  — Branch Rickey, Quote Magazine, July 31, 1966

I have been a baseball fan just about as long as I have been able to talk.  My father played the game and tried out for both the Yankees and what were then the New York Giants–and was a pretty well-known local hero in Weehawken back in the 1930s and 1940s.  I did not have my father’s athletic talents–he was a four-letter man in high school–but I was good at hitting a baseball from the time he put a bat in my hands, and so I played–and was sought after–into my college years.  Still, like many Americans who for one reason or another could not or did not pursue the game, I live vicariously through the players on the field.  We hold those who fail less often in the game in high regard.  Some of them succeed for many years and are ensconced in the Hall of Fame.

Others experienced fleeting success.  Anyone who watches ESPN’s or the YES Network’s classic games, particularly those from the various World Series, can see this reality in play.  What if Bill Buckner hadn’t missed that ball in 1986?  What if Bobby Richardson had not been in perfect position to catch what would have been a game- and series-winning liner by Willie McCovey in 1962?  Would Brooklyn have ever won a Series if Amoros hadn’t caught Berra’s drive down the left field line in 1955?  The Texas Rangers might have their first World Series ring if not for a plethora of errors, both mental and physical, in the sixth game of the 2011 Series.  The list could go on, and it takes watching just a few of these games to realize that luck plays a big part in who is the victor.

There are other games of failure that we deal with in life, though oftentimes we don’t recognize them as such.  In economics these are called “tournaments,” and much like their early Medieval predecessors (as opposed to the stylized late Medieval and Renaissance games), the stakes are high.  In pondering the sorry state of my favorite team–the New York Yankees–as I watched seemingly minor errors and failures cascade into a humiliating loss, I came across a blog post by Brad DeLong, distinguished professor of economics at U.C. Berkeley, entitled “Over at Project Syndicate/Equitable Growth: What Do We Deserve Anyway?”  Dr. DeLong makes the very valid point, verified not only by anecdotal experience but by years of economic research, that most human efforts, particularly economic ones, fail, and that the key determinants do not seem, in most cases, to be lack of talent, hard work, dedication, or any of the other attributes that successful people like to credit for their success.

Instead, much of the economy, which in its present form is largely based on a tournament-like structure, allows only a small percentage of entrants to extract their marginal product from society in the form of extremely high levels of compensation.  That such winners exist at all functions much like a lottery, as the following quote from Dr. DeLong illustrates.

“If you win the lottery–and if the big prize in the lottery that is given to you is there in order to induce others to overestimate their chances and purchase lottery tickets and so enrich the lottery runner–do you “deserve” your winnings? It is not a win-win-win transaction: you are happy being paid, the lottery promoter is happy paying you, but the others who purchase lottery tickets are not happy–or, perhaps, would not be happy in their best selves if they understood what their chances really were and how your winning is finely-tuned to mislead them, for they do voluntarily buy the lottery tickets and you do have a choice.”  — Brad DeLong, Professor of Economics, U.C. Berkeley

So even though participants have a “choice,” it is one embedded in an intricately constructed system of self-delusion.  It was about this time that I came across the excellent HBO series “Silicon Valley.”  The tournament aspect of the software industry is apparent in the conferences and competitions for both customers and investors in which I have participated over the years.  In the end, luck and timing seem to play the biggest role in success (apart from having sufficient capital and reliable business partners).

I hope this parody puts an end to my colleagues’ (and future techies’) claims to “revolutionize” and “make the world a better place” through software.