Do You Know Where You’re Going To? — SecDef Ash Carter talks to Neil deGrasse Tyson…and some thoughts on the international technology business

It’s time to kick off my 2017 blogging activity, and my readers have asked about my absence from this blog.  Well, because of the depth of research required by some of the issues that I consider essential, most of my blogging energy has been going into contributions to AITS.org.  I strongly recommend that you check out the site if you haven’t already.  A great deal of useful PM information and content can be found there–and they have a strong editorial staff, so what does get to publication is pretty well sourced.  My next post on the site is scheduled for 25 January.  I will link to it once it becomes available.

For those of us just getting back into the swing of things after the holidays, a number of interesting events occurred during that time that I didn’t get a chance to note.  Among these: SecDef Ash Carter appeared on an episode of Neil deGrasse Tyson’s excellent show “StarTalk“ (unfortunately behind a subscription wall), which airs on the National Geographic Channel.

Secretary Carter had some interesting things to say, among them:

a. His mentors in science, many of whom were veterans of the Second World War, instilled in him the concept of public service and giving back to the country.

b.  His experience under former SecDef Perry, when he was Assistant Secretary of Defense for International Security Policy, taught him that the DoD needed to be the “petri dish” for R&D in new technologies.

c.  That the DoD’s approach has been to leverage R&D and new technologies from the international technology industry, given that many good ideas and developments occur outside of the United States.

d.  He encouraged more scientists to serve in the federal government and the Department of Defense, even if only for a short while, to gain a perspective on how things work at that level.

e.  He doesn’t see nation states as necessarily the biggest source of instability; rather, small groups of individuals, given that destructive power is becoming increasingly portable, will be the emerging threat that his successor will face.

f. That maintaining the U.S. technological edge is essential to guaranteeing international stability and peace.

Secretary Carter’s point about the technology industry being an international one strikes a particular personal chord with me, since my present vocation has led me to introduce new capabilities into the U.S. market built from technologies that were developed by a close European ally.  The synergy that this meeting of the minds has created has begun to have a positive impact on the small portion of the market that my firm inhabits, changing the way people do business and shifting the focus from “tools” as the source of information to data, and what the data suggests.

This is not to say that cooperation in the international technology market is not fraught with the same rocks and shoals found in any business area.  But it is becoming increasingly apparent that new information technologies can be used as a means of leveling the playing field because of the asymmetrical nature of information itself, which lends itself to leverage with relatively small amounts of effort.

This also points to the importance of keeping an open mind and encouraging international trade, especially among our allies in the liberal democracies.  Recently my firm was the target of a protest for a government contract where this connection to international trade was used as a means of questioning whether the firm was, indeed, a bona fide U.S. business.  The answer under U.S. law is a resounding “yes”–and that first decision was upheld on appeal.  For what we have done is–under U.S. management–leveraged technology first developed elsewhere, extended its capabilities, designed, developed, and localized it for the U.S. market, and in the process created U.S. jobs and improved U.S. processes.  This is a good deal all around.

Back in the day, when I wore a U.S. Navy uniform during the Cold War, many of us in the technology and acquisition specialties looked to reform our systems and introduce innovative methods from wherever we could find them, whether they came from private industry or other government agencies.  When we came upon resistance because something was “the way it always was done,” our characterization of that attitude was “NIH”–that is, “Not Invented Here.”  NIH was a term that, in shorthand, described an invalid counterargument against process improvement, one that did not rely on merits or evidence.

And so it is today.  The world is always changing, but given new technologies the rate of change is constantly accelerating.  Adapting and adopting the best technologies available will continue to give us the advantage as a nation.  It simply requires openness and the ability to identify innovation when we see it.

Sunday Contemplation — Finding Wisdom — Daniel Dennett in “Darwin’s Dangerous Idea”

[Photo: Daniel Dennett, via Wikipedia]

“The Darwinian Revolution is both a scientific and a philosophical revolution, and neither revolution could have occurred without the other. As we shall see, it was the philosophical prejudices of the scientists, more than their lack of scientific evidence, that prevented them from seeing how the theory could actually work, but those philosophical prejudices that had to be overthrown were too deeply entrenched to be dislodged by mere philosophical brilliance. It took an irresistible parade of hard-won scientific facts to force thinkers to take seriously the weird new outlook that Darwin proposed…. If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law. But it is not just a wonderful scientific idea. It is a dangerous idea.”

Daniel Dennett (pictured above thanks to Wikipedia) is the Co-Director of the Center for Cognitive Studies and Austin B. Fletcher Professor of Philosophy at Tufts University.  He is also known as “Dawkins’ Bulldog”, for his pointed criticism of what he viewed as unnecessary revisions to Darwinian Theory by Stephen Jay Gould, who was also a previous subject of this blog, and others.  In popular culture he has also been numbered among the “Four Horsemen” of the so-called “New Atheism”.  His intellectual and academic achievements are many, and his insights into evolution, social systems, cognition, consciousness, free will, philosophy, and artificial intelligence are extremely influential.

Back in 1995, when I was a newly minted Commander in the United States Navy, I happened across an intriguing book in a Jacksonville, Florida bookshop during a temporary duty assignment.  The book was entitled Darwin’s Dangerous Idea: Evolution and the Meanings of Life.  I opened it that afternoon, on a gentle early-spring Florida day, and found myself astounded and my mind liberated, as if chains which I had not previously noticed, but which had bound my thinking, had been broken and released me–so great was the influence of the philosophical articulation of this “dangerous idea”.

Here, for the first time, was a book that took what we currently know about the biological sciences and placed it within the context of other scientific domains–and did so in a highly organized, articulate, and readable manner.  The achievement of the book was not so much in deriving new knowledge as in presenting an exposition of the known state of the science and tracing its significance and impact–no mean achievement given the complexity of the subject matter and the depth and breadth of knowledge being covered.  The subject matter, of course, is highly controversial only because it addresses subjects that engender the most fear: the facts of human origins, development, nature, biological interconnectedness, and the inevitability of mortality.

Dennett divides his thesis into three parts: the method of developing the theory and its empirical proofs, its impact on the biological sciences, and its impact on other disciplines, especially regarding consciousness, philosophy, sociology, and morality.  He introduces and develops several concepts, virtually all of which have since become cornerstones in human inquiry, and not only among the biological sciences.

Among these are the concepts of design space, of natural selection behaving as an algorithm, of Darwinism acting as a “universal acid” that transforms the worldview of everything it touches, and of the mental images of skyhooks and cranes–skyhooks and “just-so” stories being fallacious and magical ways of thinking that have no underlying empirical foundation to explain natural phenomena, in contrast to cranes, which are grounded, incremental mechanisms that do.

The concept of the design space has troubled many, though not most, evolutionary biologists and physicists, only because Dennett posits a philosophical position in lieu of a mathematical one.  This does not necessarily undermine his thesis, simply because one must usually begin with a description of a thesis before one can determine whether it can be disproven.  Furthermore, Dennett is a philosopher of the analytical school, and so the scope of his work is designed from that perspective.

But there are examples that approach an analogue of design space in physics–those that visualize space-time and general relativity, as at this site.  It is not a stretch to understand that our reality–the design space that the earth inhabits, among the many alternative design spaces that may exist relating to biological evolution–can eventually be mathematically formulated.  Given that our knowledge of comparative planetary and biological physics is still largely confined to cosmological speculation, the analogy for now is sufficient and understandable.  It also gives a new cast to the concept of adaptation, away from the popular (and erroneous) concept of “survival of the fittest”, since fitness is based on the ability to adapt to environmental pressures and to find niches that may exist in that environment.  As we trace the effects of climate change on species, we will be witnessing first hand the brutal workings of design space.

Going hand-in-hand with design space is the concept that Darwinian evolution through the agent of natural selection is an algorithmic process.  This understanding becomes “universal acid” that, according to Dennett, “eats through just about every traditional concept and leaves in its wake a revolutionized world-view.”

One can understand the objection of philosophers and practitioners of metaphysics to this concept, which many of them have characterized as nihilistic.  This, of course, is argument from analogy–a fallacious form of rhetoric.  The objection to the book through these arguments, regardless of the speciousness of their basis, is premature, and it is a charge to which Dennett effectively responds in his book Consciousness Explained.  It is in that volume that Dennett addresses the basis for the conscious self, “intentionality”, and the concept of free will (and its limitations)–what in the biological and complexity sciences is described as emergence.

What Dennett has done in describing the universal acid of Darwinian evolution is to describe a phenomenon: the explanatory reason for the rapid social change that we have witnessed and are witnessing, and the resulting reaction and backlash to it.  For example, the revolution engendered by the Human Genome Project has confirmed not only our species’ place in the web of life on Earth and our evolutionary place among primates, but also the interconnections deriving from the descent of the entire human species from common ancestors, exploding the concept of race and any claim to the inherent superiority or inferiority of any cultural grouping of humans.

One can clearly see the threat this basic truth has to entrenched beliefs deriving from conservative philosophy, cultural tradition, metaphysics, religion, national borders, ethnic identity, and economic self-interest.

For it is apparent to me–given my reading not only of Dennett, but also of both popularizers and the leading minds in the biological sciences, including Dawkins, Goodall, Margulis, Wilson, Watson, Venter, Crick, Sanger, and Gould; in physics, Hawking, Penrose, Weinberg, Guth, and Krauss; in mathematics, Wiles, Witten, and Diaconis; in astrophysics, Sandage, Sagan, and deGrasse Tyson; in climate science, Hansen and many others; and in the information sciences, Moore, Knuth, and Berners-Lee–that we are in the midst of another intellectual revolution.  This intellectual revolution far outstrips both the Renaissance and the Enlightenment as periods of human achievement and advancement, if only because of the widespread availability of education, literacy, healthcare, and technology, as well as human diversity, which both accelerates and expands many times over the impact of each increment in knowledge.

When one realizes that both of those earlier periods of scientific and intellectual advance engendered significant periods of social, political, and economic instability, upheaval, and conflict, then the reasons for many of the conflicts in our own times become clear.  It was apparent to me then–and even more apparent to me now–that there will be a great overturning of the institutional, legal, economic, social, political, and philosophic ideas and structures that now exist as a result.  We are already seeing the strains in many areas.  No doubt there are interests looking to see if they can capitalize on or exploit these new alignments.  But for those overarching power structures that exert control, conflict, backlash, and eventual resolution are inevitable.

In this way Fukuyama was wrong in the most basic sense in his thesis in The End of History and the Last Man to the extent that he misidentified ideologies as the driving force behind the future of human social organization.  What he missed in his social “science” (*) is the shift to the empirical sciences as the nexus of change.  The development of analytical philosophy (especially American Pragmatism) and more scientifically-based modeling in the social sciences are only the start, but one can make the argument that these ideas have been more influential in clearly demonstrating that history, in Fukuyama’s definition, is not over.

Among the first shots over the bow from science into the social sciences have come works from such diverse writers as Jared Diamond (Guns, Germs, and Steel: The Fates of Human Societies (1997)) and Sam Harris (The Moral Landscape: How Science Can Determine Human Values (2010)).  The next wave will, no doubt, be more intense and drive further resistance and conflict.

The imperative of science informing our other institutions is amply demonstrated by two facts.

  1. On March 11, 2016, an asteroid large enough to extinguish a good part of all life on earth came within 19,900 miles of our planet’s center.  This was not as close, however, as the one that passed on February 25 (8,900 miles).  There is no invisible shield or Goldilocks Zone to magically protect us.  The evidence of previous life-ending collisions is more apparent with each new high resolution satellite image of our planet’s surface.  One day we will look up and see our end slowly but inevitably making its way toward us, unless we decide to take measures to prevent such a catastrophe.
  2. Despite the desire to deny that it’s happening, 2015 was the hottest year on record and 2016 thus far is surpassing it, providing further empirical evidence of the validity of Global Warming models.  In fact, the last four consecutive years are all among the hottest years on record (2014 was the previous record holder).  The outlier was 2010, another previous high, which is hanging in at number 3 for now.  2013 is at number 4 and 2012 at number 8.  Note the general trend.  As Jared Diamond has convincingly demonstrated, the basis of conflict and societal collapse is usually rooted in population pressures exacerbated by resource scarcity.  We are just about at the point of no return, given the complexity of the systems involved, and can only mitigate the inevitable–but we must act now to do so.

What human civilization does not want is to be on the wrong side of history in how it deals with these challenges.  Existing human power structures and interests would like to keep the scientific community within the box of technology–and no doubt there are still scientists who are comfortable staying within that box.

The fear regarding allowing science to move beyond the box of technology and general knowledge is its misuse and misinterpretation, usually by non-scientists–witness the reprehensible meme of Social Darwinism (which is neither social nor Darwinian).**  This fear is oftentimes stoked by people with a stake in controlling the agenda or in interpreting what science has determined.  Science’s contingent nature is also a point of fear.  While few major theories are completely overturned as new knowledge is uncovered, the very nature of revision and adjustment to theory is frightening to people who depend on, at least, the illusion of continuity and hard truths.  Finally, science puts us in our place within the universe.  If there are millions of planets that can harbor some kind of life, and a subset of those have the design space to allow for some kind of intelligent life (as we understand that concept), are we really so special after all?

But not only within the universe.  Within societies, if all humans have developed from a common set of ancestors, then our basic humanity is a shared one.  If the health and sustainability of an ecology is based on its biodiversity, then the implication for human societies is likewise found in diversity of thought and culture, eschewing tribalism and extreme social stratification.  If the universe is deterministic with only probability determining ultimate cause and effect, then how truly free is free will?  And what does this say about the circumstances in which each of us finds him or herself?

The question now is whether we embrace our fears, manipulated by demagogues and oligarchs, or embrace the future, before the future overwhelms and extinguishes us–and to do so in a manner that is consistent with our humanity and ethical reasoning.


Note:  Full disclosure.  As a senior officer concerned with questions of AI, cognition, and complex adaptive systems, I opened a short correspondence with Dr. Dennett about those subjects.  I also addressed what I viewed as his unfair criticism (being Dawkins’ Bulldog) of punctuated equilibrium, spandrels, and other minor concepts advanced by Stephen Jay Gould, offering a way that Gould’s concepts were well within Darwinian Theory, as well as being both interesting and explanatory.  Given that less complex adaptive systems that can be observed do display punctuated periods of rapid development–and also continue to have the vestiges of previous adaptations that no longer have a purpose–it seemed to me that larger systems must also do so, the punctuation being on a different time-scale, and that any adaptation cannot be precise given that biological organisms are imprecise.  He was most accommodating and patient, and this writer learned quite a bit in our short exchange.  My only regret was not to continue the conversation.  I do agree with Dr. Dennett (and others) on their criticism of non-overlapping magisteria (NOMA), as is apparent in this post.

You Know I’m No Good: 2016 Election Polls and Predictive Analytics

While the excitement and emotions of this past election work themselves out in the populace at large, as a writer on and contributor to the use of predictive analytics, I find the discussion about “where the polls went wrong” to be of most interest.  This is an important discussion, because the most reliable polling organizations–those that have proven themselves by being right consistently on a whole host of issues since most of the world moved to digitization and the Internet of Things in their daily lives–seemed to be dead wrong in certain of their predictions.  I say certain because the polls were not completely wrong.

For partisans who point to Brexit and polling in the U.K., I hasten to add that this is comparing apples to oranges.  The major U.S. polling organizations that use aggregation and Bayesian modeling did not poll Brexit.  In fact, there was one reliable U.K. polling organization that did note two factors:  one was that the trend in the final days was toward Brexit, and the other is that the final result was based on turnout, where greater turnout favored the “stay” vote.

But aside from these general details, this issue is of interest in project management because, unlike national and state polling, where there are sufficient numbers to support significance, at the micro-microeconomic level of project management we deal with very small datasets that expand the range of probable results.  This is not an insignificant point, and it has been made time and time again over the years, particularly regarding single-point estimates made from limited time-phased data in the absence of a general model that provides insight into the likeliest results.  This last point is important.
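To make the small-dataset point concrete, here is a minimal sketch of my own (the distribution parameters are illustrative assumptions, not project data) showing how the 95% confidence interval around an estimated mean widens as the number of observations shrinks.  This is the same effect that makes single-point estimates drawn from limited time-phased data so unreliable.

```python
# Illustrative sketch: the 95% confidence interval around a mean estimate
# widens sharply as the dataset shrinks. The "true" mean and standard
# deviation below are assumptions chosen only for demonstration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_mean, true_sd = 1.0, 0.15   # e.g., a performance index hovering near 1.0

for n in (5, 10, 30, 100, 1000):
    sample = rng.normal(true_mean, true_sd, size=n)
    se = sample.std(ddof=1) / np.sqrt(n)              # standard error of the mean
    half_width = stats.t.ppf(0.975, df=n - 1) * se    # 95% CI half-width
    print(f"n={n:5d}  estimate={sample.mean():.3f}  +/- {half_width:.3f}")
```

With only a handful of observations the interval is several times wider than with a thousand, which is why micro-level project data demands a model and not just the raw numbers.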

So let’s look at the national polls on the eve of the election according to RealClear.  IBD/TIPP Tracking had it Trump +2 at +/-3.1% in a four-way race.  LA Times/USC had it Trump +3, within its margin of error at the 95% confidence level, which essentially means tied.  Bloomberg had Clinton +3, CBS had Clinton +4, Fox had Clinton +4, Reuters/Ipsos had Clinton +3, and ABC/WashPost, Monmouth, Economist/YouGov, Rasmussen, and NBC/SM had Clinton +2 to +6.  The margin of error for almost all of these polls varied from +/-3% to +/-4%.

As of this writing Clinton sits at about +1.8% nationally; votes are still coming in and continue to confirm her popular vote lead, which currently stands at about 300,000 votes.  Of the polls cited, Rasmussen was the closest to the final result.  Virtually every other poll, however, except IBD/TIPP, was within the margin of error.

The projections that went wrong were those that aggregated national polls along with state polls, adjusted polling based on non-direct polling indicators, and/or then projected the chances of winning based on probable electoral vote totals.  This is where things broke down.

Among the most popular of these sites is Nate Silver’s FiveThirtyEight blog.  Silver established his bona fides in 2008 by picking winners with incredible accuracy, particularly at the state level, and subsequently in his work at the New York Times, which continued to prove the efficacy of data in predictive analytics in everything from elections to sports.  Since that time his considerable reputation has only grown.

What Silver does is determine the probability of an electoral outcome by using poll results that are transparent in their methodologies and that have a high level of confidence.  Silver’s was the most conservative of these types of polling organizations.  On the eve of the election Silver gave Clinton a 71% chance of winning the presidency. The other organizations that use poll aggregation, poll normalization, or other adjusting indicators (such as betting odds, financial market indicators, and political science indicators) include the New York Times Upshot (Clinton 85%), HuffPost (Clinton 98%), PredictWise (Clinton 89%), Princeton (Clinton >99%), DailyKos (Clinton 92%), Cook (Lean Clinton), Roth (Lean Clinton), and Sabato (Lean Clinton).

To understand what probability means in this context: these models combined bottom-up state polling, which tracks the electoral college, with national popular vote polling.  But keep in mind, as Nate Silver wrote over the course of the election, that even a 17% chance of winning “is the same as your chances of losing a “game” of Russian roulette”.  Few of us would take that bet, particularly since the result of losing the game is finality.
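For intuition, here is a rough, hedged sketch (my own simplification, not any forecaster’s published model) of how a polled lead and its margin of error translate into a “chance of winning”: treat the true margin as normally distributed around the polled margin and ask how much of the probability mass sits above zero.

```python
# Naive conversion of a polled lead into a "win probability": assume the
# true margin is normally distributed around the polled margin, with a
# standard deviation implied by the published margin of error.
# This is a simplification for intuition only; real forecast models add
# correlated state errors, undecided voters, and turnout adjustments.
from math import erf, sqrt

def win_probability(polled_margin, margin_of_error):
    sigma = margin_of_error / 1.96            # MoE is quoted at 95% confidence
    z = polled_margin / sigma
    return 0.5 * (1 + erf(z / sqrt(2)))       # P(true margin > 0)

# Example: a 3-point lead with a +/-3.5-point margin of error (illustrative)
print(f"{win_probability(3.0, 3.5):.0%}")     # roughly 95%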

Still, except for FiveThirtyEight, none of the probabilistic methods got it right, because none of the others left enough room for drawing the wrong chamber.  In fairness, the Cook, Rothenberg, and Sabato projections also left enough room to see a Trump win if the state dominoes fell right.

The places where the models failed were the states of Florida, North Carolina, Pennsylvania, Michigan, and Wisconsin.  In particular, even with Florida (result Trump +1.3%) and North Carolina (result Trump +3.8%), Trump would not have won if Pennsylvania (result Trump +1.2%), Michigan (result Trump +0.3%), and Wisconsin (result Trump +1.0%)–supposed Clinton firewall states–had not been breached.  So what happened?

Among the possible factors are the effect of FBI Director Comey’s public intervention, which came too close to the election to register in the polling; ineffective polling methods in rural areas (garbage in, garbage out); poor state polling quality; voter suppression, purging, and restrictions (among the battleground states this includes Florida, North Carolina, Wisconsin, Ohio, and Iowa); voter turnout and enthusiasm (apart from the effects of voter suppression); and the inability to peg which way the high number of undecided voters would break at the last minute.

In hindsight, the national polls were good predictors.  The sufficiency of the data in drawing significance, and the high level of confidence in their predictive power, are borne out by the final national vote totals.

I think that where the polling failed in the projections of the electoral college was in the inability to take into account non-statistical factors and selection bias, and in state poll models that probably did not accurately reflect the electorate in those states, given the lessons from the primaries.  Along these lines, I believe that if pollsters look at the demographics in the respective primaries they will find that both voter enthusiasm and composition provide the corrective to their projections.  Given these factors, the aggregators and probabilistic models should all have called the race too close to call.  I think both Monte Carlo and Bayesian methods in simulations will bear this out.
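As a gesture toward that claim, here is a minimal Monte Carlo sketch of my own.  Every number in it (the state margins, the error sizes, and the electoral vote split) is an illustrative assumption rather than actual polling data; the point is only to show how a shared, correlated polling error changes the picture relative to treating state errors as independent.

```python
# Toy Monte Carlo of a 2016-style "firewall" scenario with correlated errors.
# All numbers below are illustrative assumptions for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
firewall = {  # state: (assumed Clinton polling margin in points, electoral votes)
    "Pennsylvania": (2.0, 20),
    "Michigan":     (3.0, 16),
    "Wisconsin":    (5.0, 10),
}
# Assumed safe electoral votes for Clinton elsewhere, chosen so that she
# needs at least two of the three firewall states to reach 270.
safe_clinton_ev = 244

n_sims = 100_000
shared_error = rng.normal(0, 3.0, n_sims)   # correlated national/regional error
clinton_wins = 0
for i in range(n_sims):
    ev = safe_clinton_ev
    for margin, votes in firewall.values():
        state_error = rng.normal(0, 2.0)    # state-specific error
        if margin + shared_error[i] + state_error > 0:
            ev += votes
    if ev >= 270:                           # 270 electoral votes needed to win
        clinton_wins += 1

print(f"Simulated Clinton win probability: {clinton_wins / n_sims:.0%}")
```

With these particular made-up inputs the simulation typically lands somewhere in the low 80s percent range for Clinton; set the shared error’s standard deviation to zero, so that the state errors are independent, and the same margins produce a near-certainty of roughly 99 percent.  That gap is the room for error that the more confident aggregators left out.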

For example, as one who also holds a political science degree, I will put on that hat for a moment.  It is a basic tenet that negative campaigns depress voter participation, forcing voters to select the lesser of two evils (or lesser of two weevils).  Voter participation was down significantly due to an unprecedentedly negative campaign.  When this occurs, the most motivated base will usually determine the winner in an election.  This is why midterm elections are so volatile, particularly after a presidential win that causes a rebound of the opposition party.  Whether this trend continues with the reintroduction of gerrymandering is yet to be seen.

What all this points to from a data analytics perspective is that one must have a model to explain what is happening.  Statistics by themselves, while correct a good bit of the time, will cause one to be overconfident in a result based solely on the numbers, and simulations can give a false impression of solidity, particularly in a volatile environment.  This is known as reification.  It is a fallacious way of thinking.  Combined with selection bias and the absence of a reasonable narrative model–one that introduces the social interactions necessary to understand the behavior of complex adaptive systems–it will often produce invalid results.

The Revolution Will Not Be Televised — The Sustainability Manifesto for Projects

While doing stuff and living life (which seems to take me away from writing) there were a good many interesting things written on project management.  The very insightful Dave Gordon at his blog, The Practicing IT Project Manager, provides a useful weekly list of the latest contributions to the literature that are of note.  If you haven’t checked it out please do so–I recommend it highly.

While I was away Dave posted an interesting link on the concept of sustainability in project management.  Along those lines, three PM professionals have proposed a Sustainability Manifesto for Projects.  As Dave points out in his own post on the topic, it rests on three basic principles:

  • Benefits realization over metrics limited to time, scope, and cost
  • Value for many over value of money
  • The long-term impact of our projects over their immediate results

These are worthy goals, and no one needs me to rain on their parade.  I would like to see these ethical principles, which is what they really are, incorporated into how we all conduct ourselves in business.  But then there is reality–the “is” over the “ought.”

For example, Dave and I have had some correspondence regarding the nature of the marketplace in which we operate through this blog.  Some time ago I wrote a series of posts here, here, and here providing an analysis of the markets in which we operate both in macroeconomic and microeconomic terms.

This came in response to one of my colleagues making the counterfactual assertion that we operate in a “free market” based on the concept of “private enterprise.”  Apparently, such just-so stories are lies we have to tell ourselves to make the hypocrisy of daily life bearable.  But, to bring the point home, in talking about the concept of sustainability, what concrete measures will the authors of the manifesto bring to the table to counter the financialization of American business that has occurred over the past 35 years?

For example, the news lately has been replete with stories of companies moving plants from the United States to Mexico.  This despite rising and record corporate profits during a period of stagnating median working-class incomes.  Free trade and globalization have been cited as the cause, but this involves more hand waving and the invocation of mantras than analysis.  There have also been the predictable invocations of the Ayn Randian cult and the pseudoscience* of Social Darwinism.  Those on the opposite side of the debate characterize things as a morality play, with the public good versus greed being the main issue.  All of these explanations miss their mark, some more than others.

An article setting aside a few myths was recently published by Jonathan Rothwell at Brookings, which came to me via Mark Thoma’s blog: “Make elites compete: Why the 1% earn so much and what to do about it”.  Rothwell looks at the relative gains of the market over the last 40 years and finds that corporate profits, while doing well, have not been the driver of inequality that Robert Reich and other economists would have them be.  In looking at another myth, one promulgated by Greg Mankiw, he finds that the reward for one’s labors is not related to any special intelligence or skill.  On the contrary, one’s entry into the 1% is actually related to which industry one chooses to enter, regardless of all other factors.  This disparity is known as a “pay premium”.  As expected, petroleum and coal products, financial instruments, financial institutions, and lawyers are at the top of the pay premium.  What is not, against all expectations of popular culture and popular economic writing, is the IT industry–hardware, software, etc.  Though they are the poster children of new technology, Bill Gates, Mark Zuckerberg, and others are the exception to the rule in an industry that is marked by a 90% failure rate.  Our most educated and talented people–those in science, engineering, the arts, and academia–are poorly paid, with negative pay premiums associated with their vocations.

The financialization of the economy is not a new or unnoticed phenomenon.  Kevin Phillips, in Wealth and Democracy, which was written in 2003, noted this trend.  There have been others.  What has not happened as a result is a national discussion on what to do about it, particularly in defining the term “sustainability”.

For those of us who have worked in the acquisition community, the practical impact of financialization and de-industrialization have made logistics challenging to say the least.  As a young contract negotiator and Navy Contracting Officer, I was challenged to support the fleet when any kind of fabrication or production was involved, especially in non-stocked machined spares of any significant complexity or size.  Oftentimes my search would find that the company that manufactured the items was out of business, its pieces sold off during Chapter 11, and most of the production work for those items still available done seasonally out of country.  My “out” at the time–during the height of the Cold War–was to take the technical specs, which were paid for and therefore owned by the government, to one of the Navy industrial activities for fabrication and production.  The skillset for such work was still fairly widespread, supported by the quality control provided by a fairly well-unionized and trade-based workforce–especially among machinists and other skilled workers.

Given the new and unique ways judges and lawyers have applied privatized IP law to items financed by the public, such opportunities to support our public institutions and infrastructure, as I once could, have been largely closed off.  Furthermore, the places to send such work, where possible, have also grown vanishingly few.  Perhaps digital printing will be the savior for manufacturing that it is touted to be.  What it will not do is stitch back the social fabric that has been ripped apart in communities hollowed out by the loss of their economic base, which, when replaced, comes with lowered expectations and quality of life–and often shortened lives.

In the end, though, such “fixes” benefit a shrinking few individuals at the expense of the democratic enterprise.  Capitalism did not exist when the country was formed, despite the assertions of polemicists who link the economic system to our democratic government.  Smith did not write his pre-modern scientific tract until 1776, much of what it meant lay years off into the future, and its relevance, given what we’ve learned over the last 240 years about human nature and our world, is up for debate.  What was not part of such a discussion back then–and would not have been understood–was the concept of sustainability.  Sustainability in the study of healthy ecosystems usually involves the maintenance of great diversity and the flourishing of life that denotes health.  This is science.  Economics, despite Keynes and others, is still largely rooted in 18th and 19th century pseudoscience.

I know of no fix or commitment to a sustainability manifesto–one that includes global, environmental, and social sustainability–that makes this possible short of a major intellectual, social, or political movement willing to make a long-term commitment to incremental, achievable goals toward that ultimate end.  Otherwise it’s just the mental equivalent of camping out in Zuccotti Park.  The anger we note around us during this election year of 2016 (our year of discontent) is a natural human reaction to the end of an idea that has outlived its explanatory power and, therefore, its usefulness.  Which way shall we lurch?

The Sustainability Manifesto for Projects, then, is a modest proposal.  It may also simply be a sign of the times, albeit a rational one.  As such, it leaves open a lot of questions, and most of these questions cannot be addressed or determined by the people at whom it is targeted: project managers, who are usually simply employees of a larger enterprise.  People behave as they are treated–according to the incentives and disincentives presented to them, oftentimes not completely apparent at the conscious level.  Thus, I’m not sure this manifesto hits its mark, or even the right one.

*This term is often misunderstood by non-scientists.  Pseudoscience means non-science, just as alternative medicine means non-medicine.  If any of the various hypotheses of pseudoscience are found true, given proper vetting and methodology, that proposition would simply be called science.  Just as alternative methods of treatment, if found effective and consistent, given proper controls, would simply be called medicine.

Legitimacy and the EU Democratic Deficit

Turning to political science again, Kevin O’Rourke has an important article regarding the democratic deficit and types of legitimacy in Critical Quarterly, particularly in light of the events surrounding the Greek crisis.  He cites the late political scientist Peter Mair’s book, Ruling the Void, as providing a solid framework for understanding what is happening in Europe, and to some extent within all democracies as a result of wealth and power concentration among an increasingly transnational economic elite.

The issue that O’Rourke tackles, based on Mair’s insights, is one of democratic legitimacy.  For economists and financiers who seem to have (I would argue) taken an illegitimately outsized role in determining what is good for Greece, even if Greece disagrees, the dichotomy here seems to be between what has been called input vs. output legitimacy.  I understand what he is saying here, but in political science “legitimacy” is not the same as “democratic legitimacy” and, in the end, I think this is his point.

O’Rourke, an economist himself, tackles how using this argument, particularly in regard to output legitimacy, has been hijacked so that concerns about distribution have been stripped out of the definition by the application of technocrat-speak.  I have a backlog of items for the Devil’s Phraseology from “Structural Reform” to other euphemisms for, essentially, screwing working people over, especially right now if they are Greek, Italian, Spanish, or Irish.

His article is important in tracing the subtle transformation of legitimacy over time.  For those unfamiliar with this terminology, legitimacy in this sense–if you remember nothing else but your Lincoln or Jefferson–in democratic societies is properly derived by the people.  This concept, which can be measured on the input side, is reinforced by processes and institutions that support it.  So clean elections which seek to maximize participation of the adult population; freedoms that support human rights, particularly those concerning speech, free association, and free movement; institutions that are largely responsive to the democratic will but which possess limitations to prevent the denial of human rights; and an independent judiciary that metes out justice based on due process; the absence of corruption, undue influence, unequal treatment, or graft in these institutions, etc. are all indicators of “legitimacy.”  In the context of the European debate this is known as “input” legitimacy.

Then there is “output” legitimacy.  This is the type of legitimacy on which the EU rests, since it obviously–especially since the last Greek referendum on the terms of the Troika’s terms–doesn’t seem to be based on any kind of “input” legitimacy.  Here legitimacy is based on a utilitarian measure–the ability of the EU to raise the aggregate euro value at any one time.  This is the “rising tide lifts all boats” trope.  Nice imagery, what with the JFK connection and all, but the rules of the game and economic environment have changed since 1963 to the extent that the analogy no longer applies.  A rising tide lifts all boats only if everyone has a stake in the tide rising.  Feel free to add any additional analogies now that we are beginning to understand the effect of rising tides on coastal cities as the earth warms.  An actual rising tide certainly didn’t do anyone in NOLA’s Lower Ninth and Lakeside neighborhoods any favors, but we do know that it impacted people residing in different economic strata differently.

Furthermore, output legitimacy as a utilitarian project sounds a lot like “we made the trains run on time”.  And it wasn’t all that long ago that more than half of Europe suffered under authoritarian regimes.  Output legitimacy, I would argue, is by definition the opposite of democratic legitimacy, not one of two types of democratic legitimacy.  As O’Rourke points out, one cannot take politics out of policy, so the way in which decisions are made is important in defining the type and level of legitimacy.

Post-1989 generations have not had to come to terms with the fact that even oppressive regimes can possess political legitimacy sufficient for them to survive.  From an historical standpoint, all of those German people in the streets chanting “Heil Hitler” weren’t doing so at gunpoint.  The block captains and the others who denounced family members in the old Eastern Bloc countries largely acted independently and voluntarily.  Many Russians today pine for the days under the old Soviet Union and have a leader in Putin who channels that nostalgia.  Autocratic and authoritarian regimes simply possess legitimacy through institutions and processes that are more restrictive than those found in democratic societies, but which rest on elites, centers of power, and pluralities that allow them to function.

Thus, whether the EU will admit it publicly or not, one need only do a Google search to see that this is a generally recognized issue that the European countries seem unwilling or unable to address.  The recent charging of Yanis Varoufakis, the former Greek finance minister, with treason at the instigation of Greek and European elites raises the ante and strips away whatever veil remained to hide the anti-democratic roots of the Greek crisis.  Apparently the 60% of the Greek people who voted “No” to the Troika were also traitors.

That this is happening in Greece is also problematic due to its geographical location in the eastern Mediterranean and its fairly recent transition to democratic processes and institutions.  De-legitimization of democracies is an all too familiar event in the history of the European continent and can only lead to radicalization, especially given the pain being inflicted on the Greek people.  What Europe’s technocrats have done is take an economic recession and market failure–one that could have been ameliorated and solved by applying the lessons learned through hard experience in the 1930s and immediately following the Second World War–reject those remedies, and, through obstinacy, tyrannical actions, corruption, and greed, create a political and economic disaster that threatens the legitimacy of the EU.

Time to reform the reformers.

Welcome to the Hotel (Euro) — You Can Vote “Oxi” Anytime you Like but you Can Never Leave

The recent events in Greece and their ramifications for the European project have been the subject of a good many blogs and news articles lately.  From an economic perspective the most noteworthy are those by Paul Krugman, Brad DeLong, Dean Baker, Yanis Varoufakis, who was on the ground as Greece’s Finance Minister, and Joseph Stiglitz, among others.

If one were to read the news in the manner of the U.S. press through its sources of record–The New York Times, Wall Street Journal, and Washington Post, not to mention the major news networks with CNN thrown in (avoiding avowedly polemical sources like Fox, MSNBC, and the Huffington Post)–one would think that the Greek issue is one caused by a profligate country that borrowed a bit too much and allowed Greeks to live beyond their means.  Nothing could be further from the truth.

The bottom line is that Greece and the EU decided to bail out the banks and investors who crossed the line by investing in junk paper, and to do so with public funds.  Sound familiar?  Think a Eurozone TARP.  But in the case of the EU the banks and bad-paper investment houses–the inmates in this scenario–run the asylum.  With the constant drumbeat from our own oligarchs we have become as a people brainwashed to think that investors and creditors have a right to their money.  Our own more draconian bankruptcy laws, imposed by the last financial industry-tainted Congress, institutionalized many of these attitudes in law.  But this has not always been the case, and it is not part of our legal or economic traditions.  It is certainly not anything like what Adam Smith had in mind.

The operational term in this case is “moral hazard.”  The rhetoric of the moneyed interests and their armies of lawyers have deftly tried to invert the meaning of the concept, but as Steve Waldman clearly explains in his excellent interfluidity blog, “moral hazard” is a concept that overwhelmingly falls on investors and creditors.  It means, quite simply, that you as an investor are responsible for where you put your money at risk–and that risk includes it being completely lost.  There are no guarantees.  This was the original rationale of Glass-Steagall: it was accepted that regular working people don’t have the information, time or resources to place their funds, which are meant for savings, under moral hazard.  Same goes for things like the Social Security Trust Fund.  Play with your own “play” money if you have it, but guaranteed deposits and retirement pensions are off-limits since they are backed by the sovereign currency.  Seed corn is not open to being manipulated by cheap paper–that is, until Glass-Steagall was repealed.

The European condition is a bit more complicated only because the EU has never had a consistent separation between its financial elites and civic institutions, given the differences in national traditions and political systems.  But one should be able to clearly see the connection between what is happening in Europe within the EU and in other places around the world: the attack on democratic legitimacy by oligarchs and economic elites.

As Joe Stiglitz points out in the post cited above, Greece–emerging from years of autocratic rule and third world-like conditions–was doing quite well economically until the financial bubble burst across the developed western world.  Many of the banks that invested in hyper-inflated Greek real estate and other development projects were situated in other countries.  The EU under the Euro project is a currency union, and under the Maastricht Treaty that formed this union there were some economic harmonization rules required, mostly pushed by Germany and France, but there is no lender of last resort, no central banking authority equivalent to our Federal Reserve, no centralized budget authority, and no political or cultural ties that would keep old ethnic or nationalist conflicts from flaring up in a crisis.  As Waldman explains, what these other countries did–in particular Germany–was to bail out the banks and investment houses, making the debt on these bad investments public obligations.  This sleight of hand politicized what otherwise would simply have been written off as bad investments.  If the Germans wanted to have their own TARP they should have done so.  But it was so much easier to push the consequences onto the Greeks, given their weaker position in the EU.

Jared Bernstein, in his Washington Post article following the Greek “no” vote, quoted an unnamed German economist asserting: “How do you think the people of Manhattan would like bailing out Texas?”  As Krugman rejoined upon reading the article, Manhattan (and other areas of the country) do that all the time as a matter of course.  It was done during the Savings & Loan crisis, largely a Texas affair, back in the late 1980s.  Anyone who looks at the net balance of federal tax payments and expenditures by state can see that the southeastern states–in particular those that made up the old Confederacy, including Texas–get more in federal benefits than they pay in.  To Americans this is not a big deal–and my use of the term American to identify my countrymen is at the heart of the question.  I don’t know anyone who in reality is a Floridian.  Only buffoons and idiots identify themselves as Texans over their identity as Americans.

Here we tend to put resources where they are needed, hence the United States of America.  More than two hundred years involving waves of immigrants, over one hundred and fifty years of increasing population mobility, and two major world wars and a cold one–two of these existential in nature–during the 20th century, not to mention 9-11, have militated against the old regionalism.  It is not surprising that an assertion displaying such deep ignorance of our own system and society would come from a German economist.  I intend no insult by this.

When I was on active duty as a young U.S. Naval officer I met a Scottish couple in Spain who worked at the U.K. embassy there.  They were amazed by my nonchalance in moving my family from California to a home base in Virginia as part of my assignment.  “Do you now identify yourself as a Virginian?” they asked.  When I explained that–no–it was all part of the U.S., they explained that they would always identify themselves as Scots, and that within Scotland people associated themselves with a particular village or region.  This was almost 30 years ago, and I am told that such attitudes are changing, but it does point to a weakness in the “European” project, especially given that in the most recent U.K. parliamentary elections the Scottish National Party was overwhelmingly elected to the House of Commons.

Given my own expertise in history and political science, my concern is directed more to the consequences of Greece capitulating to the EU’s economically and politically disastrous demands.  Just ten days ago 60% of the Greek people voted against the conditions imposed by the EU, yet their parliament just approved a package that is punitive by any reasonable and objective measure.  Even the IMF has belatedly–and in a largely dishonest manner which I can only explain as some type of EU face-saving approach–clearly stated that the conditions imposed are unsustainable.

The role of Germany is certainly one of the main issues in this situation.  Given the way they handled the bad paper of their bankers, Merkel and her party have backed themselves into a corner.  So they have done what all desperate politicians do–they have demonized the Greeks.  This is all for mercenary purposes, of course, and without consideration for the long-term consequences for both the Greek people and the EU.  What they have done is expose the contradictory fault lines in the entire “European” project.  German Finance Minister Schäuble, by attempting to co-opt the Greek threat of a Euro exit by making such terms seem disastrous, has inadvertently made such an exit virtually inevitable.  Greece, not wanting to be left out of “Europe,” has just voted against its own best interests, its government never really having had a strategy for a “Grexit” because it assumed that its negotiating partners were both rational and well-meaning.  The government very well may fall as a result.

For what the Greek crisis has shown is that the European project under the Euro is neither a completely democratic one nor is it “European.”  The elites in Brussels certainly felt that they had no obligation to consider the Greek referendum on the bailout terms.  To them only the banks, the oligarchs, and the political survival of the political parties in the main assemblies of the nations that support them matter.  The “democratic deficit” of the EU, in the words of the late historian Tony Judt, and the damage that it can cause, are now on full display.  It is not yet clear what will happen, given the contradictory impulses of countries wanting to stay within the single market that “Europe” affords them, the cultural and psychological pull of being part of the project, and the punishing loss of national autonomy and democratic legitimacy that is the price to be paid (aside from the economic depression and poverty imposed by the EU as the Greeks comply with the conditions placed on them).

One final note:  I can’t help but be impressed by the ideological arguments being used as a matter of course for “helping” the Greek people in the long run.  As John Maynard Keynes noted, in the long run we are all dead.  The tremendous amount of suffering imposed by the EU on the Greek people for their own long-term good sounds much like the fabrications of the old Communists of the former Eastern Bloc countries, who inflicted all sorts of horrors on their own populations for the “long term” good of reaching the perfect socialist state.  Now such arguments are deployed in favor of the perfect capitalist state.  It is “reform” turned on its head, like “moral hazard.”


Upper Volta with Missiles — Overreach, Putin, and the Russian Crash

Starting out the new year with some additional notes on international affairs.

The reference in the title is from a comment former German Chancellor Helmut Schmidt once made about the Soviet Union.  Of course, as Tony Judt noted in his magisterial book Postwar: A History of Europe Since 1945, there are those missiles.  Thus, this is a topic of concern to everyone, particularly in regard to the events surrounding Crimea and Ukraine.  This past April I noted the threat implicit in Putin’s actions and the need for European solidarity in opposing them in order to maintain the peace and stability of the region.  When combined with Russian violations of nuclear arms treaties, this is cause for concern.

Since April much has happened, including measured sanctions by the European Union and the United States, to prevent the Russian Federation from leveraging its economic power to gain an advantage over Ukrainian sovereignty.  In addition, the depressed state of the world economy, among other factors, has created an oil glut that has also reduced Russia’s ability to leverage its oil reserves against any countries that would oppose it.  As a result, the ruble has taken a hit and Russia has made all of the wrong moves to bolster its currency.

On the middle point, certain notable voices here in the United States have pointed to an increase in oil production as the main cause, but the numbers do not support this contention.  Instead, a combination of factors–alternative energy production, more efficient fuel consumption, and a drop in consumer demand–have conspired to, well, make the market behave as a market is supposed to behave.  Combine this with the refusal of major producers to reduce output to manipulate the market in order to prop up the price, and you have what commodities do most often–rise and fall on the whims of the demand of the moment.  I have no doubt that eventually the world economy will recover, but keep in mind that the very real threat of Global Warming will continue to drive societies to find alternatives to fossil fuel.  That is, provided they continue to recognize the existential threat that it poses to humanity (aside from the dysfunctional geopolitics that fossil fuels seem to drive).  In the meantime, seeing the handwriting on the wall, net exporters like Saudi Arabia have little incentive to reduce production when they can sell as much as possible and gain a larger share of the market against their competitors.

For the uninitiated like Fifth Column blogger Patrick Smith at Salon.com, who apparently only sees conspiracies and control of a kind that–well–actually exists in Putin’s Russia, this is known as market competition.  Nary a peep from Mr. Smith has emanated lately (or from our own right wing plutocrats) about the Russian oligarch being a statesman running rings around our democratically-elected U.S. president or his decorated former U.S. Navy officer (and later antiwar activist) Secretary of State.  Were it only possible for the state controlled Russian press to have the freedom to make such alternative observations of its own leadership in their country.  Okay–enough sarcasm for today, but I think I made my point: mendacity and irrationality make for strange bedfellows.

Along these lines some interesting insights about Putin’s Russia have come out in the book entitled Putin’s Kleptocracy: Who Owns Russia? by Karen Dawisha.  This is a brave undertaking given that a lot of critical writing about Russia, apart from the abolition of a free press there, has been taken down from websites.  This is not because of some mysterious ability on the part of Putin and his cronies but because of their immense international (until recently) financial power and the expensive lawyers that such money can buy.  Cambridge University Press, for example, because of the U.K.’s lax libel laws, declined to publish the book.  Thus, a U.S. publisher had to be found.  In addition, Russia has bought off columnists and politicians around the world to muddy the waters about the reality of the regime.  A very enlightening review of the book and the history surrounding it appears in The New York Review of Books by Washington Post and Slate columnist Anne Applebaum.

In summary, Dawisha's book demonstrates that during the period when Gorbachev was desperately attempting to reform a crumbling and inefficient system that had plodded along through the Brezhnev doldrums, KGB agents like Putin were moving Russian currency assets abroad into Europe with the intent of eventually using their economic leverage to retake the country once the hullabaloo blew over.  Thus, rather than a failing attempt at liberalization and democracy, what we see is the reinstitution of authoritarian rule after a brief respite.  The same corrupt elites that ran the old Soviet Union under central planning are now simply wearing capitalist oligarch clothing.  This probably explains why the Russian central bank is moving to bolster the ruble through higher interest rates, which will only exacerbate the economic collapse.  But the general welfare is not their concern.  It's all about the value of Russian reserves and the economic leverage that such value and power lends to control.

Globalization has made this a small world, but one still fraught with dangers.  For companies in my industry and for policymakers here in the United States, I would recommend establishing a wall of separation from companies--particularly technology companies in information systems--with ties to Russian oil and its oligarchs.

Real World — Normalization of Relations with Cuba

Family and holiday routine has interrupted regular blogging.  I’ve been working on new posts on project management and high tech that will shortly appear at AITS.org, as well as here.

In the meantime, much has happened in the world.  Among these events was the President's announcement on normalizing relations with Cuba.  The usual suspects have squeaked, but this policy has been a long time coming in sweeping away the last vestiges of the old Cold War.  As a student of both history and political science I cannot let this go by without some comment.

Those of us who served during the latter part of that long standoff known as the Cold War understand that, after the initial institution of containment of Soviet imperial ambitions, the Iron Curtain fell only after the implementation of policies such as West German Ostpolitik under Willy Brandt and greater U.S. engagement through rapprochement.

Low-level social and cultural contacts are more effective in bringing about change in oppressive regimes than isolation. An isolated people are more easily manipulated, which plays into the regime's hands by fostering paranoia, xenophobia, and social control. This is not just opinion but the result of numerous studies tracking the incremental changes that led to liberalization and liberation in central and Eastern Europe.

It is harder to blow up the world if one realizes and acknowledges the basic humanity of one’s adversaries. We had to learn that lesson anew after the nearly disastrous ramifications from FleetEx ’83 and the similarly foolish brinkmanship of Able Archer ’83. Both of these Reagan era exercises almost led to thermonuclear war. (I participated in the first, on an LST just off the Kamchatka Peninsula, as part of the greatest naval armada ever assembled).

I think Mr. Obama has once again proven himself clear-eyed and level-headed in changing a failed policy that nonetheless has managed to survive due to political intransigence and perceived electoral politics.  The repressive regime in Cuba is no less hesitant to fully embrace this change than our own extremists. I think that this alone is a good indication that this president has made the right decision.

Note:  The links for this post did not appear in the first version.  I have since refreshed and updated them.

Finding Wisdom — Marshall McLuhan


“Ours is the first age in which many thousands of the best-trained individual minds have made it a full-time business to get inside the collective public mind. To get inside in order to manipulate, exploit, control is the object now. And to generate heat not light is the intention. To keep everybody in the helpless state engendered by prolonged mental rutting is the effect of many ads and much entertainment alike.” – Marshall McLuhan, in the preface to The Mechanical Bride, 1951

One cannot fully comprehend modern human society without Marshall McLuhan, especially those of us who use the relatively new technologies born of the television, the personal computer, the smartphone, social media, political spin and manipulation, social control, advertising, and digitized systems.  With McLuhan, recent phenomena like Gamergate become intelligible.

He began as an earnest Canadian English teacher whose intellectual pursuits were influenced by the rise of new technologies--both historical and contemporaneous--that would soon transform the mediums of literature, art, and learning and become what is now known as popular culture and mass media.  Along the way he also found himself bound up in that popular culture and mass media which, like all artifacts of human narcissism, cannot help but be fascinated and thus flattered by those who study them.  Then for a while he was largely forgotten and ignored by those same artifacts of modern life, once the freshness of his ideas passed and it became apparent that his observations were simply that--observations, and not usually positive ones.

He introduced into the popular lexicon, via Dr. Timothy Leary, the phrase "turn on, tune in, drop out."  Commenting on advertising during a lunch the two had in New York City, McLuhan substituted a pitch for psychedelic drugs into the lyrics of a then-popular Pepsi jingle.  He is also remembered, at the height of his popularity, for his cameo in the Woody Allen movie "Annie Hall."

But more significantly, McLuhan is known for establishing the link between the modes of transmitting knowledge and the ways they influence the structures of the mind--how knowledge is viewed and used depending on the medium, and its effects, often unanticipated, on the individual and society.  His concepts have been summarized by the phrases "the medium is the message" and "the medium is the massage."  He was also the first to describe the manner in which the world is connected by various types of media, using the phrase "global village," and he anticipated the Internet we know today years before it became a fact, describing how it would significantly alter all means of human understanding.

Among the significant works in McLuhan’s canon are The Mechanical Bride: Folklore of Industrial Man (1951), The Gutenberg Galaxy: The Making of Typographic Man (1962), Understanding Media: The Extensions of Man (1964), which was the work that brought him fame and fortune in this country, The Medium is the Massage: An Inventory of Effects (with Quentin Fiore) (1967), War and Peace in the Global Village (with Quentin Fiore and Jerome Agel) (1968), From Cliché to Archetype (with Wilfred Watson) (1970), and the posthumous The Global Village: Transformations in World Life and Media in the 21st Century (with Bruce Powers) (1989).

My intent is not to delve deeply into McLuhan’s work.  There is a small McLuhan industry of academics in the world who both support and criticize his observations, as well as the interpretations of those observations.  The Wikipedia summary of McLuhan is excellent, as is the in-depth work of the McLuhan Galaxy blog here on WordPress.  There is also a website for his estate that has a wealth of information on his writings.

Instead, what I intend to do is summarize the essential wisdom and understanding in his work.  For it is apparent--and was apparent from the first time I picked up his anthology of media as an undergraduate student and news editor of my college newspaper in 1972--that the insights he provided constituted a deep understanding of the world that was to come, and that failing to understand that world--and the essential wisdom of what he observed about it--would spell disaster for many of us who care about the democratic ideal and the transmittal of knowledge.  To paraphrase one of Ray Bradbury's short story characters, a people who fail to grasp the future will soon find themselves overtaken by it.

McLuhan's approach, which would mark him both as a modernist and an unconventional analyst, began in The Mechanical Bride.  The quote found at the beginning of this post is from the preface to that work.  Here he addresses the rising popular culture, with its armies hired by corporations and political organizations all dedicated to manipulating the way people think.  The book is filled with advertisements, comics, and articles of the time related to the various essays, which are designed to be read in any order the reader decides.  His rhetorical position, in lieu of outrage or the tone of the reformer, is to use humor and amusement.  He uses the analogy of Edgar Allan Poe's character in the story "A Descent into the Maelstrom," who finds himself in the grip of a whirlpool from which he cannot escape and has no choice but to ride it out and use it for his own amusement.  Unable to reverse the new machinery of persuasion and manipulation, McLuhan takes the position of exposing the obvious motivations behind the content in the examples provided, and thereby makes the reader aware of what is being attempted.

He then moved on in The Gutenberg Galaxy, which was awarded Canada's highest literary prize for non-fiction, to look at the development of different mediums for transmitting information.  He traced the effect of the transformation from oral to print to visual mediums, like television, on human understanding, and anticipated new mass electronic media.  This came at a time, in the early 1960s, that saw a rapid expansion of literacy, the consumer acceptance of television, and the mass introduction of paperback books.  While the effect of television was just beginning to be realized (the "vast wasteland"), mass electronic media combining all of the capabilities of previous media were still the stuff of science fiction.  Yet McLuhan successfully identified the emerging computerization of data and its possible future role, characterizing it as the "global village."  It is also here that we find the first use of the term "surfing" to describe a means of navigating electronically to find information.  In the global village, unlike in the world of print, knowledge would become individualized and fragmented.

Unlike the world of phonetic written language based on movable type, the electronic global village would undermine the preciseness of language and understanding that print was able to enforce.  For McLuhan, the medium of books and other written forms created an individual relationship between author and reader that fostered--and made possible--such civilizing concepts as objective analysis, democracy, and individual rights.  Print moved the human species from merely tribal, mythical, and parochial concerns to those that transcended the old shackles on human understanding.  The effects of the electronic global village on cognition, he posited, would once again transform the world around us in unexpected ways, away from this level of stability.  Technology itself possesses no morality; rather, it shapes society's and the individual's self-conception.

Thus we are soon brought to his most popular and influential, if not fully coherent, work, Understanding Media.  Here we are introduced to the McLuhan Equation, summarized popularly by the phrase "the medium is the message"--a further development of the thesis regarding media that he advanced in The Gutenberg Galaxy.  This equation has been largely misunderstood, oftentimes in the most extreme ways, as positing that the content of the medium being used doesn't matter.  This is not true.  What McLuhan was observing, instead, was that content is a medium of its own, but that the manner in which it is conveyed has its own dynamism and effect.  The means of conveyance comes with its own message, one that may alter the way people think and learn and that will influence the way the content is received.

For example, I am sitting at my desk writing this post.  The medium transmitting it to you, the reader, is the web, accessed by your PC, laptop, tablet, or smartphone.  Having been raised and disciplined in an educational environment that requires focus, concentration, and constant fact-checking, I present my content in the form of the essay.  The medium in which my ideas are transmitted, however, undermines such discipline.

You may scan this post, mark it for further reading when you have a chance, and then move on to other things: looking at the weather for the upcoming Thanksgiving holiday, perhaps doing some shopping on-line in advance of the end-of-year holiday rush, taking a look at headline news--which nowadays is invariably selected, by whoever presents it and however they present it, to reinforce your personal worldview--and then continuing to surf for some other bit of information.  During this process you will be bombarded by ads and other forms of information designed to draw your attention.  The content of each of these items is different and will affect you in different ways, but the medium presents that information to you in a linear, immediate, and flat manner.  No attempt is made to filter much of that information for accuracy or significance.  The brain registers it all as if it had equal value.

To elaborate on these ideas he introduced the concepts of "hot" and "cool" media.  Hot media, such as books, movies, and lectures, extend a single sense in high definition and leave relatively little for the audience to fill in.  Cool media, like television and, in our own day, gaming and the internet, are lower in definition and demand more active participation from the user, often involving many senses.  His later works, which on the whole are less compelling but which contain many elements of significance and insight, elaborate on these foundations.  In particular, The Medium is the Massage describes the ability of different media to engage the user and massage the senses.  His additional speculations on the global village--in particular on the ways in which communication and propaganda would be used to justify war, and what such wars would look like--have proved prescient.

To wonder at social and political polarization, neo-Medieval forms of thinking, or the basest motivations of the human psyche reemerging under the effect of this technology--which reinforces individuation, alienation, and fragmentation--is to ignore the elephant in the room.  It is not that these forms of dysfunction failed to survive into the modern era; all mediums exist simultaneously.  It is that they have never before been transmitted, and never influenced human agency, so quickly, so effectively, and so widely.  This is also the crux of the issue regarding net neutrality and other questions of surveillance, behavioral advertising, and social control by both business and government.

Taken together with other contemporaneous and subsequent works--those of B. F. Skinner, Vance Packard's The Hidden Persuaders, and the more recent work in social psychology by Albert Bandura--McLuhan's writing provides a great deal of insight into how media both reflect the society at large and, at the same time, influence it.  We must be aware of the ways in which we are manipulated and influenced by those whose sole goal is to have us do their bidding.  The nature of democracy and human autonomy depends on it.

Highway to the (Neutral) Zone — Net Neutrality and More on Information Economics

Net Neutrality was very much in the news this week.  First, the President came out in favor of Net Neutrality on Monday.  Then later in the week the chair of the FCC, Tom Wheeler, looking like someone caught with his hand in the cookie jar, vacillated on how the agency sees the concept of Net Neutrality.  Some members of Congress have taken exception.

For those of us in the software business, the decision of the FCC will determine whether the internet, which was created by public investment, will be taken over and dominated by a few large corporations.  The issue isn't a hard one to understand.  Internet service providers, an area dominated by large telecommunications and cable oligopolies, would like to lay claim to the internet's bandwidth and charge for levels of access and speed.  A small business, a startup, any small enterprise would be stuck on a slower internet, while those with the financial resources could push their products and services into internet "fast lanes" by paying fees for the privilege, thereby gaining an advantage in visibility, raising the barriers to entry for would-be competitors, and defending market share.  Conceivably, since these companies often provide their own products or are aligned with other large companies both vertically and horizontally, there would be little to stop a provider from controlling all aspects of the information available to consumers, teachers, citizens, researchers--virtually anyone who accesses the internet, which is virtually everyone today.  Those who claim that such use of power is unlikely because Comcast et al. have committed themselves to the now-defunct 2010 rules apparently haven't read the fine print, are unfamiliar with recent history (such as Comcast's throttling of BitTorrent in 2007, Cox Cable's blocking of some downloading, and other similar examples), or haven't heard of Lord Acton.
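
To make the "fast lane" mechanism concrete, here is a minimal sketch, in Python, of how a fixed pool of bandwidth might be divided under neutral treatment versus paid prioritization.  The numbers, the 80 percent fast-lane reservation, and the function and flow names are my own invented illustration for this post, not a description of any actual ISP's traffic management.

def allocate_neutral(total_mbps, flows):
    # Neutral treatment: every flow gets an equal share of the pipe.
    share = total_mbps / len(flows)
    return {name: share for name in flows}

def allocate_paid_priority(total_mbps, flows, paid, fast_lane_fraction=0.8):
    # Paid prioritization: paying partners split a reserved "fast lane";
    # everyone else splits whatever capacity is left over.
    fast = [f for f in flows if f in paid]
    slow = [f for f in flows if f not in paid]
    allocation = {}
    fast_pool = total_mbps * fast_lane_fraction if fast else 0.0
    slow_pool = total_mbps - fast_pool
    for f in fast:
        allocation[f] = fast_pool / len(fast)
    for f in slow:
        allocation[f] = slow_pool / len(slow)
    return allocation

flows = ["incumbent_video", "startup_video", "small_business_site"]
print(allocate_neutral(100, flows))
# each flow gets roughly 33 Mbps
print(allocate_paid_priority(100, flows, paid={"incumbent_video"}))
# the paying incumbent gets 80 Mbps; the other two split the remaining 20

The point of the toy example is simply that once a reserved lane exists, the default shifts from equal treatment to pay-for-position.  The real-world mechanisms (interconnection fees, peering disputes, traffic shaping) are more complicated, but the economics point in the same direction.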

When combined with attacks on public investment in community broadband (also known as public high-speed internet) in cities and towns, we are seeing an orchestrated campaign by a few corporations not only to dictate the terms of the market but to control the market itself.  This is the classic definition of a corporate trust and monopoly.  It is interesting that those who constantly advocate for free, competitive markets are the first to move against them where they actually exist.

Jeffrey Dorfman at Forbes--to pick just one example--falls into this category, twisting logic into pretzels to make his argument.  He argues from analogies when we only have to point to conditions in the real world.  For example, I love the following statement:  "The key point that President Obama has missed along with all the rabid supporters of net neutrality is that ISPs and the companies that control the Internet backbone infrastructure that knits everything together do not have the power to pick winners and losers either. Consumers decide what products and services are successful because we adopt them. If an ISP blocks [Netflix] because of the bandwidth it requires, consumers who want Netflix will take their business elsewhere. If enough people do so, the ISP will have to change policies or go out of business."  Hmmm.  So in large swaths of the United States where there is only one ISP, how exactly will consumers choose Netflix or drive the ISP out of business?  What market mechanism or model applies to this scenario?  I cannot find in Samuelson or Friedman (or Smith, Ricardo, Keynes, Classical or neo-Classical economics, etc.)--or in a historical example, for that matter--a case where a company exerting monopoly power has been driven out of business by consumer preference for a product.  More to the point, if an ISP prevents a company like Netflix from providing its service over the internet backbone, how would consumers know about it in the first place, especially if the monopoly substitutes an equivalent service of its own?

But Mr. Dorfman's non sequiturs get better.  He follows up with this statement:  "As the former chief economist for the FCC, Thomas Hazlett, pointed out this week in Time, Facebook, Instagram, Twitter, LinkedIn, (and many, many more success stories of innovation) all emerged without the benefit of net neutrality."  Aside from committing the fallacy of argument from authority, he can't get his facts right.

The internet as we know it didn't really open up to commercial traffic until the mid-to-late 1990s.  The FCC articulated its first voluntary net neutrality principles in 2004, but the internet was still largely open, with many competing ISPs, well into the new century; net neutrality was thus largely a de facto condition.  In 2008 the FCC auctioned wireless spectrum with tight open-access rules, and followed this up with a broader set of requirements in 2010.  The 2010 rules did not apply fully to all ISPs, but they functioned pretty well.  It wasn't until 2014 that the courts struck the 2010 rules down.  Mr. Hazlett's cited "point," then, is factually inaccurate, since the companies he references did come into existence in an environment of de facto--partly voluntary and partly enforced--net neutrality.  What has changed is the use of the courts by corporations and revolving-door lawyers like Mr. Hazlett to undermine that condition.  What Mr. Hazlett would like to do is shut the door on new companies succeeding under the same set of rules as those earlier ones.

So what "net neutrality" is about is addressing a problem supported by concrete examples in which both the public interest and open-market principles were violated when the fences came down.  In the scenario that Mr. Dorfman constructs to defend corporate power, consumers don't get a vote--to indulge for a moment the ideologues' canard that consumers "vote" in the first place.  The market sets price.  Consumer preferences are shaped by factors outside the market, information being one of them.

As I noted in a previous blog post, research into the economics of information has revealed that it is a discipline with several unique characteristics, among them that information is easily transferable but requires some knowledge and an investment of time to determine its utility.  Combined with the insight of social scientist Martin Sklar that the capital investment required to replace the existing material conditions of civilization has been falling steadily, what we see is another explosion of technological innovation disrupting capital-intensive markets, with information technologies substituting for labor and processing.  And this is only the beginning.  A company need not be complacent to be overtaken--it need only be a little less agile, a bit more inflexibly structured.

All of that can be undermined, however, if a group or organization is able to control the means of obtaining and disseminating information.  This is why non-democratic regimes in the Middle East and China go to great lengths to control the internet backbone.  Here in the U.S., Comcast has argued that it doesn't want to undermine neutrality (with some important exceptions and a contrary history, by the way); it simply intends to take a percentage of what runs through the plumbing.  But, even ignoring the contradictory facts of its history, that stated intent is rent-seeking behavior.  All arguments to the contrary notwithstanding, Comcast--and the other ISPs and telecom giants--haven't hesitated to use both the courts and government power to increase their market power, and then to leverage the financial power that comes with that new advantage to still greater advantage.  The historical comparison to the 19th-century Robber Barons of the railroads is both accurate and instructive.  It's an old playbook.

It will be interesting to see what the FCC does, given that Mr. Obama appointed a telecommunications lobbyist to run an agency formed to rein in those very industries.  The proponents of undermining net neutrality have co-opted the term "innovation" to the point that it is meaningless unless you are a cable company or ISP looking for another fee-for-service scheme.  Apparently innovation is only important for those private companies that have the bucks.  Rarely, however, do those with the bucks want to see the next Buck Rogers pass them by--and that, my friends, is the crux of the issue.