Friday Hot Washup: Daddy Stovepipe sings the Blues, and Net Neutrality brought to you by Burger King

Daddy Stovepipe sings the Blues — Line and Staff Organizations (and how they undermine organizational effectiveness)

In my daily readings across the web I came upon this very well-written blog post by Glen Alleman at his Herding Cats blog. The eternal debate in project management centers on when done is actually done, and on what is the best measurement of progress toward completion of the end-item application.

Glen rightly points to the specialization among SMEs in the PM discipline, and to the differences between their methods of assessment. These centers of expertise are still aligned along traditional line and staff organizations that separate scheduling, earned value, systems engineering, financial management, product engineering, and other specializations.

I’ve written about this issue before: information also follows these stove-piped pathways, producing multiple data streams with overlapping information that nonetheless resist effective optimization and synergy because of the barriers between them. These barriers may be social or perceptual, and they then impose themselves upon the information systems that are constructed to support them.
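To make the stovepipe problem concrete, here is a minimal sketch in Python (the field names, figures, and systems are invented for illustration, not drawn from any particular toolset) of two overlapping data streams, one from a scheduling tool and one from an EVM system, that describe the same work packages but only reveal their disagreements once they are joined on a common key:

```python
import pandas as pd

# Hypothetical, simplified extracts from two stove-piped systems that
# describe the same work packages (field names and values are invented).
schedule = pd.DataFrame({
    "wbs": ["1.1", "1.2", "1.3"],
    "pct_complete": [100, 60, 20],     # scheduler's assessment of physical % complete
})
evm = pd.DataFrame({
    "wbs": ["1.1", "1.2", "1.3"],
    "bcwp": [50_000, 24_000, 5_000],   # budgeted cost of work performed (earned value)
    "bac":  [50_000, 40_000, 30_000],  # budget at completion
})

# Joining on the common key (the WBS element) exposes disagreements
# that remain invisible while each stream stays inside its own silo.
merged = schedule.merge(evm, on="wbs")
merged["evm_pct"] = 100 * merged["bcwp"] / merged["bac"]
merged["gap"] = merged["pct_complete"] - merged["evm_pct"]
print(merged)
```

The point is not the arithmetic but the join: each stream is internally consistent, and the disagreement only becomes visible once the silos are brought together against a common structure.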

The manner in which we face and interpret the world is the core basis of epistemology. When we develop information systems and analytical methodologies, whether we are consciously aware of it or not, we delve into the difference between justified belief and knowledge. I see the confusion of these positions in daily life and in almost all professions and disciplines. In fact, most of us find ourselves jumping from belief to knowledge effortlessly without being aware of this internal contradiction–and the corresponding reduction in our ability to accurately perceive reality.

The ability to overcome our self-imposed constraints is the key, but I think our PM organizational structures must be adjusted to allow for the establishment of a learning environment in relation to data. The first step in this evolution must be the mentoring and education of a discipline that combines these domains. This does not mean that any one individual need know everything about EVM, scheduling, systems engineering, and financial management. But the business environment today is such that, if the business or organization wishes to be prepared for the world ahead, it must train and transition personnel toward a multi-disciplinary project management competency.

I would posit, contrary to Glen’s recommendation, that no one discipline should claim to be the basis for cross-functional integration, if only because such a claim may be self-defeating. As David Easley and Jon Kleinberg of Cornell argue in their book Networks, Crowds, and Markets: Reasoning about a Highly Connected World, our social systems are composed of complex networks in which negative perceptions develop when the network is no longer in balance. This subtle and complex interplay of perceptions drives our ability to work together.
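For readers unfamiliar with the balance idea, here is a rough sketch in Python (my own illustration, with hypothetical PM specialties standing in for nodes, not an example from Easley and Kleinberg's text or from Glen's post): a triangle of signed relationships is conventionally considered balanced when the product of its signs is positive, as when two specialties cooperate while sharing a common adversary.

```python
from itertools import combinations

# Illustrative signed relationships among three (hypothetical) PM specialties:
# +1 = cooperative working relationship, -1 = adversarial one.
relations = {
    ("scheduling", "earned_value"): +1,
    ("scheduling", "systems_engineering"): -1,
    ("earned_value", "systems_engineering"): -1,
}

def sign(a, b):
    """Return the sign of the relationship between a and b, regardless of order."""
    return relations.get((a, b), relations.get((b, a)))

def is_balanced(x, y, z):
    """A signed triangle is balanced when the product of its edge signs is positive."""
    return sign(x, y) * sign(y, z) * sign(x, z) > 0

nodes = sorted({n for pair in relations for n in pair})
for triangle in combinations(nodes, 3):
    status = "balanced" if is_balanced(*triangle) else "unbalanced"
    print(triangle, status)
```

The classic balance results that Easley and Kleinberg build on (Heider, and Cartwright and Harary) predict that unbalanced configurations are unstable and tend to resolve toward balance, which is one way to read the friction between organizational silos.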

It also affects whether we will stay safe in the comfort zone of having our information systems tell us what we need to analyze, or whether we apply a more expansive view, leveraging new information systems that are able to integrate ever-expanding sets of relevant data to give us a more complete picture of what constitutes “done.”

Hold the Pickle, Hold the Lettuce, Special Orders Don’t Upset Us: Burger King explains Net Neutrality

The original purpose of the internet was the free exchange of ideas and knowledge. Initially, under ARPANET, first with Lawrence Roberts and later Bob Kahn, the focus was on linking academic and research institutions so that knowledge could be shared, resulting in collaboration that would overcome geographical barriers. Later the Department of Defense, NASA, and other government organizations highly dependent on R&D were brought into the new internet community.

To some extent there are still pathways within what is now broadly called the Web to find and share such relevant information with these organizations. With the introduction of commercialization in the early 1990s, however, it has become increasingly hard to perform serious research.

For with the expansion of the internet to the larger world, the larger world’s dysfunctions and destructive influences also entered. Thus the internet has transitioned from a robust First Amendment free-speech machine to a place that also harbors state-sponsored psy-ops and propaganda. It has gone from a safe space for academic freedom and research to a place of organized sabotage, intrusion, theft, and espionage. It has transitioned from a highly organized professional community that hewed to ethical and civil discourse to one that harbors trolls, prejudice, hostility, bullying, and other forms of human dysfunction. Finally, and most significantly, it has become dominated by commercial activity: by high-tech giants that stifle innovation, and by social networking sites that, applying an extreme laissez-faire attitude, allow the more dysfunctional activities found on the web as a whole to be magnified and spread.

At least, for those who still looked to the very positive effects of the internet, there was net neutrality: the assurance that blogs like this one and the many others that I read on a regular basis, along with mainstream news and scientific journals, were still available without being “dollarized,” in the words of the naturalist John Muir.

Unfortunately this is no longer the case, or perhaps will no longer be the case once the legal dust settles. Burger King has placed its marker down, and it is a relevant and funny one. Please enjoy, and have a great weekend.

 

Highway to the (Neutral) Zone — Net Neutrality and More on Information Economics

Net Neutrality was very much in the news this week.  First, the President came out in favor of Net Neutrality on Monday.  Then later in the week the chair of the FCC, Tom Wheeler, who looked like someone caught with his hands in the cookie jar, vacillated on how the agency sees the concept of Net Neutrality.  Some members of Congress have taken exception.

For those of us in the software business, the decision of the FCC will determine whether the internet, which was created by public investment, will be taken over and dominated by a few large corporations.  The issue isn’t a hard one to understand.  Internet service provision is an area dominated by large telecommunications and cable oligopolies that would like to lay claim to the internet’s bandwidth and charge for levels of access and internet speed.  A small business, a startup, any small enterprise would be stuck on a slower internet, while those with the financial resources would be able to push their products and services into internet “fast lanes” by paying fees for the privilege, gaining an advantage in visibility, raising the barriers to entry for would-be competitors, and defending market share.  Conceivably, since these companies often provide their own products or are aligned with other large companies both vertically and horizontally, there would be little to stop a provider from controlling all aspects of the information that is available to consumers, teachers, citizens, researchers, and virtually anyone who accesses the internet, which is virtually everyone today.  Those who claim that such use of power is unlikely because Comcast et al. have committed themselves to the now-defunct 2010 rules apparently haven’t read the fine print, are unfamiliar with recent economic history (such as Comcast’s throttling of BitTorrent in the late 2000s, Cox Cable’s blocking of some downloading, and other similar examples), or haven’t heard of Lord Acton.
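As a purely illustrative sketch (my own toy model in Python, not a description of any ISP's actual equipment or policy), strict priority queuing shows how a paid "fast lane" translates into delay for everyone who hasn't paid:

```python
import heapq

# Toy model of strict-priority scheduling at a congested link.
# Each packet: (priority, arrival_order, owner); lower priority number = "fast lane".
packets = []
order = 0
for owner, priority, count in [("paying_giant", 0, 8), ("startup", 1, 4)]:
    for _ in range(count):
        heapq.heappush(packets, (priority, order, owner))
        order += 1

# The link transmits one packet per tick; fast-lane traffic always goes first.
finish_tick = {}
tick = 0
while packets:
    tick += 1
    _, _, owner = heapq.heappop(packets)
    finish_tick[owner] = tick  # last tick at which this owner's traffic was sent

print(finish_tick)
```

Under this toy model the startup’s traffic only drains after the fast lane empties, which is exactly the visibility and latency disadvantage described above.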

When combined with attacks on public investment in community broadband (also known as public high-speed internet) in cities and communities, we are seeing an orchestrated campaign by a few corporations not only to dictate the terms of the market, but also to control the market itself.  This is the classic definition of a corporate trust and monopoly.  It is interesting that those who constantly advocate for free, competitive markets are the first to move against them where they do exist.

Jeffrey Dorfman at Forbes, to pick just one example, falls into this category, seemingly twisting logic into pretzels to make his argument.  He argues by analogy when we only have to point to the conditions in the real world.  For example, I love the following statement:  “The key point that President Obama has missed along with all the rabid supporters of net neutrality is that ISPs and the companies that control the Internet backbone infrastructure that knits everything together do not have the power to pick winners and losers either. Consumers decide what products and services are successful because we adopt them. If an ISP blocks [Netflix] because of the bandwidth it requires, consumers who want Netflix will take their business elsewhere. If enough people do so, the ISP will have to change policies or go out of business.”  Hmmm.  So in large swaths of the United States where there is only one ISP, how will consumers choose Netflix or drive the ISP out of business?  What market mechanism or model applies to this scenario?  I cannot find in Samuelson or Friedman (or Smith, Ricardo, Keynes, classical or neo-classical economics, etc.), or in a historical example for that matter, a case where a company exerting monopoly power has been driven out of business by consumer preference for a product.  More to the point, if an ISP prevents a company like Netflix from providing its service over the internet backbone, how would consumers know about it in the first place, especially if the monopoly substitutes its own equivalent service instead?

But Mr. Dorfman’s non-sequiturs get better.  He follows up with the following statement:  “As the former chief economist for the FCC, Thomas Hazlett, pointed out  this week in Time, Facebook, Instagram, Twitter, LinkedIn, (and many, many more success stories of innovation) all emerged without the benefit of net neutrality.”  Aside from committing the fallacy of argumentation from authority, he can’t get his facts right.

The internet as we know it really didn’t begin to come into its own and open up to commercial traffic until the mid-1990s.  The FCC articulated the first voluntary net neutrality principles in 2004, but the internet was still largely open, with many competing ISPs, well into the new century; thus net neutrality was largely a de facto condition.  In 2008 the FCC auctioned wireless spectrum with tight rules ensuring net neutrality and followed this up with a broader set of requirements in 2010.  These 2010 rules did not apply to all ISPs because of restrictions imposed by the courts, but they functioned pretty well.  It wasn’t until 2014 that the 2010 rules were themselves overturned by the courts.  Mr. Hazlett’s cited “point,” then, is factually inaccurate, since the companies he references did come into existence in an environment of de facto, partly voluntary and partly enforced, net neutrality.  What has changed is the use of the courts by corporations and revolving-door insiders like Mr. Hazlett to undermine that condition.  What Mr. Hazlett would like to do is shut the door to new companies succeeding under the same set of rules as those earlier ones.

So what “net neutrality” is about is addressing a problem for which there are concrete examples of both the public interest and open-market principles being violated when the fences came down.  In the scenario that Mr. Dorfman proposes in defense of corporate power, consumers don’t get a vote, to borrow the canard of ideologues that consumers “vote” to begin with.  The market sets price.  Consumer preferences are shaped by other factors outside of the market, information being one of those factors.

As I noted in a previous blog post, research into the economics of information has revealed that it is a discipline with several unique characteristics, among them that information is easily transferable but requires some knowledge and investment of time to determine its utility.  Combined with the insight of social scientist Martin Sklar that the capital investment required to replace the existing material conditions of civilization has been falling steadily, what we see happening is another explosion of technological innovation that disrupts capital-intensive markets, with information technologies substituting for labor and processing.  And this is only the beginning.  A company need not be complacent to be overtaken; it just needs to be a little less agile, a bit more inflexibly structured.

All of that can be undermined, however, if a group or organization is able to control the means of obtaining and disseminating information.  This is why non-democratic regimes in the Middle East and China go to great lengths to control the internet backbone.  Here in the U.S., Comcast has argued that it doesn’t want to undermine neutrality (with some important exceptions and a contrary history, by the way); it simply intends to take a percentage of what runs through the plumbing.  But, even ignoring the contradictory facts of its history, its stated intent is rent-seeking behavior.  All arguments to the contrary, Comcast and the other ISPs and telecom giants haven’t hesitated to use both the courts and government power to increase their market power, and then to leverage the financial power that comes with that new advantage to still greater advantage.  Historical comparisons to the 19th-century Robber Barons of the railroads are both accurate and instructive.  It’s an old playbook.

It will be interesting to see what the FCC does, given that Mr. Obama appointed a telecommunications lobbyist to run an agency formed to rein in those very industries.  The proponents of undermining net neutrality have co-opted the term “innovation” so that it is meaningless unless you are a cable company or ISP that can find another fee-for-service scheme.  Apparently innovation is only important for those private companies who have the bucks.  Rarely, however, do those with the bucks want to see the next Buck Rogers pass them by, and that, my friends, is the crux of the issue.