Back in the Saddle Again — Putting the SME into the UI Which Equals UX

“Any customer can have a car painted any colour that he wants so long as it is black.”  — Henry Ford, My Life and Work (written in collaboration with Samuel Crowther), 1922, page 72

The Henry Ford quote, which he made half-jokingly to his sales staff in 1909, is relevant to this discussion because the information sector has developed along the lines of the auto and many other industries.  The statement was only half-joking because Ford’s cars could, in fact, be had in three colors.  But in 1909 Henry Ford had found a massive market niche that would allow him to sell inexpensive cars to the masses.  His competition wasn’t so much other auto manufacturers, many of whom catered to the whims of the rich and more affluent members of society, as the main means of individualized transportation at the time–the horse and buggy.  Color mattered less to this market than simplicity and utility.

Since the widespread adoption of the automobile and the expansion of the market with multiple competitors, high-speed roadways, a more affluent society anchored by a middle class, and the impact of industrial and information systems development in shaping societal norms, the automobile consumer has, over time, become more varied and sophisticated.  Today’s automobiles offer a number of new features (and color choices)–from backup cameras, to blind-spot and back-up warning signals, to lane control, automatic headlight adjustment, and many other innovations.  Enhancements to industrial production that began with the introduction of robotics into the assembly line in the late 1970s and early 1980s, through to the adoption of Just-in-Time (JiT) and Lean principles in overall manufacturing, provide consumers with a multitude of choices.

We are seeing a similar evolution in information systems, which leads me to the title of this post.  During the first waves of information systems development and their introduction into our governing and business systems, the process was one in which software was developed to address an activity previously performed manually.  There would be a number of entries into a niche market (or, for more robustly capitalized enterprises, into an entire vertical).  The software would be fairly simplistic and the features limited, the objects (the way the information is presented and organized on the screen, the user selections, and the charts, graphs, and analytics allowed to enhance information visibility) well defined, and the UI (user interface) structured along the lines of familiar formats and views.

Including the input of the SME in this process, unless advice was specifically solicited, was considered both intrusive and disruptive.  After all, software development was largely an activity confined to a select and highly trained specialty involving sophisticated coding languages, one that required a good deal of talent to be considered “elegant”.  I won’t go into a definition of elegance here, which I’ve addressed in previous posts, but a short definition is this:  the fewest bits of code possible that both maximize computing power and provide the greatest flexibility for any given operation or set of operations.
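To make the idea concrete, here is a minimal sketch (in Python, with invented data and function names) of the difference between special-case code and a more elegant, generalized equivalent: the single generic function at the bottom does the work of every special-case report above it, and of any future grouping as well.

```python
# Special-case approach: one function per report, more code to maintain.
def total_cost_by_department(rows):
    totals = {}
    for row in rows:
        totals[row["department"]] = totals.get(row["department"], 0) + row["cost"]
    return totals

def total_cost_by_project(rows):
    totals = {}
    for row in rows:
        totals[row["project"]] = totals.get(row["project"], 0) + row["cost"]
    return totals

# Generalized approach: one small function covers both cases and any future one.
def total_by(rows, key, value="cost"):
    totals = {}
    for row in rows:
        totals[row[key]] = totals.get(row[key], 0) + row[value]
    return totals

rows = [
    {"department": "IT", "project": "Alpha", "cost": 100.0},
    {"department": "IT", "project": "Beta", "cost": 250.0},
    {"department": "HR", "project": "Alpha", "cost": 75.0},
]
print(total_by(rows, "department"))  # {'IT': 350.0, 'HR': 75.0}
print(total_by(rows, "project"))     # {'Alpha': 175.0, 'Beta': 250.0}
```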

This is no mean feat, and a great number of software applications are produced in this way.  Since the elegance of any line of code varies widely by developer and organization, the process of update and enhancement can involve a great deal of human capital and time.  Thus, the general rule has been that the more sophisticated the software application, the more effort it requires to change and, thus, the less flexibility it possesses.  Need a new chart?  We’ll update you next year.  Need a new set of views or user settings?  We’ll put your request on the road-map and maybe you’ll see something down the road.

It is not as if the needs and requests of users have always been ignored.  Most software companies try to satisfy the needs of their customers, balancing the demands of the market against available internal resources.  Software design websites, such as UXmatters in this article, have advocated for placing the SME (subject-matter expert) at the center of the design process.

The introduction of fourth-generation adaptive software environments–that is, systems that leverage underlying operating environments and objects such as .NET and WinForms, that are open to any data through OLE DB and ODBC, and that leave the UI open to simple configuration languages that place these underlying capabilities at the feet of the user–puts the SME at the center of the design process in practice.
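As a toy illustration of this configuration-over-code approach (the configuration schema and field names below are invented for the example, not drawn from any particular product), an SME might define a view in a plain configuration file and have the application render it without a developer touching the code:

```python
import json

# Hypothetical view definition an SME might edit directly: no recompile,
# no developer in the loop.  The schema is invented for illustration only.
view_config = json.loads("""
{
  "title": "Program Cost Overview",
  "columns": ["WBS", "BCWS", "BCWP", "ACWP"],
  "sort_by": "WBS"
}
""")

def render_view(config, data_rows):
    """Render a tabular view from configuration rather than hard-coded UI logic."""
    cols = config["columns"]
    rows = sorted(data_rows, key=lambda r: r.get(config["sort_by"], ""))
    print(config["title"])
    print(" | ".join(cols))
    for row in rows:
        print(" | ".join(str(row.get(c, "")) for c in cols))

sample = [
    {"WBS": "1.2", "BCWS": 200, "BCWP": 180, "ACWP": 210},
    {"WBS": "1.1", "BCWS": 100, "BCWP": 90, "ACWP": 95},
]
render_view(view_config, sample)
```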

This is a development in software as significant as the introduction of JiT and Lean in manufacturing, since it removes both the labor and time-intensiveness involved in rolling out software solutions and enhancements.  Furthermore, it goes one step beyond these processes by allowing the SME to roll out multiple software solutions from one common platform that is only limited by access to data.  It is as if each organization and SME has a digital printer for software applications.

Under this new model, software application manufacturers have a flexible environment to pre-configure the 90% solution to target any niche or market, allowing their customers to fill in any gaps or adapt the software as they see fit.  There is still IP involved in the design and construction of the “canned” portion of the solution, but the SME can be placed into the middle of the design process for how the software interacts with the user–and to do so at the localized and granular level.

This is where we transform UI into UX, that is, the total user experience.  So what is the difference?  In the words of Dain Miller in a Web Designer Depot article from 2011:

UI is the saddle, the stirrups, and the reins.

UX is the feeling you get being able to ride the horse, and rope your cattle.

As we adapt software applications to meet the needs of users, the SME can answer many of the questions that have vexed software implementations for years, such as user perceptions of and reactions to the software, real and perceived barriers to acceptance, and variations in levels of training among users, among others.  Flexible adaptation of the UI will allow software applications to be more successfully localized, not only to meet the business needs of the organization and the user, but to socialize the solution in ways that are still being discovered.

In closing this post a bit of full disclosure is in order.  I am directly involved in such efforts through my day job and the effects that I am noting are not simply notional or aspirational.  This is happening today and, as it expands throughout industry, will disrupt the way in which software is designed, developed, sold and implemented.

Over at AITS.org — Black Swans: Conquering IT Project Failure & Acquisition Management

It’s been out for a few days but I failed to mention the latest article at AITS.org.

In my last post on the Blogging Alliance I discussed information theory, the physics behind software development, the economics of new technology, and the intrinsic obsolescence that exists as a result. Dave Gordon in his regular blog described this work as laying “the groundwork for a generalized theory of managing software development and acquisition.” Dave has a habit of inspiring further thought, and his observation has helped me focus on where my inquiries are headed…

To read more please click here.

Super Doodle Dandy (Software) — Decorator Crabs and Wirth’s Law

[Photo: a decorator crab]

The song (absent the “software” part) in the title is borrowed from the soundtrack of the movie The Incredible Mr. Limpet.  Made in the days before Pixar and other recent animation technologies, it remains a largely unappreciated classic, combining photography and animation in a time of more limited tools, with Don Knotts creating another unforgettable character beyond Barney Fife.  Somewhat related to what I am about to write, Mr. Limpet taught the creatures of the sea new ways of doing things, helping them overcome their mistaken assumptions about the world.

The photo that opens this post is courtesy of the Monterey Aquarium and looks to be the crab Oregonia gracilis, commonly referred to as the Graceful Decorator Crab.  There are all kinds of Decorator Crabs, most of which belong to the superfamily Majoidea.  The one I most often came across and raised in aquaria was Libinia dubia, an east coast cousin.  You see, back in a previous lifetime I had aspirations to be a marine biologist.  My early schooling was based in the sciences and mathematics.  Only later did I gradually gravitate to history, political science, and the liberal arts–finally landing in acquisition and high tech project management, which tends to borrow something from all of these disciplines.  I believe that my former concentration of studies has kept me grounded in reality–in viewing life the way it is and the mysteries that are yet to be solved in the universe without resort to metaphysics or irrationality–while the latter concentrations have connected me to the human perspective in experiencing and recording existence.

But there is more to my analogy than self-explanation.  You see, software development exhibits much the same behavior as Decorator Crabs.

In my previous post I talked about Moore’s Law and the compounding (doubling) of processor power in computing every 12 to 24 months.  (It does not seem to be so much a physical law as an observation, and we can only guess how long this trend will continue.)  We also see a corresponding reduction in cost vis-à-vis this greater capability.  Yet, despite these improvements, we find that software often lags behind and fails to leverage this capability.

The observation that records this phenomenon is Wirth’s Law, which posits that software is getting slower at a faster rate than computer hardware is getting faster.  There are two variants of this law, one ironic and the other only slightly less so: the May and Gates variants.  Basically these posit that software speed halves every 18 months, thereby negating Moore’s Law.  But why is this?
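Taken literally, the arithmetic is sobering.  Here is a back-of-the-envelope sketch (assuming, for illustration only, a conservative 24-month doubling period for hardware and the 18-month halving of software speed posited above) of how quickly the gains get eaten:

```python
# Back-of-the-envelope arithmetic for Wirth's Law vs. Moore's Law.  The
# doubling and halving periods below are illustrative assumptions only.
months = 72  # six years
hardware_gain = 2 ** (months / 24)    # hardware doubles every 24 months
software_loss = 0.5 ** (months / 18)  # software speed halves every 18 months
net = hardware_gain * software_loss

print(f"Hardware capability: {hardware_gain:.1f}x")   # 8.0x
print(f"Software efficiency: {software_loss:.3f}x")   # 0.062x
print(f"Net perceived speed: {net:.2f}x")             # 0.50x -- the user lost ground
```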

For first causes one need only look to the Decorator Crab.  You see, the crab, all by itself, is a typical crab: an arthropod invertebrate with a hard carapace, spikes on its exoskeleton, a segmented body with jointed limbs, and five pairs of legs, the first pair usually bearing chelae (the familiar pincers and claws).  There are all kinds of crabs in salt, fresh, and brackish water.  They tend to be well adapted to their environment.  But they are also tasty and high in protein value, and thus have a number of predators.  So the Decorator Crab has determined that what evolution has provided is not enough–it borrows features and items from its environment to enhance its capabilities as a defense mechanism.  There is a price to being a Decorator Crab, though.  Encrustations also become encumbrances.  Where these crabs have learned to enhance their protections, for example by attaching toxic sponges and anemones, the enhancements may also have made them complacent: unlike most crabs, Decorator Crabs don’t tend to scurry from crevice to crevice, but walk awkwardly and more slowly than many of their cousins in the typical sideways crab gait.  This behavior makes them interesting, popular, and comical subjects in both public and private aquaria.

In a way, we see an analogy in the case of software.  In earlier generations of software design, applications were generally built to solve a particular challenge and mimicked the line and staff structure of the organizations involved–each designed to fit its environmental niche.  But over time, of course, people decide that they want enhancements and additional features.  The user interface, when hardcoded, must be adjusted every time a new function or feature is added.

Rather than rewriting the core code from scratch–which would require time- and resource-consuming reengineering and redesign of the overall application–modules, subroutines, scripts, etc. are added to the software to adapt it to the new environment.  Over time, the software takes on the characteristics of the Decorator Crab.  The new functions are not organic to the core structure of the software, just as the attached anemones, sponges, and algae are not organic features of the crab.  While they may provide the features desired, they are not optimized, tending to use brute-force computing power as the means of compensating for a lack of elegance.  Thus, the more power each generation of hardware provides, the less effective each enhancement release of software tends to be.

Furthermore, just as it takes more effort and intelligence to identify a crab that no longer looks like a crab, so too with software.  The greater the encrustation of features that attach themselves to an application, the greater the effort required to use those new features.  Learning the idiosyncrasies of the software is an unnecessary barrier to the core purposes of software–to increase efficiency, improve productivity, and improve speed.  It serves only one purpose: to increase the “stickiness” of the application within the organization so that it is harder to displace by competitors.

It is apparent that this condition is not sustainable–or acceptable–especially where the business environment is changing.  New software generations, especially Fourth Generation software, provide opportunities to overcome this condition.

Thus, as project management and acquisition professionals, the primary considerations that we must take into account are optimization of computing power and the related issue of sustainability.  This approach militates against complacency because it influences the software environment toward optimization.  Such an approach will also allow organizations to more fully realize the benefits of Moore’s Law.

Over at AITS.org — Maxwell’s Demon: Planning for Obsolescence in Acquisitions

I’ve posted another article at AITS.org’s Blogging Alliance, this one dealing with the issue of software obsolescence and the acquisition strategy that applies given what we know about the nature of software.  I also throw in a little background on information theory and the physical limitations of software as we now know it (virtually none).  As a result, we require a great deal of agility inserted into our acquisition systems for new technologies.  I’ll have a follow up article over there that provides specifics on acquisition planning and strategies.  Random thoughts on various related topics will also appear here.  Blogging has been sporadic of late due to op-tempo but I’ll try to keep things interesting and more frequent.

Brother Can You (Para)digm? — Four of the Latest Trends in Project Management

At the beginning of the year we are greeted with the annual list of hottest “project management trends” prognostications.  We are now three months into the year and I think it worthwhile to note the latest developments that have come up in project management meetings, conferences, and in the field.  Some of these are in alignment with what you may have seen in some earlier articles, but these are four that I find to be most significant thus far, and there may be a couple of surprises for you here.

a.  Agile and Waterfall continue to duke it out.  As the term Agile is adapted and modified to real world situations, the cult purists become shriller in attempting to enforce the Manifesto that may not be named.  In all seriousness, it is not as if most of these methods had not been used previously–and many of the methods, like scrum, also have their roots in Waterfall and earlier approaches.  A great on-line overview and book on the elements of scrum can be found at Agile Learning Labs.  But there is a wide body of knowledge out there concerning social and organizational behavior that is useful in sorting out what works and what doesn’t.  For example, the observational science behind span of control, team building, the structure of the team in supporting organizational effectiveness, and the use of sprints to avoid the perpetual death-spiral of adding requirements without defining “done” identifies best practices found in successful teams (depending on how you define success–keeping in mind that a successful team that produces the product often still fails as a going concern, and thus falls into obscurity).

All that being said, if you want to structure these best practices into a cohesive methodology, call it Agile, Waterfall, or Harry, and can make money at it while helping people succeed in a healthy work environment, all power to you.  In IT, however, it is this last point that makes this particular controversy feel like we’ve been here before.  When woo-woo concepts like #NoEstimates and self-organization are thrown about, the very useful and empirical nature of the enterprise gives way to magical thinking and ideology.  The mathematics of unsuccessful IT projects has not changed significantly since the shift to Agile.  From what one can discern from the so-called studies on the market, which are mostly anecdotal or based on unscientific surveys, somewhere north of 50% of IT projects fail, with failure defined as being behind schedule, over cost, or failing to meet functionality requirements.

Given this, Agile seems to be the latest belle of the ball, and virtually any process improvement introducing scrum, teaming, and sprints seems to get the tag.  Still, there is much blood and thunder being expended for a result that amounts to the same (and probably less than the) mathematical chance of success as a coin flip.  I think for the remainder of the year the more acceptable and structured portions of Agile will get the nod.

b.  Business technology is now driving process.  This trend, I think, is why process improvements like Agile that claim to be a panacea cannot deliver on their promises.  As best practices they can help organizations avoid a net negative, but they rarely provide a net positive.  Applying new processes and procedures while driving blind will still run you off the road.  The big story in 2015, I think, is the ability to handle big data and to integrate that data in a manner that more clearly reveals context to business stakeholders.  For years in A&D, DoD, governance, and other verticals engaged in complex, multi-year project management, we have seen the push and pull of interests regarding the amount of data that is delivered or reported.  With new technologies this is no longer an issue.  Delivering a 20GB file has virtually the same marginal cost as delivering a 10GB file.  Sizes smaller than 1GB aren’t even worth talking about.

Recently I heard someone refer to the storage space required for all this immense data–it’s immense, I tell you!  Well, storage is cheap, and large amounts of data can be accessed through virtual repositories using APIs and smart methods of normalizing data that require integration at the level defined by the systems’ interrelationships.  There is more than one way to skin this cat, and more methods for handling bigger data are coming on-line every year.  Thus, the issue is not more or less data, but better data, regardless of the size of the underlying file or table structure or the amount of information.  The first go-round of this process will require that all of the data already available in repositories be surveyed to determine how to optimize the information it contains.  Then, once the data is transformed into intelligence, the best manner of delivery must be determined so that it provides both significance and context to the decision maker.  For many organizations, this is the question that will be answered in 2015 and into 2016.  At that point it is the data that will dictate the systems and procedures needed to take advantage of this powerful advance in business intelligence.

c.  Cross-functional teams will soon morph into cross-functional team members.  As data originating from previously stove-piped competencies is integrated into a cohesive whole, the skillsets necessary to understand the data, convert it into intelligence, and act appropriately on that intelligence will begin to shift toward a broader, multi-disciplinary understanding.  Businesses and organizations will soon find that they can no longer afford the specialist who only understands cost, schedule, risk, or any one of the other specialties that were dictated by the old line-and-staff and division-of-labor practices of the 20th century.  Businesses and organizations that place short-term, shareholder, and equity holder interests ahead of the business will soon find themselves out of business in this new world.  The same will apply to organizations that continue to suppress and compartmentalize data.  This is because a cross-functional individual who can maximize the use of this new information paradigm requires education and development.  Achieving this goal dictates the establishment of a learning organization, which requires investment and a long-term view.  A learning organization develops its members to become competent in each aspect of the business, with development including successive assignments of greater responsibility and complexity.  For the project management community, we will increasingly see the introduction of more Business Analysts and, I think, the introduction of the competency of Project Analyst to displace–at first–both the cost analyst and the schedule analyst.  Other competency consolidation will soon follow.

d.  The new cross-functional competencies–Business Analysts and Project Analysts–will take on an increasing role in the design and deployment of technology solutions in the business.  This takes us full circle in our feedback loop that begins with big data driving process.  Organizations that have implemented the new technologies and are taking advantage of new insights are already introducing not only new multi-disciplinary competencies, but also new technologies that adapt the user environment to the needs of the business.  Once the business and project analyst has determined how to interact with the data and the systems necessary to the decision-making process that follows, adaptable technologies that reject the hard-coded “one size fits all” user interface are finding, and will continue to find, wide acceptance.  With fewer off-line and one-off utilities needed to fill the gaps left by inflexible hard-coded business applications, innovative approaches to analysis can be mainstreamed into the organization.  Once again, we are already seeing this effect in 2015, and the trend will only accelerate as greater technological knowledge becomes an essential element of being an analyst.

Despite dire predictions regarding innovation, it appears that we are on the cusp of another rapid shift in organizational transformation.  The new world of big data comes with both great promise and great risks.  For project management organizations, the key in taking advantage of its promise and minimizing its risks is to stay ahead of the transformation by embracing it and leading the organization into positioning itself to reap its benefits.

One-Trick Pony — Software apps and the new Project Management paradigm

Recently I have been engaged in an exploration and discussion regarding the utilization of large amounts of data and how applications derive importance from that data.  In an on-line discussion with the ever insightful Dave Gordon, I first postulated that we need to transition into a world where certain classes of data are open so that their qualitative content can be normalized.  This is what for many years was called the Integrated Digital Environment (IDE for short).  Dave responded with his own post at the AITS.org blogging alliance, countering that while such standards are necessary in very specific and limited applications, modern APIs provide most of the solution.  I then responded directly to Dave here, countering that IDE is nothing more than data neutrality.  Then, also at AITS.org, I expanded on what I proposed to be a general approach to understanding big data, noting the dichotomy between software approaches that organize the external characteristics of the data to generalize systems and note trends, and those that are focused on the qualitative content within the data.

It should come as no surprise then, given these differences in approaching data, that we also find similar differences in the nature of the applications found on the market.  With the recent advent of on-line and hosted solutions, there are literally thousands of applications in some categories of software that propose to do one thing with data–focused, one-trick-pony applications that can be mixed and matched to somehow provide an integrated solution.

There are several problems with this sudden explosion of applications of this nature.

The first is in the very nature of the explosion.  This is a classic tech bubble, albeit limited to a particular segment of the software market, and it will soon burst.  As soon as consumers find that all of that information traveling over the web with the most minimal of protections is compromised by the next trophy hack, or that too many software providers have entered the market prematurely–not understanding the full needs of their targeted verticals–it will hit like the last one in 2000.  It only requires a precipitating event that triggers a tipping point.

You don’t have to take my word for it.  Just type a favorite keyword into your browser now (and I hope you’re using a VPN when you do) for a type of application for which you have a need–let’s say “knowledge base” or “software ticket systems.”  What you will find is that there are literally hundreds if not thousands of apps built for this function.  You cannot test them all.  Basic information economics, however, dictates that you must invest some effort in understanding the capabilities and limitations of the systems on the market.  Surely there are a couple of winners out there.  But basic economics also dictates that 95% of those presently in the market will be gone in short order.  Being the “best” or the “best value” does not always win in this winnowing out.  Certainly chance, the vagaries of your standing in the search engine results, industry contacts–virtually any number of factors–will determine who is still standing and who is gone a year from now.

Aside from this obvious problem with the bubble itself, the approach of the application makers harkens back to an earlier generation of one-off applications that attempt to achieve integration through marketing while actually achieving, at best, only old-fashioned interfacing.  In the world of project management, for example, organizations can ill afford to revert to the division of labor, which is what would be required to align with these approaches in software design.  It’s almost as if, having made their money in an earlier time, software entrepreneurs cannot extend themselves beyond their comfort zones to take advantage of the last TEN software generations that provide new, more flexible approaches to data optimization.  All they can think to do is party like it’s 1995.

For the new paradigm in project management is to get beyond the traditional division of labor.  For example, is scheduling such a highly specialized discipline, rising to the level of a profession, that it is separate from all of the other aspects of project management?  Of course not.  Scheduling is a discipline–a sub-specialty actually–that is inextricably linked to all other aspects of project management in a continuum.  The artifacts of the process of establishing project systems and controls constitute the project itself.

No doubt there are entities and companies that still ostensibly organize themselves into specialties as they did twenty years ago: cost analysts, schedule analysts, risk management specialists, among others.  But given that the information from these systems–schedule, cost management, project financial management, risk management, technical performance, and all the rest–can be integrated at the appropriate level of their interrelationships to provide us a cohesive, holistic view of the complex system that we call a project, is such division still necessary?  In practice the industry has already moved to position itself for integration, realizing the urgency of making the shift.

For example, to utilize an application to query cost management information in 1995 was a significant achievement during the first wave of software deployment that mimicked the division of labor.  In 2015, not so much.  Introducing a one-trick pony EVM “tool” in 2015 is laziness–hoping to turn back the clock in ignoring the obsolescence of such an approach–regardless of which slick new user interface is selected.

I recently attended a project management meeting of senior government and industry representatives.  During one of my side sessions I heard a colleague propose the discipline of Project Management Analyst in lieu of previously stove-piped specialties.  His proposal is a breath of fresh air in an industry that develops and manufactures the latest aircraft and space technology, but has hobbled itself with systems and procedures designed for an earlier era that no longer align with the needs of doing business.  I believe the timely deployment of systems has suffered as a result during this period of transition.

Software must lead, and accelerate the transition to the new integration paradigm.

Thus, in 2015 the choice is not between data that adheres to conventions of data neutrality and data that is accessed via APIs; it is in favor of applications that do both.

It is not between different hard-coded applications that provide the old “what-you-see-is-what-you-get” approach.  It is instead between such limited hard-coded applications and those that provide flexibility, so that business managers can choose among a nearly unlimited palette of options for how and which data, converted into information, is made available to the user or classes of user based on their role and need to know, aggregated at the appropriate level of detail for the consumer to derive significance from the information being presented.
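A simplified sketch of what that role-based, aggregated delivery might look like follows; the roles, fields, and rules below are hypothetical and for illustration only.

```python
# A sketch of role-based data delivery: one set of underlying records,
# filtered and aggregated according to the consumer's role and need to know.
# Role names, fields, and rules here are hypothetical.
RECORDS = [
    {"program": "X", "control_account": "CA-01", "cost": 120.0, "risk_note": "supplier delay"},
    {"program": "X", "control_account": "CA-02", "cost": 80.0,  "risk_note": "none"},
    {"program": "Y", "control_account": "CA-07", "cost": 200.0, "risk_note": "test failure"},
]

ROLE_RULES = {
    "executive": {"group_by": "program", "fields": ["cost"]},  # rolled up, cost only
    "control_account_manager": {"group_by": "control_account", "fields": ["cost", "risk_note"]},
}

def view_for(role, records):
    rule = ROLE_RULES[role]
    out = {}
    for rec in records:
        key = rec[rule["group_by"]]
        entry = out.setdefault(key, {"cost": 0.0, "risk_note": []})
        entry["cost"] += rec["cost"]
        entry["risk_note"].append(rec["risk_note"])
    # Strip fields the role is not entitled to see.
    return {k: {f: v[f] for f in rule["fields"]} for k, v in out.items()}

print(view_for("executive", RECORDS))                # {'X': {'cost': 200.0}, 'Y': {'cost': 200.0}}
print(view_for("control_account_manager", RECORDS))  # per-control-account view with notes
```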

It is not between “best-of-breed” and “mix-and-match” solutions that leverage interfaces to achieve integration.  It is instead between such solution “consortiums”–which drive up implementation and sustainment costs and bring with them high overhead–and those that achieve integration by leveraging the source of the data itself, reducing the number of applications that need to be managed, allowing data to be enriched in an open and flexible environment, and achieving its transformation into useful information.

Finally, the choice isn’t among applications that save their attributes in a proprietary format so that customers must commit themselves to a proprietary solution.  Instead, it is between such restrictive applications and those that open up data access, clearly establishing that it is the consumer who owns the data.

Note: I have made minor changes from the original version of this post for purposes of clarification.

What did I miss over the holiday? — Dave Gordon at AITS.org

Great post by Dave Gordon and a valid point for anyone undertaking development:  determine what “done” looks like.  Understand that people and systems are not perfect, that version 1.0 doesn’t need to do everything.  Here is Dave for the rest:

“Perfectionists are sadomasochists. They are masochists, because they rarely reach perfection, and can’t maintain it for more than an instant when they do. So they are continually frustrated, in addition to being obsessed. And they are sadists, because they drive everyone around them to pursue the same silly goals that they obsess over….” Read more.

My Generation — Baby Boom Economics, Demographics, and Technological Stagnation

“You promised me Mars colonies, instead I got Facebook.” — MIT Technology Review cover over photo of Buzz Aldrin

“As a boy I was promised flying cars, instead I got 140 characters.”  — attributed to Marc Maron and others

I have been in a series of meetings over the last couple of weeks with colleagues describing the state of the technology industry and the markets it serves.  What seems to be a generally held view is that both the industry and the markets for software and technology are experiencing a hardening of the arteries and a resistance to change not seen since the first waves of digitization in the 1980s.

It is not as if this observation has not been noted by others.  Tyler Cowen at George Mason University noted the trend of technological stagnation in the eBook The Great Stagnation: How America Ate All the Low-Hanging Fruit of Modern History, Got Sick, and Will (Eventually) Feel Better.  Cowen’s thesis is not only that innovation has slowed since the late 19th century, but that it has slowed a lot, and that we have been slow to exploit the “low-hanging fruit.”  I have to say that I am not entirely convinced by some of the data, which is anything but reliable in demonstrating causation in the long-term trends.  Still, his observations of technological stagnation seem to be on the mark.  His concern, of course, is also directed at technology’s effect on employment, pointing out that, while making some individuals very rich, recent technological innovation doesn’t result in much employment.

Cowen published his work in 2011, when the country was still in the early grip of the slow recovery from the Great Recession, and many seized on Cowen’s thesis as an opportunity for excuse-mongering and looking for deeper causes than the most obvious ones: government shutdowns, wage freezes, reductions in the government R&D that is essential to private sector risk handling, and an austerian fiscal policy (with sequestration) in the face of weak demand created by the loss of $8 trillion in housing wealth, which translated into a consumption gap of $1.2 trillion in 2014 dollars.

Among the excuses that were manufactured is the meme, still making the rounds, about a jobs mismatch due to a skills gap.  But, as economist Dean Baker has pointed out again and again, basic economics dictates that the scarcity of a skill manifests itself in higher wages and salaries–a reality not supported by the data for any major job category.  Unemployment stood at 4.4 percent in May 2007 prior to the Great Recession.  The previous low between recession and expansion was the 3.9 percent rate in December 2000, yet we are to believe that suddenly, in the four years since the start of one of the largest bubble crashes and the resulting economic and financial crisis, people no longer have the skills needed to be employed (or suddenly are more lazy or shiftless).  The data do not cohere.

In my own industry and specialty there are niches for skills that are hard to come by, and these people are paid handsomely, but the pressure among government contracting officers across the board has been to drive salaries down–a general trend seen across the country and pushed by a small economic elite–and therein, I think, lies the answer, more than in some long-term trend tying patents to “innovation.”  The effect of this downward push is to deny the federal government–the people’s government–the ability to access the high-skills personnel needed to make it both more effective and responsive.  Combined with austerity policies, there is a race to the bottom in terms of both skills and compensation.

What we are viewing, I think, behind our current technological stagnation is a reaction to the hits in housing wealth, in real wealth and savings, in employment, and in the downward pressure on compensation.  Absent active government fiscal policy as the backstop of last resort, there are no other places to make up for $1.2 trillion in lost consumption.  Combine this with the excesses of the patent and IP systems, which create monopolies and stifle competition, particularly under the Copyright Term Extension Act and the recent Leahy-Smith America Invents Act.  Both of these acts have combined to undermine the position of small inventors and companies, encouraging the need for large budgets to anticipate patent and IP infringement litigation, and raising the barriers to entry for new technological improvements.

No doubt exacerbating this condition is the Baby Boom.  Since university economists don’t seem to mind horning in on my specialty (as noted in a recent post commenting on the unreliability of data mining by econometrics), I don’t mind commenting on theirs–and what has always surprised me is how Baby Boom economics never seems to play a role in understanding trends, nor as a predictor of future developments in macroeconomic modeling.  Wages and salaries, even given Cowen’s low-hanging fruit, have not kept pace with productivity gains (which probably explains a lot of wealth concentration) since the late 1970s–a time that coincides with the Baby Boomers entering the workforce in droves.  A large part of this condition has been a direct consequence of government policies–through so-called “free trade” agreements–that have exposed U.S. workers in industrial and mid-level jobs to international competition from low-paying economies.

The Baby Boomers, given an underperforming economy, saw not only their wages and salaries lag, but also their wealth and savings disappear with the Great Recession–when corporate mergers and acquisitions weren’t stealing their negotiated defined benefit plans, which they had received in lieu of increases in compensation.  This has created a large contingent of surplus labor.  The number of long-term unemployed, though falling, is still large compared to historical averages and is indicative of this condition.

With attempts to privatize Social Security and Medicare, workers now find themselves squeezed and under a great deal of economic anxiety.  On the ground I see this anxiety even at the senior executive level.  The workforce is increasingly getting older as people hang on for a few more years, perpetuating older ways of doing things. Even when there is a changeover, oftentimes the substitute manager did not receive the amount of mentoring and professional development expected in more functional times.  In both cases people are risk-averse, feeling that there is less room for error than there was in the past.

This does not an innovative economic environment make.

People who I had known as risk takers in their earlier years now favor the status quo and a quiet glide path to a secure post-employment life.  Politics and voting behavior also follows this culture of lowered expectations, which further perpetuates the race to the bottom.  In high tech this condition favors the perpetuation of older technologies, at least until economics dictates a change.

But it is in this last observation that there is hope for an answer, which does confirm that this is but a temporary condition.  For under the radar there are economies upon economies in computing power and the ability to handle larger amounts of data with exponential improvements in handling complexity.  Collaboration of small inventors and companies in developing synergy between compatible technologies can overcome the tyranny of the large monopolies, though the costs and risks are high.

As the established technologies continue to support the status quo–postponing needed overhauls of code mostly written 10 to 20 years ago (equivalent to 20 to 40 software generations)–their task, despite the immense amount of talent and money involved, is comparable to a Great Leap Forward, and those of you who are historically literate know how those efforts turned out.  Some will survive, but there will be monumental–and surprising–falls from grace.

Thus the technology industry, in many of its more sedentary niches, is due for a great deal of disruption.  The key for small entrepreneurial companies and thought leaders is to be there before the tipping point.  But keep working the politics too.

Take Me Out to the Ballgame — Tournaments and Games of Failure

“Baseball teaches us, or has taught most of us, how to deal with failure. We learn at a very young age that failure is the norm in baseball and, precisely because we have failed, we hold in high regard those who fail less often – those who hit safely in one out of three chances and become star players. I also find it fascinating that baseball, alone in sport, considers errors to be part of the game, part of its rigorous truth.” — Fay Vincent, former Commissioner of Baseball (1989-1992)

“Baseball is a game of inches.”  — Branch Rickey, Quote Magazine, July 31, 1966

I have been a baseball fan just about as long as I have been able to talk.  My father played the game and tried out for both the New York Giants (as they then were) and the Yankees–and was a pretty well known local hero in Weehawken back in the 1930s and 1940s.  I did not have my father’s athletic talents–he was a four-letter man in high school–but I was good at hitting a baseball from the time he put a bat in my hands, and so I played–and was sought after–into my college years.  Still, like many Americans who for one reason or another could not or did not pursue the game, I live vicariously through the players on the field.  We hold those who fail less often in the game in high regard.  Some of them succeed for many years and are ensconced in the Hall of Fame.

Others experienced fleeting success.  Anyone who watches ESPN’s or the YES Network’s classic games, particularly those from the various World Series, can see this reality in play.  What if Bill Buckner in 1986 hadn’t missed that ball?  What if Bobby Richardson had not been in perfect position to catch what would have been a game- and series-winning liner by Willie McCovey in 1962?  Would Brooklyn have ever won a series if Amoros hadn’t caught Berra’s drive down the left field line in 1955?  The Texas Rangers might have their first World Series ring if not for a plethora of errors, both mental and physical, in the sixth game of the 2011 Series.  The list can go on, and it takes watching just a few of these games to realize that luck plays a big part in who is the victor.

There are other games of failure that we deal with in life, though oftentimes we don’t recognize them as such.  In economics these are called “tournaments,” and much like their early Medieval predecessor (as opposed to the stylized late Medieval and Renaissance games), the stakes are high.  In pondering the sorry state of my favorite team–the New York Yankees–as I watched seemingly minor errors and failures cascade into a humiliating loss, I came across a blog post by Brad DeLong, distinguished professor of economics at U.C. Berkeley, entitled “Over at Project Syndicate/Equitable Growth: What Do We Deserve Anyway?”  Dr. DeLong makes the very valid point, verified not only by anecdotal experience but years of economic research, that most human efforts, particularly economic ones, fail, and that the key determinants aren’t always–or do not seem in most cases–to be due to lack of talent, hard work, dedication, or any of the attributes that successful people like to credit for their success.

Instead, much of the economy, which in its present form is largely based on a tournament-like structure, allows only a small percentage of entrants to extract their marginal product from society in the form of extremely high levels of compensation.  The fact that these examples exist is much like a lottery, as the following quote from Dr. DeLong illustrates.

“If you win the lottery–and if the big prize in the lottery that is given to you is there in order to induce others to overestimate their chances and purchase lottery tickets and so enrich the lottery runner–do you “deserve” your winnings? It is not a win-win-win transaction: you are happy being paid, the lottery promoter is happy paying you, but the others who purchase lottery tickets are not happy–or, perhaps, would not be happy in their best selves if they understood what their chances really were and how your winning is finely-tuned to mislead them, for they do voluntarily buy the lottery tickets and you do have a choice.”  — Brad DeLong, Professor of Economics, U.C. Berkeley

So even though participants have a “choice,” it is one that rests on an intricately established system of self-delusion.  It was about this time that I came across the excellent HBO series “Silicon Valley.”  The tournament aspect of the software industry is apparent in the conferences and competitions for both customers and investors in which I have participated over the years.  In the end, luck and timing seem to play the biggest role in success (apart from having sufficient capital and reliable business partners).

I hope this parody ends my colleagues’ (and future techies’) claims to “revolutionize” and “make the world a better place” through software.

I Can’t Get No (Satisfaction) — When Software Tools Go Bad

Another article I came across a couple of weeks ago that my schedule prevented me from highlighting was by Michelle Symonds at PM Hut entitled “5 Tell-Tale Signs That You Need a Better Project Management Tool.”  According to Ms. Symonds, among these signs are:

a.  Additional tools are needed to achieve the intended functionality apart from the core application;

b.  Technical support is poor or nonexistent;

c.  Personnel in the organization still rely on spreadsheets to extend the functionality of the application;

d.  Training on the tool takes more time than training for the job;

e.  The software tool adds work instead of augmenting or facilitating the achievement of work.

I have seen situations where all of these conditions are at work but the response, in too many cases, has been “well we put so much money into XYZ tool with workarounds and ‘bolt-ons’ that it will be too expensive/disruptive to change.”  As we have advanced past the first phases of digitization of data, it seems that we are experiencing a period where older systems do not quite match up with current needs, but that software manufacturers are very good at making their products “sticky,” even when their upgrades and enhancements are window dressing at best.

In addition, the project management community, particularly that focused on large projects in excess of $20M, is facing the challenge of an increasingly older workforce.  Larger economic forces at play lately have exacerbated this condition.  Weak aggregate demand and, on the public side, austerity ideology combined with sequestration have created a situation in which highly qualified people face a job market characterized by relatively high unemployment, flat wages and salaries, depleted private retirement funds, and constant attacks on social insurance related to retirement.  Thus, people are hanging around longer, which limits opportunities for newer workers to grow into the discipline.  Given these conditions, we find that it is very risky to one’s employment prospects to suddenly forge a new path.  People in the industry whom I have known for many years–and who were always the first to engage with new technologies and capabilities–are now very hesitant to do so.  Some of this is well founded through experience and consists of healthy skepticism: we all have come across snake oil salesmen in our dealings at one time or another, and even the best products do not always make it due to external forces or the fact that brilliant technical people oftentimes are just not very good at business.

But these conditions also tend to hold back the ability of the enterprise to implement efficiencies and optimization measures that otherwise would be augmented and supported by appropriate technology.  Thus, in addition to those listed by Ms. Symonds, I would include the following criteria to use in making the decision to move to a better technology:

a.  Sunk and prospective costs.  Understand and apply the concepts of sunk cost and prospective cost.  The former is the cost that has been expended in the past, while the latter focuses on the investment necessary for future growth, efficiencies, productivity, and optimization.  Having made investments to improve a product in the past is not an argument for continuing to invest in it in the future that trumps all other factors.  Obviously, if the cash flow is not there, an organization is going to be limited in the capital and other improvements it can make but, absent those considerations, sunk cost arguments are invalid.  It is important to invest in those future products that will facilitate the organization achieving its goals in the next five or ten years.

b.  Sustainability.  The effective life of the product must be understood, particularly as it applies to an organization’s needs.  Some of this overlaps the points made by Ms. Symonds in her article but is meant to apply in a more strategic way.  Every product, even software, has a limited productive life, but my concept here goes to what Glen Alleman pointed out in his blog as “bounded applicability.”  Will the product require more effort in any form where the additional effort provides a diminishing return?  For example, I have seen cases where software manufacturers, in order to defend market share, make trivial enhancements such as adding a chart or graph in order to placate customer demands.  The reason for this should be, but is not always, obvious.  Oftentimes more substantive changes cannot be made because the product was built on an earlier-generation operating environment or structure.  Thus, in order to replicate the additional functionality found in newer products, the application requires a complete rewrite.  All of us operating in this industry have seen this: a product that has been a mainstay for many years begins to lose market share.  The decision, when it is finally made, is to totally reengineer the solution, but not as an upgrade to the original product, arguing that it is a “new” product.  This is true in terms of the effort necessary to keep the solution viable, but it then also completely undermines justifications based on sunk costs.

c.  Flexibility.  As stated previously in this blog, the first generation of digitization mimicked those functions that were previously performed manually.  The applications were also segmented and specialized based on traditional line and staff organizations and specialties.  Thus, for project management, we have scheduling applications for the scheduling discipline (such as it is), earned value engines for the EV discipline, risk and technical performance applications for risk specialists and systems engineers, analytical software for project and program analysts, and financial management applications for project and program financial management professionals.  This led to the deployment of so-called best-of-breed configurations, where a smorgasbord of applications or modules were acquired to meet the requirements of the organization.  Most often these applications had, and have, no direct compatibility, requiring entire staffs to reconcile data after the fact, once that data was imported into a proprietary format in which it could be handled.  Even within so-called ERP environments under one company, direct compatibility at the appropriate level of the data being handled escaped the ability of the software manufacturers, requiring “bolt-ons” and other workarounds and third-party solutions.  This condition undermines sustainability, adds a level of complexity that is hard to overcome, and adds a layer of cost to the life-cycle of the solutions being deployed.

The second wave, which addressed some of these limitations, focused on data flexibility using cubes, hard-coding of relational data and mapping, and data mining solutions: so-called Project Portfolio Management (PPM) and Business Intelligence (BI).  The problem is that, in the first instance, PPM is simply another layer added to address management concerns, while early BI systems froze single points of failure in time within hard-coded, deployed solutions.

A flexible system is one that leverages the new advances in software operating environments to solve more than one problem.  This, of course, undermines the financial returns in software, where the pattern has been to build one solution to address one problem based on a specialty.  Such a system provides internal flexibility, that is, allows for the application of objects and conditional formatting without hardcoding, pushing what previously had to be accomplished by coders to the customer’s administrator or user level; and external flexibility, where the same application can address, say, EVM, schedule, risk, financial management, KPIs, technical performance, stakeholder reporting, all in the same or in multiple deployed environments without the need for hardcoding.  In this case the operating environment and any augmented code provides a flexible environment to the customer that allows one solution to displace multiple “best-of-breed” applications.

This flexibility should apply not only vertically but also horizontally, where data can be hierarchically organized to allow not only for drill-down, but also for roll-up.  Data in this environment is exposed discretely, providing to any particular user that data, aggregated as appropriate, based on their role, responsibility, or need to know.
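As a small illustration of that hierarchical organization (the WBS numbering and figures below are invented for the example), a single set of leaf-level records can serve both the drill-down and the roll-up view without separate reports:

```python
# A sketch of hierarchical roll-up: leaf-level costs aggregate upward
# through a WBS-style hierarchy, so the same records support both the
# drill-down and the roll-up view.  WBS numbers and values are invented.
from collections import defaultdict

leaf_costs = {
    "1.1.1": 40.0,
    "1.1.2": 60.0,
    "1.2.1": 25.0,
}

def roll_up(leaves):
    totals = defaultdict(float)
    for wbs, cost in leaves.items():
        parts = wbs.split(".")
        # Credit the leaf element and every ancestor above it.
        for i in range(1, len(parts) + 1):
            totals[".".join(parts[:i])] += cost
    return dict(totals)

print(roll_up(leaf_costs))
# {'1': 125.0, '1.1': 100.0, '1.1.1': 40.0, '1.1.2': 60.0, '1.2': 25.0, '1.2.1': 25.0}
```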

d.  Interoperability and open compatibility.  A condition of the “best-of-breed” deployment environment is that it allows sub-optimization to trump organizational goals.  The most recent example that I have seen of this is one where the Integrated Master Schedule (IMS) and Performance Measurement Baseline (PMB) were obviously authored by different teams in different locations that, most likely, were at war with one another when they published these essential, interdependent project management artifacts.

But in terms of sustainability, the absence of interoperability and open compatibility has created untenable situations.  In the example of the PMB and IMS information above, in many cases a team of personnel must be engaged every month to reconcile the obvious disconnectedness of schedule activities from control accounts in order to ensure traceability in project management and performance.  Surely there should be no economic rewards for such behavior; indeed, I believe no business would perform in that manner without them.

Thus, interoperability in this case is to be able to deal with data in its native format without proprietary barriers that prevent its full use and exploitation to the needs and demands of the customer organization.  Software that places its customers in a corner and ties their hands in using their own business information has, indeed, worn out its welcome.

The reaction of customer organizations to the software industry’s attempts to bind them to proprietary solutions has been most marked in the public sector, and most prominently in the U.S. Department of Defense.  In the late 1990s the first wave was to ensure that performance management data centered around earned value was submitted in a non-proprietary format known as the ANSI X12 839 transaction set.  Since that time DoD has specified the use of the UN/CEFACT XML D09B standard for cost and schedule information, and it appears that other, previously stove-piped data will be included in that standard in the future.  This solution requires data transfer, but it is one that ensures that the underlying data can be normalized regardless of the underlying source application.  It is especially useful for stakeholder reporting situations or data sharing in prime and sub-contractor relationships.

It is also useful for pushing for improvement in the disciplines themselves, driving professionalism.  For example, in today’s project management environment, while the underlying architecture of earned value management and risk data is fairly standard, reflecting a cohesiveness of practice among its practitioners, schedule data tends to be disorganized, with much variability in how common elements are kept and reported.  This also reflects much of the state of the scheduling discipline, where an almost “anything goes” mentality seems to be in play, reflecting not so much the realities of scheduling practice–which are pretty well established and uniform–as the lack of knowledge and professionalism on the part of schedulers, who are tied to the limitations and vagaries of their scheduling application of choice.

But, more directly, interoperability also includes the ability to access data (as opposed to application interfacing, data mining, hard-coded Cubes, and data transfer) regardless of the underlying database, application, and structured data source.  Early attempts to achieve interoperability and open compatibility utilized ODBC but newer operating environments now leverage improved OLE DB and other enhanced methods.  This ability, properly designed, also allows for the deployment of transactional environments, in which two-way communication is possible.
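For a sense of what source-level access looks like in practice, here is a minimal sketch using ODBC from Python via the pyodbc library; the DSN, table, and column names are hypothetical, and a production environment would obviously add authentication, error handling, and the two-way (transactional) path described above.

```python
# A minimal sketch of direct, source-level data access through ODBC
# (as opposed to file transfer, hard-coded cubes, or one-off interfaces).
# The DSN, table, and column names below are hypothetical.
import pyodbc

def read_performance_baseline(dsn="ProjectData"):
    """Query cost-performance records from whatever database sits behind the DSN."""
    conn = pyodbc.connect(f"DSN={dsn}")
    try:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT control_account, bcws, bcwp, acwp FROM performance_baseline"
        )
        return [tuple(row) for row in cursor.fetchall()]
    finally:
        conn.close()

# Example usage (requires an ODBC data source named 'ProjectData'):
# for account, bcws, bcwp, acwp in read_performance_baseline():
#     print(account, bcws, bcwp, acwp)
```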

A new reality.  Thus, given these new capabilities, I think that we are entering a new phase in software design and deployment, in which the role of the coder in controlling the UI is reduced.  In addition, given that the large software companies have continued to support a system that ties customers to proprietary solutions, I do not believe that the future of software is in open source, as so many prognosticators stated just a few short years ago.  Instead, I propose that the applications that will be rewarded are those that behave like open source while providing maximum value, sustainability, flexibility, and interoperability to the customer.

Note:  This post was edited for clarity and grammatical errors from the original.