New Directions — Fourth Generation apps, Agile, and the New Paradigm

The world is moving forward, and Moore’s Law continues to play out in interesting ways on the technology side, which opens new opportunities, especially in software.  In the past I have spoken of the flexibility of Fourth Generation software, that is, software that doesn’t rely on structured hardcoding but instead focuses on the data to deliver information to the user in more interesting and essential ways.  I work in this area for my day job, and so using such technology has tipped over more than a few rice bowls.

The response from entrenched incumbents, and from others in the industry whose approach focuses on “tools” capabilities, has been to declare vices to be virtues.  Hard-coded applications that require long-term development, built on proprietary file and data structures, are, they declare, the right way to do things.  “We provide value by independently developing IP based on customer requirements,” they say.  It sounds very reasonable, doesn’t it?  Only one problem: you have to wait–oh–a year or two to get that chart or graph you need, to refresh that user interface, or to expand functionality, and you will almost never be able to leverage the latest capabilities afforded by the doubling of computing capability every 12 to 24 months.  The industry is already filled with outmoded, poorly supported, and obsolete “tools.”  Guess it’s time for a new one.

The motivation behind such assertions, of course, is to slow things down.  Not possessing the underlying technology to provide more, better, and more powerful functionality to the customer more quickly and more flexibly based on open systems principles, that is, dealing with data in an agnostic manner, they use their position to try to keep disruptive entrants from leaving them far behind.  This is done, especially in the bureaucratic complexities of A&D and DoD project management, through professional organizations such as the NDIA, which are used as thinly disguised lobbying opportunities by software suppliers, or through appeals to contracting rules that they hope will undermine the introduction of new technologies.

All of these efforts, of course, are blowing into the wind.  The economics of the new technologies is too compelling for anyone to last long in their job by partying like it’s still 1997 under the first wave of software solutions targeted at data silos and stove-piped specialization.

The new paradigm is built on Agile and those technologies that facilitate that approach.  In case my regular readers think that I have become one of the Cultists, bowing before the Manifesto That May Not Be Named, let me assure you that is not the case.  The best articulation of Agile that I have read recently comes from Neil Killick, with whom I have expressed some disagreement on the #NoEstimates debate and the more cultish aspects of Agile in past posts, but who published an excellent post back in July entitled “12 questions to find out: Are you doing Agile Software Development?”

Here are Neil’s questions:

  1. Do you want to do Agile Software Development? Yes – go to 2. No – GOODBYE.
  2. Is your team regularly reflecting on how to improve? Yes – go to 3. No – regularly meet with your team to reflect on how to improve, go to 2.
  3. Can you deliver shippable software frequently, at least every 2 weeks? Yes – go to 4. No – remove impediments to delivering a shippable increment every 2 weeks, go to 3.
  4. Do you work daily with your customer? Yes – go to 5. No – start working daily with your customer, go to 4.
  5. Do you consistently satisfy your customer? Yes – go to 6. No – find out why your customer isn’t happy, fix it, go to 5.
  6. Do you feel motivated? Yes – go to 7. No – work for someone who trusts and supports you, go to 2.
  7. Do you talk with your team and stakeholders every day? Yes – go to 8. No – start talking with your team and stakeholders every day, go to 7.
  8. Do you primarily measure progress with working software? Yes – go to 9. No – start measuring progress with working software, go to 8.
  9. Can you maintain pace of development indefinitely? Yes – go to 10. No – take on fewer things in next iteration, go to 9.
  10. Are you paying continuous attention to technical excellence and good design? Yes – go to 11. No – start paying continuous attention to technical excellence and good design, go to 10.
  11. Are you keeping things simple and maximising the amount of work not done? Yes – go to 12. No – start keeping things simple and writing as little code as possible to satisfy the customer, go to 11.
  12. Is your team self-organising? Yes – YOU’RE DOING AGILE SOFTWARE DEVELOPMENT!! No – don’t assign tasks to people and let the team figure out together how best to satisfy the customer, go to 12.
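Neil’s flowchart is essentially a program: each “No” answer maps to a concrete remedy rather than a verdict.  A toy sketch of that idea (the structure and names below are my own invention, not Neil Killick’s, and only a few questions are shown):

```python
# Each entry pairs one of the checklist questions with its remedy.
# Only the first few questions are rendered here for illustration.
QUESTIONS = [
    ("Is your team regularly reflecting on how to improve?",
     "Regularly meet with your team to reflect on how to improve."),
    ("Can you deliver shippable software at least every 2 weeks?",
     "Remove impediments to delivering a shippable increment every 2 weeks."),
    ("Do you work daily with your customer?",
     "Start working daily with your customer."),
]

def agile_check(answers):
    """Given one boolean per question, return the remedies still owed.

    An empty list means every answer was 'Yes': you're doing Agile
    Software Development.
    """
    return [fix for (_, fix), yes in zip(QUESTIONS, answers) if not yes]
```

For example, `agile_check([True, False, True])` returns the single shippable-increment remedy, while all-`True` answers return an empty list.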

Note that even in software development based on Agile you are still “provid(ing) value by independently developing IP based on customer requirements.”  Only you are doing it faster and more effectively.

Now imagine a software technology that is agnostic to the source of data; that does not require a staff of data scientists, development personnel, and SMEs to care for and feed it; that allows multiple solutions to be released from the same technology; that allows for integration and cross-data convergence to gain new insights based on Knowledge Discovery in Databases (KDD) principles; and that provides shippable, incremental solutions every two weeks, or as often as they can be absorbed by the organization, while remaining responsive enough to meet multiple needs of the organization at any one time.

This is what is known as disruptive value.  There is no stopping this train.  It is the new paradigm and it’s time to take advantage of the powerful improvements in productivity, organizational effectiveness, and predictive capabilities that it provides.  This is the power of technology combined with a new approach to “small” big data, or structured data, that is effectively normalized and rationalized to the point of breaking down proprietary barriers, hewing to the true meaning of making data–and therefore information–both open and accessible.

Furthermore, such solutions using the same data streams produced by the measurement of work can also be used to evaluate organizational and systems compliance (where necessary), and effectiveness.  Combined with an effective feedback mechanism, data and technology drive organizational improvement and change.  There is no need for another tool to layer with the multiplicity of others, with its attendant specialized training, maintenance, and dead-end proprietary idiosyncrasies.  On the contrary, such an approach is an impediment to data maximization and value.

Vices are still vices even in new clothing.  Time to come to the side of the virtues.

Over at — Black Swans: Conquering IT Project Failure & Acquisition Management

It’s been out for a few days but I failed to mention the latest article at

In my last post on the Blogging Alliance I discussed information theory, the physics behind software development, the economics of new technology, and the intrinsic obsolescence that exists as a result. Dave Gordon in his regular blog described this work as laying “the groundwork for a generalized theory of managing software development and acquisition.” Dave has a habit of inspiring further thought, and his observation has helped me focus on where my inquiries are headed…

To read more please click here.

Super Doodle Dandy (Software) — Decorator Crabs and Wirth’s Law


The song (absent the “software” part) in the title is borrowed from the soundtrack of the movie The Incredible Mr. Limpet.  Made in the days before Pixar and other recent animation technologies, it remains a largely unappreciated classic, combining live photography and animation in a time of more limited tools, with Don Knotts creating another unforgettable character beyond Barney Fife.  Somewhat related to what I am about to write, Mr. Limpet taught the creatures of the sea new ways of doing things, helping them overcome their mistaken assumptions about the world.

The photo that opens this post is courtesy of the Monterey Aquarium and looks to be the crab Oregonia gracilis, commonly referred to as the Graceful Decorator Crab.  There are all kinds of Decorator Crabs, most of which belong to the superfamily Majoidea.  The one I most often came across and raised in aquaria was Libinia dubia, an east coast cousin.  You see, back in a previous lifetime I had aspirations to be a marine biologist.  My early schooling was based in the sciences and mathematics.  Only later did I gradually gravitate to history, political science, and the liberal arts–finally landing in acquisition and high tech project management, which tends to borrow something from all of these disciplines.  I believe that my former concentration of studies has kept me grounded in reality–in viewing life the way it is and the mysteries that are yet to be solved in the universe without resort to metaphysics or irrationality–while the latter concentrations have connected me to the human perspective in experiencing and recording existence.

But there is more to my analogy than self-explanation.  You see, software development exhibits much of the same behavior of Decorator Crabs.

In my previous post I talked about Moore’s Law and the compounding (doubling) of processor power in computing every 12 to 24 months.  (It seems to be less a physical law than an observation, and we can only guess how long the trend will continue.)  We also see a corresponding reduction in cost vis-à-vis this greater capability.  Yet, despite these improvements, we find that software often lags behind and fails to leverage that capability.

The observation that records this phenomenon is Wirth’s Law, which posits that software is getting slower at a faster rate than computer hardware is getting faster.  There are two variants of this law, one ironic and the other only slightly less so: the May and Gates variants.  Basically these posit that software speed halves every 18 months, thereby negating Moore’s Law.  But why is this?
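The arithmetic here is easy to sketch.  Assuming, for illustration, that hardware doubles in speed every 18 months while software bloat halves effective speed over the same period, the two trends cancel exactly:

```python
def effective_speedup(years, hw_doubling_months=18.0, sw_halving_months=18.0):
    """Net user-visible speedup after `years`, assuming hardware doubles
    every `hw_doubling_months` (Moore's Law) while software bloat halves
    effective speed every `sw_halving_months` (Wirth's Law, in its
    Gates/May form)."""
    months = years * 12.0
    hardware_gain = 2.0 ** (months / hw_doubling_months)
    software_loss = 0.5 ** (months / sw_halving_months)
    return hardware_gain * software_loss

# With both periods at 18 months, a decade of Moore's Law yields a net
# speedup of exactly 1.0x: the user sees none of the gain.
decade = effective_speedup(10)
```

If software degrades only half as fast (a 36-month halving period), three years of hardware progress still deliver only a 2x net gain instead of 4x; the hedged parameter defaults above are illustrative, not measured values.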

For first causes one need only look to the Decorator Crab.  You see, the crab, all by itself, is a typical crab: an arthropod invertebrate with a hard carapace, spikes on its exoskeleton, and a segmented body with jointed limbs–five pairs of legs, the first pair usually bearing chelae (the familiar pincers and claws).  There are all kinds of crabs in salt, fresh, and brackish water, and they tend to be well adapted to their environments.  But they are also tasty and high in protein value, and thus have a number of predators.  So the Decorator Crab has determined that what evolution has provided is not enough–it borrows features and items from its environment to enhance its capabilities as a defense mechanism.  There is a price to being a Decorator Crab, however: encrustations also become encumbrances.  Where these crabs have enhanced their protections, for example by attaching toxic sponges and anemones, the enhancements may also have made them complacent.  Unlike most crabs, Decorator Crabs don’t tend to scurry from crevice to crevice, but walk awkwardly and more slowly than many of their cousins in the typical sideways crab gait.  This behavior makes them interesting, popular, and comical subjects in both public and private aquaria.

In a way, we see an analogy in the case of software.  In earlier generations of software design, applications were generally built to solve a particular challenge that mimicked the line and staff structure of the organizations involved–designed to fit its environmental niche.  But over time, of course, people decide that they want enhancements and additional features.  The user interface, when hardcoded, must be adjusted every time a new function or feature is added.

Rather than rewriting the core code from scratch–which will take time and resource-consuming reengineering and redesign of the overall application–modules, subroutines, scripts, etc. are added to software to adapt to the new environment.  Over time, software takes on the characteristics of the Decorator Crab.  The new functions are not organic to the core structure of the software, just as the attached anemone, sponges, and algae are not organic features of the crab.  While they may provide the features desired, they are not optimized, tending to use brute force computing power as the means of accounting for lack of elegance.  Thus, the more powerful each generation of hardware computing power tends to provide, the less effective each enhancement release of software tends to be.
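The software analogue even shares a name with the crab: bolted-on features behave like the Decorator pattern applied without discipline.  A toy sketch (the class names, the “cost” units, and the specific features are my own, purely illustrative):

```python
class CoreApplication:
    """The original niche-fitted application."""
    cost = 1  # abstract units of compute per request

    def run(self):
        return "core result"


class BoltedOnFeature:
    """A non-organic add-on: it wraps the existing app rather than being
    integrated into it, so its overhead stacks on every request."""

    def __init__(self, inner, name, overhead):
        self.inner = inner
        self.name = name
        self.cost = inner.cost + overhead

    def run(self):
        # Same core work as before, carried under more weight.
        return self.inner.run()


# Each release encrusts another layer, like sponges on the carapace.
app = CoreApplication()
app = BoltedOnFeature(app, "reporting module", overhead=2)
app = BoltedOnFeature(app, "custom UI skin", overhead=3)
# app.cost is now 6x the core's cost for the identical result.
```

Each wrapper delivers the identical answer at multiplied expense, which is precisely how accreted enhancements absorb each hardware generation’s gains.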

Furthermore, just as when a crab tends to look less like a crab, it requires more effort and intelligence to identify the crab, so too with software.  The greater the encrustation of features that tend to attach themselves to an application, the greater the effort that is required to use those new features.  Learning the idiosyncrasies of the software is an unnecessary barrier to the core purposes of software–to increase efficiency, improve productivity, and improve speed.  It serves only one purpose: to increase the “stickiness” of the application within the organization so that it is harder to displace by competitors.

It is apparent that this condition is not sustainable–or acceptable–especially where the business environment is changing.  New software generations, especially Fourth Generation software, provide opportunities to overcome this condition.

Thus, as project management and acquisition professionals, the primary considerations that must be taken into account are optimization of computing power and the related consideration of sustainability.  This approach militates against complacency because it influences the environment of software toward optimization.  Such an approach will also allow organizations to more fully realize the benefits of Moore’s Law.

Over at — Maxwell’s Demon: Planning for Obsolescence in Acquisitions

I’ve posted another article at the Blogging Alliance, this one dealing with the issue of software obsolescence and the acquisition strategy that applies given what we know about the nature of software.  I also throw in a little background on information theory and the physical limitations of software as we now know it (virtually none).  As a result, we require a great deal of agility inserted into our acquisition systems for new technologies.  I’ll have a follow-up article over there that provides specifics on acquisition planning and strategies.  Random thoughts on various related topics will also appear here.  Blogging has been sporadic of late due to op-tempo, but I’ll try to keep things interesting and more frequent.

Ch-ch Changes — Software Implementations and Organizational Process Improvement

Dave Gordon at The Practicing IT Project Manager lists a number of factors that define IT project success.  Among these is “Organizational change management efforts were sufficient to meet adoption goals.”  This is an issue that I am grappling with now on many fronts.

The initial question that comes to mind is which comes first–the need for organizational improvement, or the transformation that results from the introduction of new technology?  “Why does this matter?” one may ask.  The answer is that it defines how things are perceived by those being affected (or victimized) by the new technology, which will then translate into various behaviors.  (Note that I did not say that “Perception is reality.”  For the reason why, please consult the Devil’s Phraseology.)

This is important because the groundwork laid (or not laid) for the change that is to come will then translate into sub-factors (accepting Dave’s taxonomy of factors for success) that will have a large impact on the project, and whether it is defined as a success.  In getting something done the most overriding priority is not just “Gettin’ ‘Er Done.”  The manner in which our projects, particularly in IT, are executed and the technology introduced and implemented will determine the success of a number of major factors that contribute to overall project success.

Much has been written lately about “disruptive” change, and that can be a useful analogy when applied to new technologies that transform a market by providing something that is cheaper, better, and faster (with more functionality) than the market norm.  I am driving that type of change in my own target markets.  But that is in a competitive environment.  Judgement–and good judgement–requires that we not inflict this cultural approach on the customer.

The key, I think, is bringing back a concept and approach that seems to have been lost in the shuffle: systems analysis and engineering that works hand-in-hand with the deployment of the technological improvement.  There was a reason for asking for the technology in the first place, whether it be improved communications, improved productivity, or qualitative factors.  Going in willy-nilly with a new technology that provides unexpected benefits–even if those benefits are both useful and will improve the work process–can often be greeted with fear, sabotage, and obstruction.

When those of us who work with digital systems encounter someone challenged by the introduction of new technology, or fearful that “robots are taking our jobs,” our reaction is often an eye-roll, treating these individuals as modern Luddites.  But that is a dangerous stereotype.  Our industry is rife with stories of individuals who fall into this category.  Many of them are our most experienced middle managers and specialists who predate the technology being introduced.  How long does it take to develop the expertise to fill these positions?  What is the cost to the organization if their corporate knowledge and expertise is lost?  Given that they have probably experienced multiple reorganizations and technology improvements, their skepticism is probably warranted.

I am not speaking of the exception–the individual who would be opposed to any change.  Dave gives a head nod to the CHAOS report, but we also know that these reactions occur often enough to have been documented by a variety of sources.  So how do we handle them?

There are two approaches.  One is to rely upon the resources and management of the acquiring organization to properly prepare the organization for the change to come, and to handle the job of determining the expected end state of the processes, and the personnel implications that are anticipated.  Another is for the technology provider to offer this service.

From my own direct experience, what I see is a lack of systems analysis expertise designed to work hand-in-hand with the technology being introduced.  Systems analysis, for example, is a skill that is all but gone in government agencies and large companies, which rely more and more on outsourcing for IT support.  Oftentimes the IT services consultant has its own agenda, one that conflicts with the goals of both the manager acquiring the technology and the technology provider.  Few outsourced IT services contracts anticipate that the consultant must act as an enthusiastic–as opposed to, at best, tepidly willing–partner in these efforts.  Some agencies lately have tasked the outsourced IT consultant to act as an honest broker in choosing the technology, mindless of the strategic partnering and informal relationships that will result in a conflict of interest.

Thus, technology providers must be mindful of their target markets and design solutions to meet the typical process improvement requirements of the industry.  In order to do this the individuals involved must have a unique set of skills that combines a knowledge of the goals of the market actors, their processes, and how the technology will improve those processes.  Given this expertise, technology providers must then prepare the organizational environment to set expectations and to advance the vision of the end state–and to ensure that the customer accepts that end state.  It is then up to the customer’s management, once the terms of expectations and end-state have been agreed, to effectively communicate them to those personnel affected, and to do so in a way to eliminate fear and to generate enthusiasm that will ensure that the change is embraced and not resisted.

Mo’Better Risk — Tournaments and Games of Failure Part II

My last post discussed economic tournaments and games of failure in how they describe the success and failure of companies, with a comic example for IT start-up companies.  Glen Alleman at his Herding Cats blog has a more serious response in handily rebutting those who believe that #NoEstimates, Lean, Agile, and other cult-like fads can overcome the bottom line, that is, apply a method to reduce inherent risk and drive success.  As Glen writes:

“It’s about the money. It’s always about the money. Many want it to be about them or their colleagues, or the work environment, or the learning opportunities, or the self actualization.” — Glen Alleman, Herding Cats

Perfectly good products and companies fail all the time.  Oftentimes the best products fail to win the market, or do so only fleetingly.  Just think of the rolls of the dead (or walking dead) over the years: Novell, WordPerfect, VisiCalc, Harvard Graphics; the list goes on and on.  Thus, one point on which I would deviate from Glen is that it is not always EBITDA.  If that were true then neither Facebook nor Amazon would be around today.  We see tremendous payouts to companies with promising technologies acquired for outrageous sums of money, though they have yet to make a profit.  But for every one of these there are many others that see the light of day for a moment and then flicker out of existence.

So what is going on, and how does this inform our knowledge of project management?  The measure of our success, in most cases, is time and money–though obviously not in all cases.  I’ve given two cases of success that appeared to be failure in previous posts to this blog: the M1A1 Tank and the ACA.  The reason these “failures” were misdiagnosed was that the agreed measure(s) of success were incorrect.  Knowing this difference–where and how it applies–is important.

So how do tournaments and games of failure play a role in project management?  I submit that the lesson learned from these observations is that we see certain types of behaviors that are encouraged that tend to “bake” certain risks into our projects.  In high tech we know that there will be a thousand failures for every success, but it is important to keep the players playing–at least it is in the interest of the acquiring organization to do so, and is in the public interest in many cases as well.  We also know that most IT projects by most measures–both contracted out and organic–tend to realize a high rate of failure.  But if you win an important contract or secure an important project, the rewards can be significant.

The behavior that is reinforced in this scenario on the part of the competing organization is to underestimate the cost and time involved in the effort; that is, the so-called “bid to win.”  On the acquiring organization’s part, contracting officers lately have been all too happy to award contracts they know to be too low (and normally outside the competitive range), even though they realize the bids fall significantly below the independent estimate.  Thus “buying in” introduces a significant risk that is hard to overcome.
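The risk from “buying in” can be quantified before any work starts.  A hypothetical sketch (the function and the dollar figures are illustrative, not drawn from any actual program):

```python
def baked_in_growth(winning_bid, independent_estimate):
    """Cost growth already implied at award when the winning bid falls
    below the independent cost estimate (a 'buy-in')."""
    if winning_bid <= 0:
        raise ValueError("bid must be positive")
    return max(0.0, (independent_estimate - winning_bid) / winning_bid)

# A $70M winning bid against a $100M independent estimate implies
# roughly 43% cost growth before the first hour of work is performed.
growth = baked_in_growth(70.0, 100.0)
```

That growth figure is the floor, not the ceiling: the optimism bias and requirements instability discussed next stack on top of it.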

Other behaviors that we see given the project ecosystem are the bias toward optimism and requirements instability.

In the first case, the bias toward optimism, we often hear project and program managers dismiss bad news because it is “looking in the rear view mirror.”  We are “exploring,” we are told, and so the end state will not be dictated by history.  We often hear a version of this meme in cases where those in power wish to avoid accountability.  “Mistakes were made” and “we are focused on the future” are attempts to change the subject and avoid the reckoning that will come.  In most cases, however, particularly in project management, the motivations are not dishonest but, instead, sociological and psychological.  People who tend to build things–engineers in general, software coders, designers, etc.–tend to be an optimistic lot.  In very few cases will you find one of them who will refuse to take on a challenge.  How many times have we presented a challenge to someone with these traits and heard the refrain: “I can do that”?  This form of self-delusion can be both an asset and a risk.  Who but an optimist would take on any technically challenging project?  But this is also the trait that will keep people working to the bitter end in a failure that places the entire enterprise at risk.

I have already spent some bits in previous posts regarding the instability of requirements, but this is part and parcel of the traits that we see within this framework.  Our end users determine that given how things are going we really need additional functionality, features, or improvements prior to the product roll out.  Our technical personnel will determine that for “just a bit more effort” they can achieve a higher level of performance or add capabilities at marginal or tradeoff cost.  In many cases, given the realization that the acquisition was a buy-in, project and program managers allow great latitude in accepting as a change an item that was assumed to be in the original scope.

There is a point where one or more of these factors is “baked in” to the course that the project will take.  We can delude ourselves into believing that we can change the trajectory of the system through the application of methods–Agile, Lean, Six Sigma, PMBOK, etc.–but, in the end, if we exhaust our resources without a road map on how to do this we will fail.  Our systems must be powerful and discrete enough to note the trend that is “baked in” due to factors in the structure and architecture of the effort being undertaken.  This is the core risk that must be managed in any undertaking.  A good example that applies to a complex topic like Global Warming was recently illustrated by Neil deGrasse Tyson in the series Cosmos, in a segment in which he walks a dog along a beach.

In that example Dr. Tyson represents climate and the dog represents the weather.  But in our own analogy Dr. Tyson can be the trajectory of the system, with the dog representing the “noise” of periodic indicators and activity around the effort.  We often spend a lot of time and effort (which I would argue is largely unproductive) on influencing these transient conditions rather than on the core inertia of the system itself.  That is where the risk lies.  Thus, not all indicators are the same.  Some measure transient anomalies that have nothing to do with changing the core direction of the system; others are more valuable.  These latter indicators are the ones that we need to cultivate and develop, and they reside in an initial measurement of the inherent risk of the system–based largely on its architecture–made before the start of the work.
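The distinction between transient and trajectory indicators can be sketched numerically.  In this toy example (entirely my own construction), a series is the sum of a fixed trend–the “climate”–and random noise–the “dog.”  The latest period-to-period change tells you almost nothing, while a fit over the whole history recovers the underlying slope:

```python
import random

random.seed(42)  # fixed seed so the illustration is reproducible

TREND = 0.5  # the system's core trajectory per period
series = [TREND * t + random.uniform(-3.0, 3.0) for t in range(100)]

# A transient indicator: the most recent period-to-period change.
# It is dominated by noise and can even point the wrong way.
transient = series[-1] - series[-2]

# A trajectory indicator: the least-squares slope over the full history.
n = len(series)
mean_t = (n - 1) / 2.0
mean_y = sum(series) / n
slope = (sum((t - mean_t) * (y - mean_y) for t, y in enumerate(series))
         / sum((t - mean_t) ** 2 for t in range(n)))
# `slope` lands close to the true 0.5 trend; `transient` generally does not.
```

The point of the sketch is the contrast: chasing `transient` is managing the dog, while `slope` is the kind of indicator, grounded in the whole history of the effort, that reveals where the system is actually headed.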

This is not to say that we can do nothing about the trajectory.  A simpler system can be influenced more easily.  We cannot recover the effort already expended–which is why even historical indicators are important: they inform our future expectations and, if we pay attention to them, keep us grounded in reality.  Even in the case of Global Warming we can change, though gradually, what will be a disastrous result if we allow things to continue on their present course.  In a deterministic universe we can influence the outcomes based on the contingent probabilities presented to us over time.  Thus, we will know whether we have handled the core risk of the system by focusing on these better indicators as the effort progresses.  This will affect its trajectory.

Of course, a more direct way of modifying these risks is to make systemic adjustments.  Do we really need a tournament-based system as it exists and is the waste inherent in accepting so much failure really necessary?  What would that alternative look like?

Take Me Out to the Ballgame — Tournaments and Games of Failure

“Baseball teaches us, or has taught most of us, how to deal with failure. We learn at a very young age that failure is the norm in baseball and, precisely because we have failed, we hold in high regard those who fail less often – those who hit safely in one out of three chances and become star players. I also find it fascinating that baseball, alone in sport, considers errors to be part of the game, part of its rigorous truth.” — Fay Vincent, former Commissioner of Baseball (1989-1992)

“Baseball is a game of inches.”  — Branch Rickey, Quote Magazine, July 31, 1966

I have been a baseball fan just about as long as I have been able to talk.  My father played the game and tried out for both the then-New York Giants and the Yankees–and was a pretty well known local hero in Weehawken back in the 1930s and 1940s.  I did not have my father’s athletic talents–he was a four-letter man in high school–but I was good at hitting a baseball from the time he put a bat in my hands, and so I played–and was sought after–into my college years.  Still, like many Americans who for one reason or another could not or did not pursue the game, I live vicariously through the players on the field.  We hold those who fail less often in high regard.  Some of them succeed for many years and are ensconced in the Hall of Fame.

Others experienced fleeting success.  Anyone who watches ESPN’s or the YES Network’s classic games, particularly those from the various World Series, can see this reality in play.  What if Bill Buckner in 1986 hadn’t missed that ball?  What if Bobby Richardson had not been in perfect position to catch what would have been a game- and series-winning liner by Willie McCovey in 1962?  Would Brooklyn have ever won a series if Amoros hadn’t caught Berra’s drive down the left field line in 1955?  The Texas Rangers might have their first World Series ring if not for a plethora of errors, both mental and physical, in the sixth game of the 2011 Series.  The list can go on, and it takes watching just a few of these games to realize that luck plays a big part in who is the victor.

There are other games of failure that we deal with in life, though oftentimes we don’t recognize them as such.  In economics these are called “tournaments,” and much like their early Medieval predecessors (as opposed to the stylized late Medieval and Renaissance games), the stakes are high.  In pondering the sorry state of my favorite team–the New York Yankees–as I watched seemingly minor errors and failures cascade into a humiliating loss, I came across a blog post by Brad DeLong, distinguished professor of economics at U.C. Berkeley, entitled “Over at Project Syndicate/Equitable Growth: What Do We Deserve Anyway?”  Dr. DeLong makes the very valid point, verified not only by anecdotal experience but by years of economic research, that most human efforts, particularly economic ones, fail, and that the key determinants do not seem, in most cases, to be lack of talent, hard work, dedication, or any of the other attributes that successful people like to credit for their success.

Instead, much of the economy, which in its present form is largely based on a tournament-like structure, allows only a small percentage of entrants to extract their marginal product from society in the form of extremely high levels of compensation.  The fact that these examples exist is much like a lottery, as the following quote from Dr. DeLong illustrates.

“If you win the lottery–and if the big prize in the lottery that is given to you is there in order to induce others to overestimate their chances and purchase lottery tickets and so enrich the lottery runner–do you “deserve” your winnings? It is not a win-win-win transaction: you are happy being paid, the lottery promoter is happy paying you, but the others who purchase lottery tickets are not happy–or, perhaps, would not be happy in their best selves if they understood what their chances really were and how your winning is finely-tuned to mislead them, for they do voluntarily buy the lottery tickets and you do have a choice.”  — Brad DeLong, Professor of Economics, U.C. Berkeley

So even though participants have a “choice,” it is one that is based on an intricately established system based on self-delusion.  It was about this time that I came across the excellent HBO Series “Silicon Valley.”  The tournament aspect of the software industry is apparent in the conferences and competitions for both customers and investors in which I have participated over the years.  In the end, luck and timing seem to play the biggest role in success (apart from having sufficient capital and reliable business partners).
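DeLong’s lottery framing is easy to make concrete.  With illustrative numbers of my own (a $2 ticket, a $100 million prize, 1-in-300-million odds–not figures from his post), the expected net return per ticket is sharply negative, which is exactly what sustains the promoter:

```python
def expected_net_return(ticket_price, prize, win_probability):
    """Expected net payoff per ticket; negative for any viable lottery,
    since the promoter keeps the difference."""
    return win_probability * prize - ticket_price

# Illustrative figures only:
ev = expected_net_return(ticket_price=2.00,
                         prize=100_000_000,
                         win_probability=1 / 300_000_000)
# ev is about -$1.67: each ticket transfers roughly $1.67, on average,
# to the promoter, while the visible jackpot induces the overestimate.
```

The tournament structure of the startup economy works the same way: a handful of enormous, well-publicized payouts keeps the expected value of entry looking far better than it is.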

I hope this parody puts an end to my colleagues’ (and future techies’) claims to “revolutionize” and “make the world a better place” through software.