New York Times Says Research and Development Is Hard…But Maybe Not

At least that is what a reader is led to believe by this article that appeared over the weekend.  For those of you who didn’t catch it, Alphabet has an R&D shop, known as Google X under the old Google moniker, that does pure R&D.  According to the reporter, one Conor Dougherty, the problem, you see, is that R&D doesn’t always translate into direct short-term profit.  He then makes this absurd statement:  “Building a research division is an old and often unsuccessful concept.”  He knows this because some professor at Arizona State University–that world-leading hotbed of innovation and high tech–told him so.  (Yes, there is sarcasm in that sentence.)

Had Mr. Dougherty understood new technology, he would know that all technology companies are, at core, research organizations that sometimes make money in the form of net profits, just as someone once accurately described Tesla to me as a battery company that also makes cars (and lately it’s showing).  But let’s return to the howler of a statement about research divisions being unsuccessful, apply some, you know, facts and empirical thought, and go from there.

The most obvious example of a research division is Bell Labs.  From the article one would think that Bell Labs is a dinosaur of the past, but no, it still exists as Nokia Bell Labs.  Bell Labs was created in 1925 out of antecedents in both Western Electric and AT&T, though its true roots go back to 1880, when Alexander Graham Bell, after being awarded the Volta Prize for the invention of the telephone, opened Volta Labs in Washington, D.C.  But it was in the 1920s that Bell Labs, “the Idea Factory,” really hit its stride.  Its researchers improved telephone switching and sound transmission, and invented radio astronomy, the transistor, the laser, information theory (about which I’ve written extensively, and which directly bears on computing and software), Unix, and the programming languages C and C++.  Bell Labs established the precedent that researchers retained, and were compensated for, the use of their inventions and IP.  This goes well beyond the assertion in the article that Bell Labs largely made “contributions to basic, university-style research.”  I guess New York Times reporters, fact checkers, and editors don’t have access to the Google search engine or Wikipedia.

Between 1937 and 2014, seventeen of its researchers were awarded the Nobel Prize or the Turing Award.  Even those who never garnered such an award, like Claude Shannon of the aforementioned information theory, belong in the Who’s Who of high-tech research.  What they didn’t invent directly they augmented and brought to practical use, with a good deal of their work feeding public R&D through consulting and other contracts with the Department of Defense and the federal government.

The reason Bell Labs didn’t continue as a research division of AT&T wasn’t due to some dictate of the market or investor dissatisfaction.  On the contrary, AT&T (Ma Bell) dominated its market, and Bell Labs ensured that it stayed far ahead of any possible entrant.  This is why the U.S. Justice Department reached a divestiture agreement with AT&T under the antitrust laws, effective in 1984, separating AT&T (which retained Bell Labs) from its local carriers in order to promote competition.  Whether the divestiture agreement was a good deal for the American people and had positive economic effects is still a cause for debate, but it is likely that the plethora of choices in cell phone and other technologies that have emerged since that time would not have gone to market without that antitrust action.

Since 1984, Bell Labs has continued its significant contributions to the high-tech industry, first through AT&T Technologies, which was spun off in 1996 as Lucent Technologies, which is probably why Mr. Dougherty didn’t recognize it.  A merger with Alcatel and then acquisition by Nokia has provided it with its current moniker.  Over that period Bell Labs continued to innovate, contributing significantly to pushing the boundaries of broadband speed and the use of imaging technology in the medical field.

So what this shows is that, while not every bit of R&D leads directly to profit, especially in the short term, a mix of types of R&D does yield practical results.  Anyone who has worked in project management understands that R&D, by definition, represents the handling of risk.  Furthermore, the lessons learned and spin-offs are hard to estimate in advance, though they may result in practical technologies in the short and medium term.
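The argument can be made concrete with a little arithmetic.  Below is a minimal sketch in Python, using entirely hypothetical probabilities and payoffs (my own illustrative numbers, not anyone’s actual portfolio data), of why a portfolio of risky R&D projects can carry a positive expected value even though most individual projects fail outright.

```python
import random

random.seed(42)

# Hypothetical, illustrative numbers only: each R&D project costs 1.0 unit.
# Most fail outright, some yield modest spin-offs, and a rare breakthrough
# pays off enormously.
OUTCOMES = [
    (0.70, 0.0),   # 70% chance: total failure, no return
    (0.25, 1.5),   # 25% chance: modest practical spin-off
    (0.05, 30.0),  # 5% chance: breakthrough (transistor-class payoff)
]

def project_return():
    """Sample the return of a single R&D project."""
    roll, cumulative = random.random(), 0.0
    for probability, payoff in OUTCOMES:
        cumulative += probability
        if roll < cumulative:
            return payoff
    return OUTCOMES[-1][1]

def portfolio_value(n_projects=100, cost_per_project=1.0):
    """Net value of a portfolio: most projects lose, yet the mix wins."""
    returns = sum(project_return() for _ in range(n_projects))
    return returns - n_projects * cost_per_project

trials = [portfolio_value() for _ in range(10_000)]
print(f"Mean net value of a 100-project portfolio: {sum(trials) / len(trials):.1f}")
```

With these made-up numbers the expected value per project is 0.70 × 0 + 0.25 × 1.5 + 0.05 × 30 = 1.875 against a cost of 1.0, so the portfolio wins on average even though seventy percent of its projects return nothing.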

When one reads past the lede and the “research division is an old and often unsuccessful concept” gaffe, among others, what you find is that Google specifically wants this portion of the research division to come up with a series of what it calls “moon shots.”  In techie lingo this kind of result is often called a unicorn, and from personal experience I am part of a company that was recently characterized as delivering one.  This is simply a shorthand term for producing a solution that is practical, groundbreaking, and shifts the dialogue of what is possible.  (Note that I’m avoiding the tech-hipster term “disruption.”)

Another significant fact that we find out about Google X is the following:

X employees avoid talking about money, but it is not a subject they can ignore. They face financial barriers that can shut down a project if it does not pan out as quickly as planned. And they have to meet various milestones before they can hire more people for their teams.

This sounds a lot like project and risk management.  But Google X goes a bit further.

Failure bonuses are also an example of how X, which was set up independent of Google from the outset, is a leading indicator of sorts for how the autonomous Alphabet could work. In Alphabet, employees who do not work for Mother Google are supposed to have their financial futures tied to their own company instead of Google’s search ads. At X, that means killing things before they become too expensive.

Note that the incentive here, given in the form of a real financial reward to team members, is to manage risk.  No doubt there are no #NoEstimates cultists at Google.  Psychologically, providing an incentive to find failure defeats groupthink and optimism selection bias.  Much of this, particularly the expectation of non-existential failure, sounds amazingly like an article recently published on AITS.org by yours truly.
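To see why rewarding the discovery of failure is sound risk management, a back-of-the-envelope expected-value comparison helps.  The figures below are entirely hypothetical, not Google’s; the point is the structure of the comparison, not the numbers.

```python
# Illustrative numbers only (not Google's actual figures): why paying a
# "failure bonus" to kill a doomed project early is cheap risk management.

burn_rate_per_quarter = 2_000_000   # hypothetical cost to keep the team going
quarters_remaining = 4              # hypothetical remaining life of the project
probability_of_rescue = 0.05        # hypothetical odds the project turns around
rescue_payoff = 20_000_000          # hypothetical value if it does
failure_bonus = 500_000             # hypothetical bonus for killing it now

expected_value_of_continuing = (
    probability_of_rescue * rescue_payoff
    - burn_rate_per_quarter * quarters_remaining
)
cost_of_killing_now = -failure_bonus

print(f"Expected value of continuing: {expected_value_of_continuing:,.0f}")
print(f"Cost of killing now:          {cost_of_killing_now:,.0f}")
# Continuing: 0.05 * 20M - 8M = -7M. Killing now: -0.5M.
# The bonus aligns the team's incentive with the economically correct choice.
```

Without the bonus, the team’s personal incentive (keep the project, keep the jobs) points the opposite way from the economics; the bonus removes that conflict.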

The delayed profitability of software and technology companies is commonplace.  The reason, at least to my thinking, is that any technology type worth their salt will continue to push the technology once the first version is brought to market.  If you’re resting on your laurels then you’re no longer in the software technology business; you’re in the retail business and might as well be selling candy bars or any other consumer product.  What you’re not doing is providing a solution that is essential to the target domain.  Practically, this means that in garnering value, net profitability is not necessarily the measure of success, especially in the first years.

For example, market leaders such as Box, Workday, and Salesforce have gone years without a net profit, though their revenues and market share are significant.  Facebook did not turn a profit for five years; Amazon took six, and even those figures were questionable.  The competing needs for any executive running a company are value (the intrinsic value of IP, the existing customer base, and the potential customer base) and profit.  The CEO’s responsibility is not just to stockholders, yet the article in its lede is clearly biased that way.  The fiduciary and legal responsibility of the CEO is to the customers, the employees, the entity, and the stockholders–and not necessarily in that order.  There is thus a natural conflict in balancing these competing interests.

Overall, if one ignores the contributions of the reporter, the case of Google X is a fascinating one for its expectations and handling of risk in R&D-focused project management.  It takes value where it can and cuts its losses through incentives to find risk that can’t be handled.  An investor who lives in the real world should find this reassuring.  Perhaps these lessons on incentives can be applied elsewhere.


Rise of the Machines — Drivers of Change in Business and Project Management

Last week I found myself in business development mode, as I often am, explaining to a prospective client our future plans in terms of software development.  The point I was making was that it is not our goal to simply reproduce the functionality that every other software solution provider offers, but to improve how the industry does business: to apply appropriate technology to deliver efficiencies, eliminate redundancy, and improve productivity so compellingly that not making the change would be deemed foolish.  In sum, we are out to take a process and improve on it through the application of disruptive technology.  I highlighted my point by stating:  “It is not our goal to simply reproduce functionality so we can party like it’s 1998.  It’s been eight software generations since that time, and technology has provided us smarter and better ways of doing things.”

I received the usual laughter and acknowledgement from some of the individuals to whom I was making this point, but one individual rejoined: “Well, I don’t mind doing things like we did in 1998,” or words to that effect.  I acknowledged the comment, but then reiterated that our goal was somewhat more proactive.  We ended the conversation in a friendly manner, and I was invited to come back and show our new solution upon its release to market.

Still, the rejoinder of being satisfied with things the way they are has stuck with me.  No doubt being a nerd (and years as a U.S. Navy officer) inculcated in me a drive for constant process improvement.  My default position going into a discussion is that the individuals I am addressing share that attitude.  But that is not always the case.

The kneejerk position of other geeks when confronted by resistance to change is often derision.  But not every critic or skeptic is a Luddite, and it is important to understand the basis for both criticism and skepticism.  For many of our colleagues in the project management world, software technology is just a software application, something that “looks in the rearview mirror.”  This meme is pervasive, but it is wrong.  Understanding why it is wrong is important to addressing the concerns behind it in an appropriate manner.

This view is wrong because the first generations of software serving this market simply replicated the line-and-staff, specialization, and business process and analysis regime that existed prior to digitization.  Integration of data that could provide greater insight was not possible at the level of detail needed to establish confidence.  The datasets from which we derived our information were not flexible, nor did they allow for widespread distribution of more advanced corporate and institutional knowledge.  In fact, the first software generation in project management often supported and sustained the subject matter expert (SME) framework, in which only a few individuals possessed advanced knowledge of methods and analytics, and the organization had to rely upon them.

We still see this structure in place in much of industry and government–and it is self-sustaining, since it involves not only individuals within the organization who possess this attribute, but also a plethora of support contractors and consultants who have built their businesses to support it.

Additional resistance comes from individuals who have dealt with new entries in the past that turned out to offer only incremental or marginal improvements over what was already in place, not to mention the few bad actors that come along.  Established firms in the market take this approach in order to defend market share and, like the SME structure, it is self-sustaining, attempting to establish a barrier to new entrants into the market.  At the same time these firms establish an environment of stability and security that buyers are hesitant to leave; thus the prospective customer is content to “party like it’s 1998.”

Value proposition alone will not change the minds of those who are content.  You sell what a prospective customer needs, not usually solely what they want.  For those introducing disruptive innovation, the key is to be at the forefront in shifting what defines market need.

For example, in business and project systems the focus has always been on “tools.”  Given the engineering domain that is dominant in many project management organizations, such terminology provides a comfortable and familiar way of addressing technology.  Getting the “right set of tools” and “using the right tool for the work” are the implicit assumptions behind such simplistic metaphors.  This has caused many companies and organizations to issue laundry lists of features and functionality in order to compare solutions when doing market surveys.  Such lists are self-limiting, supporting the self-reinforcing systems mentioned above.  Businesses that rely on this approach to the technology market are not open to leveraging the latest capabilities to improve their systems.  The metaphor of the “tool” is an out-of-date one.

The shift, which is accelerating in the commercial world, is an emphasis on software technology focused on the capabilities inherent in the effective use of data.  In today’s world data is king, and the core issue is who owns the data.  I referred to some of the new metaphors for data in my last post and, no doubt, new ones will arise.  What is important to know about the shift to an emphasis on data and its use is that it is driving organizational change that not only breaks down the “tool”-based approach to the market, but also undermines the software market’s emphasis on tool functionality and the organizational structure and support market built on the SME.

There is always fear surrounding such rapid change, and I will not argue against the fact that some of it needs to be addressed.  For example, the rapid displacement through digitization of human-centered manual work that once required expertise and paid well will soon become one of the most important challenges of our time.  I am optimistic that the role of the SME simply needs to follow the shift, but I have no doubt that the shift will require fewer SMEs.  The underlying economics, however, will make the shift both compelling and necessary.

Very soon, it will be impossible to “party like it’s 1998” and still be in business.

Over at AITS.org — Black Swans: Conquering IT Project Failure & Acquisition Management

It’s been out for a few days, but I failed to mention the latest article at AITS.org.

In my last post on the Blogging Alliance I discussed information theory, the physics behind software development, the economics of new technology, and the intrinsic obsolescence that exists as a result. Dave Gordon in his regular blog described this work as laying “the groundwork for a generalized theory of managing software development and acquisition.” Dave has a habit of inspiring further thought, and his observation has helped me focus on where my inquiries are headed…

To read more please click here.

Forget Domani — The Inevitability of Software Transitioning and How to Facilitate the Transition

The old Perry Como* chestnut takes its title from the Italian word for “tomorrow” and is the Italian way of repeating–in a more romantic manner–Keynes’s dictum that “in the long run we are all dead.”  Whenever I hear polemicists talk about the long run, or invoke the interests of their grandchildren as trumping immediate concerns and decisions, I brace myself for the Paleolithic nonsense that is to follow.  While such opinions carry a gloss of plausibility, at worst they are simply fabrications to hide self-interest, a form of tribalism, or ideology; at best they are based on fallacious reasoning, fear, or the effects of cognitive dissonance.

While not as important as the larger issues affecting society, we see this same type of thinking when people and industries are faced with rapid change in software.  I was reminded of this when I sat down to lunch with a colleague who was being forced to drop an established software system used in project management.  “We spent so much time and money to get it to finally work the way we want it, and now we are going to scrap it,” he complained.  Being a good friend–and knowing the individual to be thoughtful in expressing opinions–I pressed him a bit.  “But was your established system doing what it needed to do to meet your needs?”  He thought a moment.  “Well, it served our needs up to now, but it was getting very expensive to maintain and took a lot of workarounds.  Plus the regulatory requirements in our industry are changing and it can’t make the jump.”  When I pointed out that it sounded as if the decision to transition was the right one, he ended with:  “Yes, but I’m within a couple of years of retirement and I don’t need another one of these.”

Thus, within the space of one conversation were all the reasons we usually hear as excuses for not transitioning to new software.  In markets dominated by a few players with aging and soon-to-be-obsolete software, this is a common refrain.  Any one of these rationales, put in the mouth of a senior decision-maker, will kill a deal.  Other rationales are rooted in a Sharks vs. Jets mentality, in which the established software’s user community rallies around the soon-to-be-obsolete application.  This is particularly prevalent in enterprise software environments.  It is usually combined with uninformed attacks, sometimes initiated by the established market holder directly or through proxies, on the reliability, scale, and functionality of the new entries.  The typical defensive maneuver is to declare that at some undetermined date in the future–domani–an update is on the way that will match or exceed what the new entries possess.  Hidden from the non-tech-savvy is the reality that the established software is written in old technology and languages, oftentimes requiring an entire rewrite that will take years.  Though possessing the same brand name, the “upgrade” will, in effect, be new, untested software written in haste to defend market share.

As a result of many years of marketing and selling various software products, certain of which were and are game-changing in their respective markets, I have compiled a list of typical objections to software transitioning and the means of addressing those concerns.  One should not take this as an easy “how-to” guide.  There is no substitute for understanding your market, understanding the needs of the customer, having the requisite technical knowledge of the regulatory and systematic requirements of the market, and possessing a concern for the livelihood of your customers that is then translated into a trusting and mutually respectful relationship.  If software is just a euphemism for making money–and there are some very successful companies that take this approach–this is not the blog post for you: you might as well be selling burgers and tacos.

1.  Sunk vs. Opportunity Costs.  This is an old one, and I find it interesting that this thinking persists.  The classic comparison for understanding the sunk-cost fallacy was first brought up in a class I attended at Pepperdine University many years ago.  A friend of the professor couldn’t decide whether he should abandon the expensive TV antenna he had purchased just a year before in favor of the new-fangled cable television hookup that had just been introduced into his neighborhood.  The professor explained to his friend that the money he had spent on the antenna was irrelevant to his decision.  That money was gone–it was “sunk” into the old technology.  The relevant question was: what is the cost of not taking the best alternative now, that is, the cost of not putting a resource to its best use?  When we persist in using old technologies to address new challenges, there comes a point where the costs associated with the old technology no longer represent the most effective use of resources.  That is the point at which the change must occur.  In practical terms, if the overhead associated with the old technology is too high given the payoff, and if there are gaps and workarounds in using the old technology that sub-optimize and waste resources, then it is time to make a change.  The economics dictate it, and this can be both articulated and demonstrated using a business case, as in the sketch below.
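To make point 1 concrete, here is a toy calculation with hypothetical figures.  Notice that the money already sunk into the old system never appears in the comparison; only forward-looking costs do.

```python
# A minimal sketch of the sunk-cost decision, using hypothetical figures.
# Whatever was already spent on the old system is gone ("sunk") and is
# deliberately absent below: only forward-looking costs matter.

old_system_annual_overhead = 120_000   # hypothetical maintenance + workarounds
new_system_annual_cost = 60_000        # hypothetical license + support
new_system_transition_cost = 100_000   # hypothetical one-time migration cost
horizon_years = 5

cost_of_staying = old_system_annual_overhead * horizon_years
cost_of_switching = (new_system_transition_cost
                     + new_system_annual_cost * horizon_years)

print(f"Stay with old system over {horizon_years} years:  {cost_of_staying:,}")
print(f"Switch to new system over {horizon_years} years: {cost_of_switching:,}")
# 600,000 vs. 400,000: once forward-looking costs favor the switch, the change
# is due, no matter how much was "sunk" into the old technology.
```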

2.  Need vs. Want.  Being a techie, I often fall into the same trap as most techies: some esoteric operation or functionality is achieved and I marvel at it.  Then, when I show it to a non-techie, I am puzzled when the intended market responds with a big yawn.  In this same category are people on the customer side of the equation who are looking at the latest technologies but do not have an immediate necessity that propels the need for a transition.  This is often looked at as just “checking in” and, on the sales side, the equivalent of kicking the tires.  These opposing examples outline one of the core elements that support a transition:  in most cases businesses will buy when they have a need, as opposed to a want.  Understanding the customer’s needs–and what propels a change based on necessity, whether it be a shift in the regulatory or technological environment that changes the baseline condition–is the key to understanding how to support a transition.  This assumes, of course, that the solution one is offering meets the baseline condition to support the shift.  Value and pricing also enter into this equation.  I remember dealing with a software company a few years ago where I noted that their pricing was much too high for the intended market.  “But we offer so much more than our competition” came the refrain.  The problem, however, was that the market did not view the additional functionality as essential.  Any price multiplied by zero equals zero, regardless of how we view the value of an offering.

3.  Acts of Omission and Acts of Commission.  The need for technological transition is, once again, dictated by a need of the organization arising from either internal or external factors.  In my career as a U.S. Navy officer, we were trained to make decisions and take vigorous action whenever presented with a challenge.  The dictum in this case is that an act of commission, that is, having taken diligent and timely action in the face of a perceived threat, is defensible, even if someone second-guesses those decisions and is critical of them down the line; but an act of omission, ignoring a threat or allowing events to unfold on their own, is always unforgivable.  Despite the plethora of books, courses, and formal education regarding leadership, there is still a large segment of business and government that prefers to avoid risk by avoiding decisions.  Businesses operating at optimum effectiveness perform under a sense of urgency.  Software providers, however, must remember that their sense of urgency in making a sale does not mean that the prospective customer’s sense of urgency is in alignment.  This is a variation of the need-vs.-want factor: understanding the business, and then effectively communicating to the customer the events that are likely to occur due to non-action, is the key component in overcoming this roadblock.  Once again, this assumes that the proposed solution actually addresses the risk associated with an act of omission.

4.  Not Invented Here.  I have dealt with this challenge in a previous blog post.  Establishing a learning organization is essential under the new paradigm of project management, in which there is more emphasis on a broader sense of integration across what were previously identified as the divisions of labor in the community.  Hand-in-hand with this challenge is the perception, often based on a lack of information, that the requirements of the organization are so unique that only a ground-up, customized solution will do, usually militating against commercial-off-the-shelf (COTS) technologies.  This often takes the form of internal IT shops building business cases to develop the system in-house directly to code, or of environments in which users have filled the gaps in their systems with Excel spreadsheets of their own construction.  In one case the objection to the proposed COTS solution was that the users “really liked” their pivot tables.  (Repeat after me:  Excel is not a project management system, Excel is not a project management system, Excel is not a project management system.)  As we drive toward the integration of more data involving millions of records, such rationales are easily countered.  Countering them assumes, however, that the software provider possesses a solution that is both powerful and flexible, that is, one that can both handle Big Data and integrate data, not just through data conversion, normalization, and rationalization, but also through the precise use of APIs, as in the sketch below.  In this last case, we are not talking about glorified query engines against SQL tables, but systems with built-in smarts, inherited from the expertise of the developers, that properly identify and associate data so that it is transformed into information establishing an effective project management and control environment.
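Here is a minimal sketch of the kind of normalization and association just described.  All field names, source formats, and the schema itself are hypothetical, invented for illustration; the point is that once disparate sources are rationalized into a common schema, records can be associated precisely rather than hand-stitched in spreadsheets.

```python
# A sketch of normalizing records from two hypothetical sources (a user-built
# Excel export and a system API) into one schema, so they can be associated
# by a shared key. All names here are illustrative, not a real product's API.

from dataclasses import dataclass

@dataclass
class CostRecord:
    wbs_element: str      # work breakdown structure ID, the shared key
    period: str           # reporting period, e.g. "2016-03"
    actual_cost: float

def from_spreadsheet_row(row: dict) -> CostRecord:
    """Rationalize a row exported from a user-built Excel workbook."""
    return CostRecord(
        wbs_element=row["WBS"].strip().upper(),
        period=row["Month"],
        actual_cost=float(row["ACWP"].replace(",", "")),
    )

def from_api_payload(item: dict) -> CostRecord:
    """Rationalize a record pulled from a (hypothetical) system API."""
    return CostRecord(
        wbs_element=item["wbsId"].upper(),
        period=item["reportingPeriod"],
        actual_cost=item["actualCostOfWorkPerformed"],
    )

def associate(records: list[CostRecord]) -> dict[tuple[str, str], float]:
    """Once both sources speak the same schema, association is a simple join."""
    totals: dict[tuple[str, str], float] = {}
    for r in records:
        key = (r.wbs_element, r.period)
        totals[key] = totals.get(key, 0.0) + r.actual_cost
    return totals

records = [
    from_spreadsheet_row({"WBS": "1.2.3 ", "Month": "2016-03", "ACWP": "12,500.00"}),
    from_api_payload({"wbsId": "1.2.3", "reportingPeriod": "2016-03",
                      "actualCostOfWorkPerformed": 4_200.0}),
]
print(associate(records))   # {('1.2.3', '2016-03'): 16700.0}
```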

5.  I Heard It Through the Grapevine.  Nothing is harder to overcome than a whisper campaign generated by competitors or their proxies.  I know of cases in which enterprise systems involving billions of dollars of project value were successfully implemented, only to have the success questioned in a meeting through the spread of disinformation, or acknowledged only in a backhanded manner.  The response to this kind of challenge is to put the decision makers in direct touch with your customers.  In addition, live demos using releasable data, or notional data equivalent to the customer’s work, are essential in demonstrating functionality.  Finally, the basics of software economics dictate that, for an organization to understand whether a solution is appropriate for its needs, some effort in terms of time and resources must be expended in evaluating the product.  For those offering solutions, the key to effectively communicating the value of your product, and to not falling into a trap of your competitors’ making, is to ensure that the pilot does not devolve into a trained-monkey test, in which potentially unqualified individuals attempt to operate the software on their own with little or no supervision or training, and without the effective communication that would support the pilot in the same way an implementation would normally be handled.  Propose a pilot that is structured, has a time limit and a limit to scope, and in which direct labor and travel, if necessary, are reimbursed.  If everyone is professional and serious, this will be a reasonable approach that ensures a transparent process for both parties.

6.  The Familiar and the Unknown.  Given the high failure rate associated with IT projects, one can understand the hesitancy of decision makers to take that step.  A bad decision in selecting a system can, and has, brought organizations to their knees.  Furthermore, studies in human behavior demonstrate that people tend to favor the familiar, even in cases where a possible alternative is better but unknown.  This is known as the mere-exposure effect.  Daniel Kahneman, in his groundbreaking book Thinking, Fast and Slow, outlines other cognitive fallacies built into our wiring.  New media and technology only magnify these effects.  The challenge, then, is for the new technological solution provider to address the issue of familiarity directly.  Toward this end, software providers must establish trust and rapport with their market, prove their expertise not just in the technical matters of software and computing but also in the business processes and needs of the market, and establish their competency in the issues affecting the market.  A proven track record of honesty, open communication, and fair dealing is also essential to overcoming this last challenge.

*I can’t mention this song without also noting that the Chairman of the Board, Frank Sinatra, recorded a great version of it, and that Connie Francis also made it a hit in the 1960s.  It was also the song that Katyna Ranieri made famous in the Shirley MacLaine movie The Yellow Rolls-Royce.