Sunday Contemplation — Finding Wisdom — Daniel Dennett in “Darwin’s Dangerous Idea”

Daniel Dennett

“The Darwinian Revolution is both a scientific and a philosophical revolution, and neither revolution could have occurred without the other. As we shall see, it was the philosophical prejudices of the scientists, more than their lack of scientific evidence, that prevented them from seeing how the theory could actually work, but those philosophical prejudices that had to be overthrown were too deeply entrenched to be dislodged by mere philosophical brilliance. It took an irresistible parade of hard-won scientific facts to force thinkers to take seriously the weird new outlook that Darwin proposed…. If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law. But it is not just a wonderful scientific idea. It is a dangerous idea.”

Daniel Dennett (pictured above thanks to Wikipedia) is the Co-Director of the Center for Cognitive Studies and Austin B. Fletcher Professor of Philosophy at Tufts University.  He is also known as “Dawkins’ Bulldog” for his pointed criticism of what he viewed as unnecessary revisions to Darwinian Theory by Stephen Jay Gould (a previous subject of this blog) and others.  In popular culture he has also been numbered among the “Four Horsemen” of the so-called “New Atheism”.  His intellectual and academic achievements are many, and his insights into evolution, social systems, cognition, consciousness, free will, philosophy, and artificial intelligence are extremely influential.

Back in 1995, when I was a newly minted Commander in the United States Navy, I happened across an intriguing book in a Jacksonville, Florida bookshop during a temporary duty assignment.  The book was entitled Darwin’s Dangerous Idea: Evolution and the Meanings of Life.  I opened it that afternoon on a gentle early-spring Florida day and found myself astounded, my mind liberated, as if chains I had not previously noticed had been broken, so great was the influence of the philosophical articulation of this “dangerous idea”.

Here, for the first time, was a book that took what we currently know about the biological sciences and placed it within the context of other scientific domains–and did so in a highly organized, articulate, and readable manner.  The achievement of the book was not so much in deriving new knowledge, but in presenting an exposition of the known state of the science and tracing its significance and impact–no mean achievement given the complexity of the subject matter and the depth and breadth of knowledge being covered.  The subject matter, of course, is highly controversial only because it addresses subjects that engender the most fear: the facts of human origins, development, nature, biological interconnectedness, and the inevitability of mortality.

Dennett divides his thesis into three parts: the method of developing the theory and its empirical proofs, its impact on the biological sciences, and its impact on other disciplines, especially regarding consciousness, philosophy, sociology, and morality.  He introduces and develops several concepts, virtually all of which have since become cornerstones in human inquiry, and not only among the biological sciences.

Among these are the concepts of design space, of natural selection behaving as an algorithm, of Darwinism acting as a “universal acid” that transforms the worldview of everything it touches, and of the mental concepts of skyhooks, cranes, and “just-so” stories–skyhooks and just-so stories being fallacious and magical ways of thinking that have no underlying empirical foundation to explain natural phenomena, cranes being the legitimate, incremental mechanisms that actually do the lifting.

The concept of the design space has troubled many, though not most, evolutionary biologists and physicists, only because Dennett posits a philosophical position in lieu of a mathematical one.  This does not necessarily undermine his thesis, simply because one must usually begin with a description of a thesis before one can determine whether it can be disproven.  Furthermore, Dennett is a philosopher of the analytical school, and so the scope of his work is designed from that perspective.

But there are examples that approach an analogue of design space in physics–those that visualize space-time and general relativity, as at this site.  It is not a stretch to understand that our reality–the design space that the earth inhabits, among many alternative design spaces that may exist relating to biological evolution–can eventually be mathematically formulated.  Given that our knowledge of comparative planetary and biological physics is still largely speculative and relegated to cosmology, the analogy for now is sufficient and understandable.  It also gives a new cast to the concept of adaptation, away from the popular (and erroneous) notion of “survival of the fittest”, since fitness is based on the ability to adapt to environmental pressures and to find niches that may exist in that environment.  As we trace the effects of climate change on species, we will be witnessing firsthand the brutal logic of design space.

Going hand-in-hand with design space is the concept that Darwinian evolution through the agent of natural selection is an algorithmic process.  This understanding becomes “universal acid” that, according to Dennett, “eats through just about every traditional concept and leaves in its wake a revolutionized world-view.”
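To make the “algorithm” claim concrete, here is a minimal sketch in Python, in the spirit of Dawkins’ well-known “weasel” toy program rather than anything drawn from Dennett’s text (the target phrase, mutation rate, and population size are arbitrary choices for illustration): blind variation plus selective retention, iterated, is all the procedure requires.

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"  # arbitrary target phrase for the toy
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def fitness(candidate: str) -> int:
    """Count the characters that already match the target."""
    return sum(c == t for c, t in zip(candidate, TARGET))

def mutate(candidate: str, rate: float = 0.05) -> str:
    """Copy the string with occasional random copying errors."""
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in candidate)

# Variation + heredity + selection, iterated: the whole "algorithm".
parent = "".join(random.choice(ALPHABET) for _ in TARGET)
generation = 0
while fitness(parent) < len(TARGET):
    offspring = [mutate(parent) for _ in range(100)]
    parent = max(offspring + [parent], key=fitness)  # selection keeps the fittest copy
    generation += 1

print(f"Reached the target in {generation} generations.")
```

The point of the toy is substrate neutrality: nothing in the loop knows anything about meaning or purpose, yet the target is reliably reached.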

One can understand the objection of philosophers and practitioners of metaphysics to this concept, which many of them have characterized as nihilistic.  This, of course, is argument from analogy–a fallacious form of rhetoric.  The objection to the book through these arguments, regardless of the speciousness of their basis, is also premature: it is a charge to which Dennett responds effectively in his book Consciousness Explained.  It is in that volume that Dennett addresses the basis for the conscious self, “intentionality”, and the concept of free will (and its limitations)–what in the biological and complexity sciences is described as emergence.

What Dennett has done through describing the universal acid of Darwinian evolution is to describe a phenomenon: the explanatory reason for the rapid social change that we have witnessed and are witnessing, and the resulting reaction and backlash to it.  For example, the revolution engendered by the Human Genome Project has not only confirmed our species’ place in the web of life on Earth and our evolutionary place among primates, but also the interconnections deriving from the entire human species’ descent from common ancestors, exploding the concept of race and any claim to inherent superiority or inferiority for any cultural grouping of humans.

One can clearly see the threat this basic truth has to entrenched beliefs deriving from conservative philosophy, cultural tradition, metaphysics, religion, national borders, ethnic identity, and economic self-interest.

For it is apparent to me, given my reading not only of Dennett, but also that of both popularizers and the leading minds in the biological sciences, including Dawkins, Goodall, Margulis, Wilson, Watson, Venter, Crick, Sanger, and Gould; in physics Hawking, Penrose, Weinberg, Guth, and Krauss; in mathematics Wiles, Witten, and Diaconis; in astrophysics Sandage, Sagan, and deGrasse Tyson; in climate science Hansen and many others; and in the information sciences Moore, Knuth, and Berners-Lee, that we are in the midst of another intellectual revolution.  This intellectual revolution far outstrips both the Renaissance and the Enlightenment as periods of human achievement and advancement, if only because the widespread availability of education, literacy, healthcare, and technology, as well as human diversity, both accelerates and expands many times over the impact of each increment in knowledge.

When one realizes that both of those earlier periods of scientific and intellectual advance engendered significant periods of social, political, and economic instability, upheaval, and conflict, then the reasons for many of the conflicts in our own times become clear.  It was apparent to me then–and even more apparent to me now–that there will be a great overturning of the institutional, legal, economic, social, political, and philosophic ideas and structures that now exist as a result.  We are already seeing the strains in many areas.  No doubt there are interests looking to see if they can capitalize on or exploit these new alignments.  But for those overarching power structures that exert control, conflict, backlash, and eventual resolution are inevitable.

In this way Fukuyama was wrong in the most basic sense in his thesis in The End of History and the Last Man to the extent that he misidentified ideologies as the driving force behind the future of human social organization.  What he missed in his social “science” (*) is the shift to the empirical sciences as the nexus of change.  The development of analytical philosophy (especially American Pragmatism) and more scientifically-based modeling in the social sciences are only the start, but one can make the argument that these ideas have been more influential in clearly demonstrating that history, in Fukuyama’s definition, is not over.

Among the first shots over the bow from science into the social sciences have come works from such diverse writers as Jared Diamond (Guns, Germs, and Steel: The Fates of Human Societies (1997)) and Sam Harris (The Moral Landscape: How Science Can Determine Human Values (2010)).  The next wave will, no doubt, be more intense and drive further resistance and conflict.

The imperative of science informing our other institutions is amply demonstrated by two facts.

  1. On March 11, 2016 an asteroid large enough to extinguish a good part of all life on earth came within 19,900 miles of our planet’s center.  This was not as close, however, as the one that passed on February 25 (8,900 miles).  There is no invisible shield or Goldilocks Zone to magically protect us.  The evidence of previous life-ending collisions is more apparent with each new high resolution satellite image of our planet’s surface.  One day we will look up and see our end slowly but inevitably making its way toward us, unless we decide to take measures to prevent such a catastrophe.
  2. Despite the desire to deny that it’s happening, 2015 was the hottest year on record and 2016 thus far is surpassing it, providing further empirical evidence of the validity of global warming models.  In fact, three of the last four years rank among the four hottest on record (2014 was the previous record holder).  The outlier is 2010, another previous high, which is hanging in at number 3 for now.  2013 is at number 4 and 2012 at number 8.  Note the general trend.  As Jared Diamond has convincingly demonstrated, the basis of conflict and societal collapse is usually rooted in population pressures exacerbated by resource scarcity.  We are just about at the point of no return, given the complexity of the systems involved, and can only mitigate the inevitable–but we must act now to do so.

What human civilization does not want is to be on the wrong side of history in how it deals with these challenges.  Existing human power structures and interests would like to keep the scientific community within the box of technology–and no doubt there are still scientists who are comfortable staying within that box.

The fear regarding allowing science to move beyond the box of technology and general knowledge is its misuse and misinterpretation, usually by non-scientists, as with the reprehensible meme of Social Darwinism (which is neither social nor Darwinian).**  This fear is oftentimes transmitted by people with a stake in controlling the agenda or in interpreting what science has determined.  Science’s contingent nature is also a point of fear.  While few major theories are ever completely overturned as new knowledge is uncovered, the very nature of revision and adjustment to theory is frightening to people who depend on at least the illusion of continuity and hard truths.  Finally, science puts us in our place within the universe.  If there are millions of planets that can harbor some kind of life, and a subset of those has the design space to allow for some kind of intelligent life (as we understand that concept), are we really so special after all?

But not only within the universe.  Within societies, if all humans have developed from a common set of ancestors, then our basic humanity is a shared one.  If the health and sustainability of an ecology is based on its biodiversity, then the implication for human societies is likewise found in diversity of thought and culture, eschewing tribalism and extreme social stratification.  If the universe is deterministic with only probability determining ultimate cause and effect, then how truly free is free will?  And what does this say about the circumstances in which each of us finds him or herself?

The question now is whether we embrace our fears, manipulated by demagogues and oligarchs, or embrace the future, before the future overwhelms and extinguishes us–and to do so in a manner that is consistent with our humanity and ethical reasoning.

 

Note:  Full disclosure.  As a senior officer concerned with questions of AI, cognition, and complex adaptive systems, I opened a short correspondence with Dr. Dennett about those subjects.  I also addressed what I viewed as his unfair criticism (being Dawkins’ Bulldog) of punctuated equilibrium, spandrels, and other minor concepts advanced by Stephen Jay Gould, offering a view of how Gould’s concepts fit well within Darwinian Theory, as well as being both interesting and explanatory.  Given that the less complex adaptive systems we can observe do display punctuated periods of rapid development–and also continue to carry the vestiges of previous adaptations that no longer have a purpose–it seemed to me that larger systems must do so as well, the punctuation being on a different time-scale, and that any adaptation cannot be precise given that biological organisms are imprecise.  He was most accommodating and patient, and this writer learned quite a bit in our short exchange.  My only regret is not having continued the conversation.  I do agree with Dr. Dennett (and others) in their criticism of non-overlapping magisteria (NOMA), as is apparent in this post.

I Can’t Drive 55 — The New York Times and Moore’s Law

Yesterday the New York Times published an article about Moore’s Law.  It is interesting in that John Markoff, the Times science writer, speculates that in about 5 years the computing industry will be “manipulating material as small as atoms” and may therefore hit a wall in what has become a back-of-the-envelope calculation of the multiplicative nature of computing complexity and power in the silicon age.

This article prompted a follow-on from Brian Feldman at NY Mag, noting that the Institute of Electrical and Electronics Engineers (IEEE) has anticipated a broader definition of the phenomenon of the accelerating rate of computing power to take into account quantum computing.  Note that the definition used in this context is the literal one: the doubling of the number of transistors that can be placed on a microchip over time.  That is a correct summation of what Gordon Moore said, but it is not how Moore’s Law is viewed or applied within the tech industry.

Moore’s Law (which is really a rule of thumb or guideline rather than an ironclad law) has been used, instead, as an analogue to describe the geometric acceleration that has been seen in computer power over the last 50 years.  As Moore originally described the phenomenon in 1965, the number of transistors doubled every year; he later revised the period to roughly every two years, and the industry has popularly split the difference at about 18 months.  Furthermore, aside from increasing transistor counts, there are many other parallel strategies that engineers have applied to increase speed and performance.  When we combine the observation of Moore’s Law with other principles tied to the physical world, such as Landauer’s Principle and Information Theory, we begin to find a coherence in our observations that is truly tied to physics.  Thus, quantum computing, to which the articles refer, sits on a continuum with Moore’s Law (and the other principles and theory noted above) rather than representing a break with these concepts.
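As a rough back-of-the-envelope illustration of why the assumed doubling period matters so much, here is a small sketch (the starting count is notional, roughly a 4004-era chip of 1971, and the periods are purely illustrative) of how transistor counts compound over 45 years:

```python
def transistor_count(start: float, years: float, doubling_period_years: float) -> float:
    """Compound growth: the count doubles once every doubling_period_years."""
    return start * 2 ** (years / doubling_period_years)

# Notional starting point: ~2,300 transistors (Intel 4004-class, 1971).
for period in (1.0, 1.5, 2.0):
    projected = transistor_count(2_300, years=45, doubling_period_years=period)
    print(f"Doubling every {period} years -> ~{projected:,.0f} transistors after 45 years")
```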

Bottom line: computing, memory, and storage systems are becoming more powerful, faster, and expandable.

Thus, Moore’s Law in terms of computing power looks like this over time:

Moore's Law Chart

Furthermore, when we calculate the cost associated with erasing a bit of memory, we begin to approach identifying the Demon* that would defy the Second Law of Thermodynamics.

Moore's Law Cost Chart

Note, however, that the Second Law is not really being defied; it is just that we are constantly approaching zero, though never actually achieving it.  But the principle here is that the marginal cost associated with each additional bit of information becomes vanishingly small, to the point of not passing the “so what” test, at least in everyday life.  Though, of course, when we get to neural networks and strong AI such differences are very large indeed–akin to the way approximate mathematics is adequate when we want to travel from, say, San Francisco to London, but more rigor and fidelity are required when traveling from Kennedy Space Center to Gale Crater on Mars.
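For the curious, Landauer’s bound has a very simple form: the minimum energy required to erase one bit is kT·ln 2.  A quick sketch (room temperature assumed) shows just how vanishingly small that floor is in everyday terms:

```python
import math

BOLTZMANN = 1.380649e-23  # J/K (exact value in the 2019 SI definition)

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Minimum energy to erase one bit of information at a given temperature."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

limit = landauer_limit_joules(300.0)                         # roughly room temperature
print(f"Landauer limit at 300 K: {limit:.3e} J per bit")     # about 2.87e-21 J
print(f"Erasing a full gigabyte: {limit * 8e9:.3e} J")       # still only ~2e-11 J
```

Real hardware dissipates many orders of magnitude more than this per bit, which is why the cost curve keeps falling without ever touching the floor.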

The challenge, then, in computing is to be able to effectively harness such power.  Our current programming languages and operating environments are only scratching the surface of how to do this, and the joke in the industry is that the speed of software is inversely proportional to the advance in computing power provided by Moore’s Law.  The issue is that our brains, and thus the languages we use to harness computational power, are based in an analog understanding of the universe, while the machines we are programming are digital.  For now this mismatch can only produce bad software and robots, but given our drive into the brave new world of heuristics, it may lead us to Skynet and the AI apocalypse if we are not careful–making science fiction, once again, science fact.

Back in the present, however, what this means is that for at least the next decade we will see an acceleration in the ability to use more and larger sets of data.  The risk, which we seem to have to relearn as a new generation of techies lacking a well-rounded liberal arts education enters the market, is that the basic statistical and scientific rules in the conversion, interpretation, and application of intelligence and information can still be roundly abused and violated.  Bad management, bad decision making, bad leadership, bad mathematics, bad statisticians, specious logic, and plain old common human failings are just made worse, with greater impact on more people, given the misuse of that intelligence and information.

The watchman against these abuses, then, must be incorporated into the solutions that use this intelligence and information.  This is especially critical given the accelerated pace of computing power, and the greater interdependence of human and complex systems that this acceleration creates.

*Maxwell’s Demon

Note:  I’ve defaulted to the Wikipedia definitions of both Landauer’s Principle and Information Theory for the sake of simplicity.  I’ve referenced more detailed work on these concepts in previous posts and invite readers to seek those out in the archives of this blog.

The Revolution Will Not Be Televised — The Sustainability Manifesto for Projects

While doing stuff and living life (which seems to take me away from writing) there were a good many interesting things written on project management.  The very insightful Dave Gordon at his blog, The Practicing IT Project Manager, provides a useful weekly list of the latest contributions to the literature that are of note.  If you haven’t checked it out please do so–I recommend it highly.

While I was away Dave posted an interesting link on the concept of sustainability in project management.  Along those lines, three PM professionals have proposed a Sustainability Manifesto for Projects.  As Dave points out in his own post on the topic, it rests on three basic principles:

  • Benefits realization over metrics limited to time, scope, and cost
  • Value for many over value of money
  • The long-term impact of our projects over their immediate results

These are worthy goals and no one needs to have me rain on their parade.  I would like to see these ethical principles, which is what they really are, incorporated into how we all conduct ourselves in business.  But then there is reality–the “is” over the “ought.”

For example, Dave and I have had some correspondence regarding the nature of the marketplace in which we operate through this blog.  Some time ago I wrote a series of posts here, here, and here providing an analysis of the markets in which we operate both in macroeconomic and microeconomic terms.

This came in response to one of my colleagues making the counterfactual assertion that we operate in a “free market” based on the concept of “private enterprise.”  Apparently, such just-so stories are lies we have to tell ourselves to make the hypocrisy of daily life bearable.  But, to bring the point home, in talking about the concept of sustainability, what concrete measures will the authors of the manifesto bring to the table to counter the financialization of American business that has occurred over the past 35 years?

For example, the news lately has been replete with stories of companies moving plants from the United States to Mexico.  This despite rising and record corporate profits during a period of stagnating median working class incomes.  Free trade and globalization have been cited as the cause, but this involves more hand waving and the invocation of mantras, rather than analysis.  There have also been the predictable invocations of the Ayn Randian cult and the pseudoscience* of Social Darwinism.  Those on the opposite side of the debate characterize things as a morality play, with the public good versus greed being the main issue.  All of these explanations miss their mark, some more than others.

An article setting aside a few myths was recently published by Jonathan Rothwell at Brookings–it came to me via Mark Thoma’s blog–entitled “Make elites compete: Why the 1% earn so much and what to do about it”.  Rothwell looks at the relative gains of the market over the last 40 years and finds that corporate profits, while doing well, have not been the driver of inequality that Robert Reich and other economists would have it be.  In looking at another myth that has been promulgated by Greg Mankiw, he finds that the reward for one’s labors is not related to any special intelligence or skill.  On the contrary, one’s entry into the 1% is actually related to the industry one chooses to enter, regardless of all other factors.  This disparity is known as a “pay premium”.  As expected, petroleum and coal products, financial instruments, financial institutions, and lawyers are at the top of the pay premium.  What is not, against all expectations of popular culture and popular economic writing, is the IT industry–hardware, software, etc.  Though they are the poster children of new technology, Bill Gates, Mark Zuckerberg, and others are the exception to the rule in an industry that is marked by a 90% failure rate.  Our most educated and talented people–those in science, engineering, the arts, and academia–are poorly paid, with negative pay premiums associated with their vocations.

The financialization of the economy is not a new or unnoticed phenomenon.  Kevin Phillips, in Wealth and Democracy, which was written in 2003, noted this trend.  There have been others.  What has not happened as a result is a national discussion on what to do about it, particularly in defining the term “sustainability”.

For those of us who have worked in the acquisition community, the practical impact of financialization and de-industrialization has made logistics challenging, to say the least.  As a young contract negotiator and Navy Contracting Officer, I was challenged to support the fleet when any kind of fabrication or production was involved, especially for non-stocked machined spares of any significant complexity or size.  Oftentimes my search would find that the company that manufactured the items was out of business, its pieces sold off during Chapter 11, and what production work remained for those items done seasonally out of country.  My “out” at the time–during the height of the Cold War–was to take the technical specs, which were paid for and therefore owned by the government, to one of the Navy industrial activities for fabrication and production.  The skillset for such work was still fairly widespread, supported by the quality control provided by a fairly well-unionized and trade-based workforce–especially among machinists and other skilled workers.

Given the new and unique ways judges and lawyers have applied privatized IP law to items financed by the public, such opportunities to support our public institutions and infrastructure, as I once could, have been largely closed out.  Furthermore, the places to send such work, where possible, have also become vanishingly few.  Perhaps digital printing will be the savior for manufacturing that it is touted to be.  What it will not do is stitch back the social fabric that has been ripped apart in communities hollowed out by the loss of their economic base, which, when replaced, comes with lowered expectations and quality of life–and often shortened lives.

In the end, though, such “fixes” benefit a shrinking few individuals at the expense of the democratic enterprise.  Capitalism did not exist when the country was formed, despite the assertion of polemicists who link the economic system to our democratic government.  Smith did not write his pre-modern scientific tract until 1776, much of what it implied lay years off into the future, and its relevance, given what we’ve learned over the last 240 years about human nature and our world, is up for debate.  What was not part of such a discussion back then–and would not have been understood–was the concept of sustainability.  Sustainability in the study of healthy ecosystems usually involves the maintenance of great diversity and the flourishing of life that denotes health.  This is science.  Economics, despite Keynes and others, is still largely rooted in 18th and 19th century pseudoscience.

I know of no fix or commitment to a sustainability manifesto that includes global, environmental, and social sustainability that makes this possible short of a major intellectual, social or political movement willing to make a long-term commitment to incremental, achievable goals toward that ultimate end.  Otherwise it’s just the mental equivalent to camping out in Zuccotti Park.  The anger we note around us during this election year of 2016 (our year of discontent) is a natural human reaction to the end of an idea, which has outlived its explanatory power and, therefore, its usefulness.  Which way shall we lurch?

The Sustainability Manifesto for Projects, then, is a modest proposal.  It may also simply be a sign of the times, albeit a rational one.  As such, it leaves open a lot of questions, and most of these questions cannot be addressed or determined by the people at whom it is targeted: project managers, who are usually simply employees of a larger enterprise.  People behave as they are treated–they respond to the incentives and disincentives presented to them, oftentimes without those being completely apparent on the conscious level.  Thus, I’m not sure if this manifesto hits its mark, or even the right one.

*This term is often misunderstood by non-scientists.  Pseudoscience means non-science, just as alternative medicine means non-medicine.  If any of the various hypotheses of pseudoscience are found true, given proper vetting and methodology, that proposition would simply be called science.  Just as alternative methods of treatment, if found effective and consistent, given proper controls, would simply be called medicine.

For the Weekend: Music, Data, and Florence + The Machine

Saturdays–and some Sundays–have usually been set aside for music as an interlude from all things data, information technology, and my work in general.  Admittedly, blogging has suffered because of the demands of work and, you know, having a life, especially with family.  But flying back from a series of important meetings that will, no doubt, make up for the lack of blogging in the near future, I settled in finally to listen to Ms. Welch’s latest.

As a fan from the beginning, I was not impressed with the early singles released from her album, How Big, How Blue, How Beautiful.  My reaction to the title song could be summed up in a single syllable: “meh.”  Same for the song “What Kind of Man,” which, apparently grasping for some kind of significance, I found inarticulate at best and largely muddled.  The message in this case, at least for me, didn’t save the medium.

So I kicked back on the plane after another 12 hour (or so) day, intent on not giving up on her artistry.  I listened to the album mostly with eyes closed, but with occasional forays into checking out the beautiful moonlit dome of the sky while traveling over the eastern seaboard, the glittering lights of the houses and towns 35,000 feet below.  (A series of “Supermoon” events is happening.)  About four songs in I found myself taken in by what can only be described as another strong song cycle, one that possesses more subtlety and maturity than the bang-on pyrotechnics of Ceremonials.

The red-headed Celtic goddess can still drive a tune and a theme which–as I learned having experienced one of her concerts in the desert of New Mexico under a cloudless night sky with the expanse of the Milky Way overhead–can become both transcendent and almost spooky, especially as her acolytes dance and sway in the trance state induced by her music.  Thus, I have come to realize that releasing any of her songs from this album on their own is largely a mistake, because they cannot hold up as “singles” in the American tradition of Tin Pan Alley–nor even as prog rock.  Listening to the entire album from start to finish gives you the perspective you need to assess its artistic merit.

For me, her lyrics and themes hark back and forth across the dimension of human experience, tying them together and, thus, fusing time in the process, opening up pathways in the mind to an almost elemental suggestion of the essence of existence which is communicated through the beat and expanse of the music.

Therefore, rather than posting a sample from YouTube, which I usually do at this point, I instead strongly recommend that you give the album a listen.  It’ll keep the band in business making more beautiful music as well.

Before I am accused by some readers of going off the deep end in exhaustion or overstatement in describing the effect of Ms. Welch’s music on me, I would note that there is a scientific basis for it.  Many other writers and artists have noted the power of music, without the need for other stimuli, to have this same effect on them, as documented by the late neuroscientist Oliver Sacks.

Proust used music to delve into his inner consciousness to inform his writings.  Tolstoy was so taken by music that he was careful about when and what he listened to, since when he immersed himself in it he felt himself taken to an altered mental state.  Clinical experience documents that many Parkinson’s and Tourette’s patients are affected–and sometimes coerced–by the power of music into involuntary states.  On the darker side of human experience, it is no coincidence that music is used by oppressive regimes and militaries to coerce, and sometimes manipulate, prisoners and captives.  On the positive side, in my own experience I was able to come to a mathematical solution to a problem in one afternoon by immersing myself fully in John Coltrane’s “A Love Supreme.”

Aside from being an aural experience that stimulates neurobiological systems, underlying music is mathematics, and underlying the mathematics are digital packets of information.  We live in a digital world.  (And–yes–Madonna is a digital girl.)  No doubt the larger implications of this view are somewhat controversial (though compelling) in the scientific community, with the questions surrounding it falling under the discipline of digital physics.

But if we view music as information (which at many levels it is) and our minds as the decoders, then the images and states of consciousness that we enter are implicit in the message, with bias introduced by our conscious minds in attempting to provide both structure and coherence.  It is the same with any data.  We can listen to a single song, but find ourselves placing undue emphasis on just one small aspect of the whole, missing out on what is significant.

Our own digital systems approaches are often similar.  When we concentrate on a sliver of information we bias our perspectives.  We see this all the time in business systems and project management.  Sometimes you just have to listen to the whole album, or step up to bigger data.

Note:  The post has been edited from the original to correct grammatical errors and for clarity.

 

Sunday Contemplation — Finding Wisdom — A General Theory of Love

When I first wrote about the book A General Theory of Love, by Thomas Lewis, Fari Amini, and Richard Lannon, I said that it was an important book in the category of general psychology and human development.  While my comments for this post reprise some of my earlier observations, I think it is worthwhile to revisit and expand upon them.

Human psychology and social psychology have been rife with pseudo-scientific methods and explanations.  In many cases ideology and just plain societal prejudice have also played a role.  In this work the authors effectively eviscerate the pre-scientific approach to understanding human behavior and mental health.  They posit that an understanding of the physical structure of the brain, and the relationship and interplay of the environment with it, is necessary to understanding the manifestation of behaviors found in our species.  In outlining the science of the brain’s structure, the authors effectively undermine the approach that holds that the human mind and our emotional lives are self-contained.

According to Thomas Lewis, “the book describes the nature of 3 fundamental neurophysiologic processes that create and govern love: limbic resonance, the wordless and nearly instantaneous emotional attunement that allows us to sense each other’s feeling states; limbic regulation, the modulation and control of our physiology by our relationships; and limbic revision, the manner in which relationships alter the very structure of our brains. those whom we love, as our book describes, change who we are, and who we can become.”

This concept is not without its own limitations.  In the book the authors discuss the concept of the triune brain–that is, the portions of the brain derived from our evolutionary ancestors, from the reptilian complex, through the limbic system (paleo-mammalian), to the neocortex (neo-mammalian).  This model is effective for generalization, but it has not been completely accepted in neuroscience as accurate.  Also, the identification of what constitutes the limbus is a shifting science, as is the evolutionary theory of the brain.  But one would expect such contingency in a scientific field only now garnering results.  What this shows is that we have been amazingly ignorant of the most important part of our anatomy–the part that explains what we are, how our personalities and emotional lives are formed, and how those needs create the society in which we live.

Rather than being individuals disconnected from those around us, what the book demonstrates is that the present state of psychiatry and neuroscience clearly shows that we are indelibly connected to those around us.  This includes not only family, but also our environments (both pre- and post-natal) and society.  Given that we are in the midst of a new renaissance in the sciences, the ambition of a “general theory” is a bit premature.

But what the authors have done is provide a strong hypothesis that is proving itself out in experimental and evolutionary biology and neuroscience: that we are social animals, that we have a strong and essential need for love and support early in our development, that our relationships and environment mold the structures of the brain, that emotional regulation is important throughout our lives, and that we are connected to each other in both intuitive and overt ways that make us what we are individually and societally.

They also suggest, knowing the psychological needs of human flourishing, that the materialism and dispersion of modern society have contributed to the pathologies and neuroses we see today: anxiety, depression, and narcissism, among others.  That this understanding is not merely academic–that understanding and applying this knowledge in solving human problems is also existential–is the challenge of our own time.

Out of Winter Woodshedding — Thinking about Project Risk and passing the “So What?” test

“Woodshedding” is a slang term in music, particularly in relation to jazz, in which the musician practices on an instrument, usually outside of public performance, the purpose of which is to explore new musical insights without critical judgment.  This can be done with or without the participation of other musicians.  Much attention recently has been given to Bob Dylan’s Basement Tapes release, for example.  It is unusual to bother recording such music, given the purpose of improvisation and exploration, and so few additional examples of “basement tapes” exist from other notable artists.

So for me the holiday is an opportunity to do some woodshedding.  The next step is to vet such thoughts in informal media, such as this blog, where–unlike in white papers and professional papers, with their high standards–informal dialogue and the exchange of thoughts that are not yet fully formed and defensible are allowed.  My latest mental romps have been inspired by the movie about Alan Turing–The Imitation Game–and the British series The Bletchley Circle.  Thinking about one of the fathers of modern computing reminded me that the first use of the term “computer” referred to people.

As a matter of fact, though the terminology now refers to the digital devices that have insinuated themselves into every part of our lives, people continue to act as computers.  Despite fantastical fears surrounding AI taking our jobs and taking over the world, we are far from the singularity.  Our digital devices can only be programmed to go so far.  The so-called heuristics in computing today are still hard-wired functions, similar to replicating the methods used by a good con artist in “reading” the audience or the mark.  With the new technology for dealing with big data we have the ability to apply many of the methods originated by the people in the real-life Bletchley Park of the Second World War.  Still, even with refinements and advances in the math, they provide great external information regarding the patterns and probable actions of the objects of the data, but very little insight into the internal cause-and-effect that creates the data, which still requires human intervention, computation, empathy, and insight.

Thus, my latest woodshedding has involved thinking about project risk.  The reason for this is the emphasis recently on the use of simulated Monte Carlo analysis in project management, usually focused on the time-phased schedule.  Cost is also sometimes included in this discussion as a function of resources assigned to the time-phased plan, though the fatal error in this approach is to fail to understand that technical achievement and financial value analysis are separate functions that require a bit more computation.

It is useful to understand the original purpose of simulated Monte Carlo analysis.  Nobel physicist Murray Gell-Mann, while working at RAND Corporation (“Research and No Development”), applied the method with a team of other physicists (Jess Marcum and Keith Breuckner) to determine the probability of a number coming up from a set of seemingly random numbers.  For a full rendering of the theory and its proof, Gell-Mann provides a good overview in his book The Quark and the Jaguar.  The insight derived from Monte Carlo computation has been to show that systems in the universe often organize themselves into patterns.  Instead of some event being probable merely by chance, we find that, given all of the events that have occurred to date, there is some determinism which will yield regularities that can be tracked and predicted.  Thus, the use of simulated Monte Carlo analysis in our nether world of project management, which inhabits that void between microeconomics and business economics, provides us with some transient predictive probabilities, given the information stream at that particular time, of the risks that have manifested and are influencing the project.
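To make the project-management use concrete, here is a minimal Monte Carlo sketch (the tasks, their three-point duration estimates, and the simple serial network are all invented for the example) that samples a schedule many times and reports percentile outcomes rather than a single-point estimate:

```python
import random
import statistics

# Hypothetical serial tasks: (optimistic, most likely, pessimistic) durations in days.
tasks = {
    "design":      (10, 15, 30),
    "fabrication": (20, 25, 45),
    "integration": (5, 10, 25),
    "test":        (10, 12, 30),
}

def simulate_once() -> float:
    """One trial: sample each task from a triangular distribution and sum the path."""
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in tasks.values())

trials = sorted(simulate_once() for _ in range(10_000))

p50 = trials[len(trials) // 2]
p80 = trials[int(len(trials) * 0.80)]
print(f"Median (P50) duration: {p50:.1f} days")
print(f"P80 duration:          {p80:.1f} days")
print(f"Mean duration:         {statistics.mean(trials):.1f} days")
```

The percentiles are exactly the kind of transient, information-dependent probabilities described above: rerun the simulation with next month’s actuals and revised estimates, and the distribution shifts.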

What the use of Monte Carlo and other such methods of identifying regularities does not do is determine cause-and-effect.  We attempt to bridge this deficiency with qualitative risk, in which we articulate risk factors to handle that are then tied to cost and schedule artifacts.  This is good as far as it goes.  But it seems that we have some of this backward.  Oftentimes, despite the application of these systems to project management, we still fail to overcome the risks inherent in the project, which then require a redefinition of project goals.  We often attribute these failures to personnel systems, and there is no shortage of consultants all too willing to sell the latest secret answer to project success.  Yet, despite years of such consulting methods applied to many of the same organizations, there is still a fairly consistent rate of failure in properly identifying cause-and-effect.

Cause-and-effect is the purpose of all of our metrics.  Only by properly “computing” cause-and-effect will we pass the “So What?” test.  Our first forays into this area involve modeling.  Given enough data we can model our systems, and when the real-time results of our experiments play out to approximate what actually happens, then we know that our models are true.  Both economists and physicists (well, the best ones) use the modeling method.  This allows us to get the answer even without entirely understanding the question of the internal workings that lead to the final result.  As in Douglas Adams’ answer to the secret of life, the universe, and everything, where the answer is “42,” we can at least work backwards.  And oftentimes this is all we are left with, which explains the high rate of failure over time.

While I was pondering this reality I came across an article in Quanta magazine outlining the important new work of the MIT physicist Jeremy England, entitled “A New Physics Theory of Life.”  From the perspective of evolutionary biology, this pretty much shows that not only does the Second Law of Thermodynamics support the existence and evolution of life (which we’ve known as far back as Schrödinger), but it probably makes life inevitable under a host of conditions.  In relation to project management and risk, it was this passage that struck me most forcefully:

“Chris Jarzynski, now at the University of Maryland, and Gavin Crooks, now at Lawrence Berkeley National Laboratory. Jarzynski and Crooks showed that the entropy produced by a thermodynamic process, such as the cooling of a cup of coffee, corresponds to a simple ratio: the probability that the atoms will undergo that process divided by their probability of undergoing the reverse process (that is, spontaneously interacting in such a way that the coffee warms up). As entropy production increases, so does this ratio: A system’s behavior becomes more and more “irreversible.” The simple yet rigorous formula could in principle be applied to any thermodynamic process, no matter how fast or far from equilibrium. “Our understanding of far-from-equilibrium statistical mechanics greatly improved,” Grosberg said. England, who is trained in both biochemistry and physics, started his own lab at MIT two years ago and decided to apply the new knowledge of statistical physics to biology.”
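Rendered as a formula, the relation described in that passage (the Jarzynski–Crooks fluctuation result) can be written schematically as:

$$\Delta S \;=\; k_B \,\ln\!\left(\frac{P(\mathrm{forward})}{P(\mathrm{reverse})}\right)$$

That is, the more entropy a process produces, the larger the ratio of the probability of the forward path to that of its time-reverse, and hence the more “irreversible” the process becomes.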

No project is a closed system (just as the earth is not on a larger level).  The level of entropy in the system will vary by the external inputs that will change it: effort, resources, and technical expertise.  As I have written previously (and somewhat controversially), there is both chaos and determinism in our systems.  An individual or a system of individuals can adapt to the conditions in which they are placed, but only to a certain level.  The probability that an individual or system of individuals can largely overcome the risks realized to date is non-zero, but vanishingly small.  The chance that a peasant will become a president is the same.  The idea that it is possible, even if vanishingly so, keeps the class of peasants in line so that those born with privilege can continue to reassuringly pretend that their success is more than mathematics.

When we measure risk what we are measuring is the amount of entropy in the system that we need to handle, or overcome.  We do this by borrowing energy in the form of resources of some kind from other, external systems.  The conditions in which we operate may be ideal or less than ideal.

What England’s work, combined with that of his predecessors, seems to suggest is that the Second Law almost makes life inevitable except where it is impossible.  For astrophysics this makes the entire Rare Earth hypothesis a non sequitur.  That is, wherever life can develop it will develop.  The life that does develop is fit for its environment and continues to evolve as changes to the environment occur.  Thus, new forms of organization and structure are found in otherwise chaotic systems as a natural outgrowth of entropy.

Similarly, when we look at more cohesive and less complex systems, such as projects, what we find are systems that adapt and are fit for the environments in which they are conceived.  This insight is not new and has been observed for organizations using more mundane tools, such as Deming’s red bead experiment.  Scientifically, however, we now have insight into the means of determining what the limitations of success are, given the risk and entropy that has already been realized, against the resources needed to bring the project within acceptable ranges of success.  This information goes beyond simply stating the problem, leaving the computing to the person, and thus passes the “So What?” test.

Finding Wisdom — Stephen Jay Gould in “The Mismeasure of Man”

Stephen Jay Gould

Perhaps no thinker in the modern scientific community from the late 1970s into the new century pushed the boundaries of interpretation and thought regarding evolutionary biology and paleontology more significantly than Stephen Jay Gould.  An eminent scholar himself–among evolutionary biologists his technical work Ontogeny and Phylogeny (1977) is considered one of the most significant works in the field, and he is considered to be among the most important historians of science in the late 20th century–he was the foremost popularizer of science during his generation (with the possible exception of Richard Dawkins and Carl Sagan), using his position to advance scientific knowledge and critical thinking, and to attack pseudoscientific, racist, and magical thinking which misused and misrepresented scientific knowledge and methods.

His concepts of punctuated equilibrium, spandrels, and the Panglossian Paradigm pushed other evolutionary biologists in the field to rise to new heights in considering and defending their own applications of neo-Darwinian theory, prompting (sometimes heated) debate.  These ideas continue to be controversial in the evolutionary community with, it seems, most of the objections being based on the fear that they will be misused by non-scientists against evolution itself, and it is true that creationists and other pseudoscientists–aided and abetted by the scientifically illiterate popular press–misrepresented the so-called “Darwin Wars” as being more significant than they really were.  But many of his ideas were reconciled and resolved into a new synthesis within the science of evolution.  Thus, his insights, based as they were in the scientific method and within proven theory, epitomized the very subject that he popularized–that nothing is ever completely settled in science, that all areas of human understanding are open to inquiry and–perhaps–revision, even if slight.

Having established himself as a preeminent science historian, science popularizer, scholar in several fields, and occasional iconoclast, Gould focused his attention on an area that well into the late 20th century was rife with ideology, prejudice, and pseudoscience–the issue of human intelligence and its measurement.  As Darwin learned over a hundred years before, it is one thing to propose that natural selection is the agent of evolution; it is another to then demonstrate that the human species descended from other primate ancestors, and to describe the manner in which sexual selection plays a role in human evolution: for some well-entrenched societal interests and specialists it is one step too far.  Gould’s work was attacked, but it has withstood these attacks and criticisms, and stands as a shining example of using critical thinking and analytical skills to strike down an artifact of popular culture and bad social science.

In The Mismeasure of Man, Gould begins his work by surveying the first scientific efforts at understanding human intelligence by researchers such as Louis Agassiz and Paul Broca, among others, who studied human capabilities through the now-defunct science of craniometry.  I was reminded, when I first picked up the book, of Carl Sagan’s collection of writings in the 1979 book Broca’s Brain, in which some of the same observations are made.  What Gould demonstrates is that the racial and sexual selection bias of the archetypes chosen by the researchers, in particular Samuel George Morton (1799-1851), provided them with the answers they wanted to find–their methodology was biased, and therefore invalid, from the start.  In particular, the use of differences in the skulls of Caucasians (of a particular portion of Europe), Black people (without differentiating ethnic or geographical differences), and Mongolians (Asian peoples without differentiation) to identify different human “species” lacked rigor and was biased in its definitions from the outset.

To be fair, a peer-reviewed paper challenged Gould’s assertion that Morton may have fudged his findings on cranial measurements–Morton used bird seed (or iron pellets, depending on the source) as the basis for measurement–and found, in a sample of some of the same skulls (combined with a survey from 1988), that Morton’s measures were largely accurate.  The research, however, was unable to undermine the remainder of Gould’s thesis while attempting to resurrect the integrity of Morton in light of his own, largely pre-scientific, time.  I can understand the point made by Gould’s critics regarding Morton: it is not necessarily constructive to apply modern methodological standards–or imply dishonesty–to those early pioneers whose work has led to modern scientific understanding.  But as an historian I also understand that when reading Gibbon on the Roman Empire we learn a great deal about the prejudices of 18th century British society–perhaps more than we learn of the Romans.  Gibbon and Morton, as with most people, were not consciously aware of their own biases–or that they were biases.  This is the reason for modern research and methodological standards in academic fields–and why human understanding is always “revisionist,” to use a supposed pejorative that I heard used by one particularly ignorant individual several years ago.  Gibbon showed the way of approaching and writing about history.  His work would not pass editorial review today, but the reason he is so valued is that he is right in many of his observations and theses.  The same cannot be said for Morton, who seemed motivated by the politics of justifying black slavery, which is why Gould treats him so roughly, particularly given that some of Morton’s ideas still find comfort in many places in our own time.  In light of subsequent research, especially the Human Genome Project, Gould proves out right–which, after all, is the measure that counts.

But that is just the appetizer.  Gould then takes on the basis of IQ (intelligence quotient), the g factor (the general intelligence factor), and the heritability of intelligence as used to imply human determinism, especially as generalized among groups.  He traces the original application of IQ tests developed by Alfred Binet and Theodore Simon to the introduction of universal education in France and the need to identify children with learning disabilities and those who required remediation by grade and age group.  He then surveys how psychologist Lewis Terman of Stanford modified the test and transformed its purpose in an attempt to find an objective basis for determining human intelligence.  In critiquing this transformation Gould provides examples of the more obviously (to modern eyes) biased questions on the test, and then effectively destroys the statistical basis for using the test’s correlations to determine any objective measure of g.  He demonstrates that the correlations used by the psychological profession to establish “g” are both statistically and logically questionable, and that they commit the logical fallacy of reification–that is, they take an abstract measure and imbue it, as if it were an actual physical entity or “thing,” with a significance it cannot possess.
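To see why the reification charge bites, consider a toy sketch (entirely synthetic data with invented loadings–nothing here comes from Gould or from any real test battery) in which a handful of positively correlated scores is generated and a single dominant factor is then extracted from them.  A large first factor always “emerges” and “explains” much of the variance whenever the scores are positively correlated, whatever their underlying causes; the number is a summary of a correlation matrix, not evidence of a physical entity in the head:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic "test battery": five scores that share a common influence plus noise.
n_people = 1_000
common = rng.normal(size=n_people)               # shared influence (schooling, health, test-taking, etc.)
loadings = np.array([0.6, 0.5, 0.7, 0.4, 0.55])  # invented weights
scores = np.outer(common, loadings) + rng.normal(scale=0.8, size=(n_people, 5))

# Extract the first principal component of the correlation matrix.
corr = np.corrcoef(scores, rowvar=False)
eigenvalues, _ = np.linalg.eigh(corr)            # ascending order; last is the largest
share = eigenvalues[-1] / eigenvalues.sum()

print(f"First factor 'explains' {share:.0%} of the variance")
# The dominant factor appears for any positively correlated battery, whatever the
# underlying causes; it is a statistical summary, not a thing located in the brain.
```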

Gould demonstrates that the variability of the measurements within groups, the clustering of results within the tests that identify distinct aptitudes, and the variability of results across time for the same individuals given changes in material condition, education, and emotional maturity, render “g” an insignificant measure.  The coup de grace in the original edition–a trope still often pulled out as the last resort by defenders of human determinism and IQ–is Gould’s analysis of the work of Cyril Burt, the oft-cited twin-studies researcher who published fraudulent work asserting that IQ was highly heritable and not affected by environment.  That we still hear endless pontificating in the “nature vs. nurture” debates, and that Stanford-Binet and other tests are still used as a basis for determining a measure of “intelligence,” owes more to societal bias, and to the still pseudo-scientific methodologies of much of the psychological profession, than to scientific and intellectual honesty.

The core of Gould’s critique is to effectively discredit the concept of biological determinism which he defines as “the abstraction of intelligence as a single entity, its location within the brain, its quantification as one number for each individual, and the use of these numbers to rank people in a single series of worthiness, invariably to find that oppressed and disadvantaged groups—races, classes, or sexes—are innately inferior and deserve their status.”

What Stephen Jay Gould demonstrates in The Mismeasure of Man most significantly then, I think, is that human beings–particularly those with wealth, power, and influence, or who are part of a societally favored group–demonstrate an overwhelming desire to differentiate themselves from others and will go to great lengths to do so to their own advantage.  This desire includes the misuse of science, whatever the cost to truth or integrity, in order to demonstrate that there is an organic or heritable basis for their favored position relative to others in society when, in reality, there are more complex–and perhaps more base and remedial–reasons.  Gould shows how public policy, educational focus, and discriminatory practices were influenced by the tests of immigrant and minority groups to deny them access to many of the benefits of the economic system and society.  Ideology was the driving factor in the application of these standardized tests, which served the purposes of societal and economic elites to convince disenfranchised groups that they deserved their inferior status.  The label of “science” provided these tainted judgments with just the right tinge of respectability that they needed to overcome skepticism and opposition.

A few years after the publication of Gould’s work, a new example of the last phenomenon described above emerged with the publication of the notorious The Bell Curve (1994) by Richard Herrnstein and Charles Murray–the poster child of the tradition, harking back to Herbert Spencer’s Social Statics, of elites funding self-serving pseudo-science and–another word will not do–bullshit.  While Spencer could be forgiven his errors given his time and scientific limitations, Herrnstein and Murray, who have little excuse, used often contradictory statistical methods, resting on weak correlation (let alone causation), to make a race-based argument for biological determinism.  Once again, Gould, in the 1996 revision to his original work, dealt with these fallacies directly, demonstrating in detail the methodological errors in their work and the overreach inherent in their enterprise–another sad example of bias misusing knowledge as the intellectual basis to oppress other people and, perhaps more egregiously, to avoid coming to terms with the disastrous actions that American society has taken against one specific group of people because of the trivial difference of skin color.

With the yeoman work of Stephen Jay Gould to discredit pseudo-scientific ideas and the misuse of statistical methodology to pigeonhole and classify people–to misuse socio-biology and advance self-serving theories of human determinism–the world has been provided an example that even the best-financed and most entrenched elites cannot stop the advance of knowledge and information.  They will try–using more sophisticated methods of disinformation and advertising–but over time those efforts will be defeated.  It will happen because scientific projects like the Human Genome Project have already demonstrated that there is only one race–the human race–and that we are all tied together by common ancestors.  The advantages that we realize over each other at any point in time are ephemeral.  The knowledge regarding variability in the human species acknowledges differences in the heritable characteristics of individuals, but that knowledge implies nothing about our relative worth to one another, nor is it a moral judgment rendered from higher authority that justifies derision, stigma, ridicule, discrimination, or reduced circumstances.  It will happen because in our new age information, once transmitted, cannot be retracted–it is out there forever.  There is much wisdom here.  It is up to each of us to recognize it, and to inform our actions as a result of it.