Both Sides Now — The Value of Data Exploration

Over the last several months I have authored a number of stillborn articles that just did not live up to the standards that I set for this blog site. After all, sometimes we just have nothing important to add to the conversation. In a world dominated by narcissism, it is not necessary to constantly have something to say. Some reflection and consideration are necessary, especially if one is to be as succinct as possible.

A quote ascribed to Woodrow Wilson, which may be apocryphal though it does appear in two of his biographies, came in response to praise for his short, succinct, and informative speeches. When asked how he was able to do this, President Wilson is supposed to have replied:

“It depends. If I am to speak ten minutes, I need a week for preparation; if fifteen minutes, three days; if half an hour, two days; if an hour, I am ready now.”

An undisciplined mind has a lot to say about nothing in particular, with varying degrees of fidelity to fact or truth. In normal conversation we most often free ourselves from the discipline expected of more rigorous thinking. This is not necessarily a bad thing if we are saying nothing of consequence, and there are gradations, of course. Even the most disciplined mind gets things wrong. We all need editors and fact checkers.

While I am pulling forth possibly apocryphal quotes, the most applicable one that comes to mind is a comment by Hemingway, as recorded by Arnold Samuelson, his deckhand in Key West and Cuba. Hemingway was supposed to have given this advice to the aspiring writer:

“Don’t get discouraged because there’s a lot of mechanical work to writing. There is, and you can’t get out of it. I rewrote the first part of A Farewell to Arms at least fifty times. You’ve got to work it over. The first draft of anything is shit. When you first start to write you get all the kick and the reader gets none, but after you learn to work it’s your object to convey everything to the reader so that he remembers it not as a story he had read but something that happened to himself.”

Though it deals with fiction, Hemingway’s advice applies to any sort of writing and rhetoric. Dr. Roger Spiller, who more than anyone mentored me as a writer and historian, once told me, “Writing is one of those skills that, with greater knowledge, becomes harder rather than easier.”

Some reflection over the last few months caused me to revisit the reason for the blog. Its purpose remains the same: it is a way to validate ideas and hypotheses with other professionals and interested amateurs in my areas of interest. I try to keep uninformed opinion in check, as all too many blogs turn out to be rants. Thus, a great deal of research goes into each of these posts, most of it from primary sources and from interactions with practitioners in the field. Opinions and conclusions are my own, my reasoning, for good or bad, is exposed for all the world to see, and I take responsibility for it.

This being said, part of my recent silence has also been due to my workload: the effort involved in my day job of running a technology company, and my recent role, since late last summer, as Managing Editor of the College of Performance Management’s publication, the Measurable News. Our emphasis in the latter case has been to find new contributions to the literature regarding business analytics and to define the concept of integrated project, program, and portfolio management. Stepping slightly over the line to make a pitch, I encourage anyone interested in contributing to the publication to submit an article. The submission guidelines can be found here.

Both Sides Now: New Perspectives

That out of the way, I recently saw, again on the small screen, the largely underrated movie about Neil Armstrong and the Apollo 11 moon landing, “First Man”, and was struck by this scene:

Unfortunately, the first part of the interview has been edited out of this clip and I cannot find the full scene. When asked “why space,” Armstrong prefaces his comments by stating that the atmosphere of the earth seems very large from the perspective of the ground but that, having touched the edge of space in his experience as a test pilot of the X-15, he learned that it is actually very thin. He then goes on to posit that looking at the earth from space will give us a new perspective. His conclusion to this observation is provided in the clip.

Armstrong’s words were prophetic in that the space program provided a new perspective and a new way of looking at things that were in front of us the whole time. Our spaceship Earth is a blue dot in a sea of space and, at least for a time, the people of our planet came to understand both our loneliness in space and our interdependence.

Earth from Apollo 8. Photo courtesy of NASA.

 

The impact of the Apollo program resulted in great strides in environmental and planetary sciences, geology, cosmology, biology, meteorology, and in day-to-day technology. The immediate effect was to inspire the environmental and human rights movements, among others. All of these advances taken together represent a new revolution in thought equal to that of the original Enlightenment, one that is not yet finished despite the headwinds of reaction and recidivism.

It’s Life’s Illusions I Recall: Epistemology–Looking at and Engaging with the World

In his book Darwin’s Dangerous Idea, Daniel Dennett posited that what was “dangerous” about Darwinism is that it acts as a “universal acid” that, when touching other concepts and traditions, transforms them in ways that change our world-view. I have accepted Dennett’s position on the strength of the convincing argument he makes and the evidence in front of us, and it is true that Darwinism–the insight that species evolve over time through natural selection–has transformed our perspective of the world and left the old ways of looking at things both reconstructed and unrecognizable.

In his work Time’s Arrow, Time’s Cycle, Stephen Jay Gould noted that Darwinism is one of the three great reconstructions of human thought through which, quoting Sigmund Freud, “Humanity…has had to endure from the hand of science…outrages upon its naive self-love.” These outrages include the Copernican revolution that removed the Earth from the center of the universe, Darwinism and the origin of species, including the descent of humanity, and what John McPhee coined as the concept of “deep time.”

But–and there is a “but”–I would propose that Darwinism and the other great reconstructions noted above are but different ingredients of a larger and broader, though compatible, innovation in the way the world is viewed and approached–a more powerful universal acid. That innovation in thought is empiricism.

It is this approach to understanding that eats through the many ills of human existence that lead to self-delusion and folly. Though you may not know it, if you are in the field of information technology or any of the sciences, you are part of this way of viewing and interacting with the world. Married with rational thinking, this epistemology–which came down to us from Charles Sanders Peirce’s astronomical observations of planets and other heavenly bodies, with further refinements by William James, John Dewey, and others–is known as Pragmatism. (Note that the word pragmatism in this context is not the same as the more colloquial use of the word. For this reason Peirce preferred the term “pragmaticism.”) For an interesting and popular account of the development of modern thought and of Pragmatism written for the general reader, I highly recommend the Pulitzer Prize-winning The Metaphysical Club by Louis Menand.

At the core of this form of empiricism is the idea that the collection of data–that is, recording, observing, and documenting the universe and nature as they are–will lead us to an understanding of things that we otherwise would not see. In our more mundane systems, such as business systems and organized efforts applying disciplined project and program management techniques and methods, we can likewise learn more about these complex adaptive systems through the enhanced collection and translation of data.

I Really Don’t Know Clouds At All: Data, Information, Intelligence, and Knowledge

The term “knowledge discovery in data”, or KDD for short, is an aspirational goal and so, in terms of understanding that goal, is a point of departure from the practice of information management and science. I’m taking this stance because the technology industry uses terminology that, as with most language, was originally designed to accurately describe a specific phenomenon or set of methods in order to advance knowledge, only to find that the terminology has been watered down to the point where it obfuscates the issues at hand.

As I traveled to locations across the U.S. over the last three months, I found general agreement on this state of affairs among IT professionals who are dealing with the issues of “Big Data”, data integration, and the aforementioned KDD. In almost every case there is hesitation to use this terminology because it has been co-opted and abused by the mainstream literature, much as physicists rail against the misuse of the concept of relativity in non-scientific domains.

This confusion in terminology has caused organizations to make decisions in which these terms are employed to describe a nebulous end-state, without the initiators having any real idea of the effort or scope involved. The danger here, of course, is that for every small innovative company out there, there is also a potential Theranos (probably several). For an in-depth understanding of the psychology and double-speak that has infiltrated our industry I highly recommend the HBO documentary, “The Inventor: Out for Blood in Silicon Valley.”

The reason why semantics are important (as they always have been despite the fact that you may have had an associate complain about “only semantics”) is that they describe the world in front of us. If we cloud the meanings of words and the use of language, it undermines the basis of common understanding and reveals the (poor) quality of our thinking. As Dr. Spiller noted, the paradox of writing and in gathering knowledge is that the more you know, the more you realize you do not know, and the harder writing and communicating knowledge becomes, though we must make the effort nonetheless.

Thus KDD is oftentimes not quite the discovery of knowledge in the sense the term was intended to convey. It is, instead, a discovery of associations that may lead us to knowledge. This distinction is important because the corollary processes of data mining, machine learning, and the early applications of AI in which we find ourselves really amount to finding associations, correlations, trends, patterns, and probabilities in data that is approached as if all information were flat, thereby obliterating its context. This is not knowledge.
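
To make the point concrete, here is a small sketch (not drawn from the text, and using invented numbers) of how approaching data as if it were flat can mislead: within each of two contexts the measured quantities move in opposite directions, but pooling the records and discarding the grouping context yields the opposite association (the familiar Simpson's paradox).

```python
# One concrete illustration (invented numbers) of how "flat" data misleads:
# within each context the relationship is negative, but pooling the records
# and ignoring the grouping context produces a strongly positive association.

from statistics import mean

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Two hypothetical contexts (say, two different programs or departments)
group_a = ([1, 2, 3], [3, 2, 1])        # negative relationship within context
group_b = ([11, 12, 13], [13, 12, 11])  # negative relationship within context

pooled_x = group_a[0] + group_b[0]
pooled_y = group_a[1] + group_b[1]

print(f"Within group A:  r = {pearson(*group_a):+.2f}")              # -1.00
print(f"Within group B:  r = {pearson(*group_b):+.2f}")              # -1.00
print(f"Context removed: r = {pearson(pooled_x, pooled_y):+.2f}")    # about +0.95
```

The association found in the pooled data is real as arithmetic, but it is not knowledge of what is happening within either context.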

We can measure the information content of any set of data, but the real unlocked potential in that information content comes with the processing that leads to knowledge. To do that requires an underlying model of domain knowledge, an understanding of the different lexicons in any given set of domains, and a Rosetta Stone that provides a roadmap identifying those elements of each lexicon that describe the same things across domains. It also requires capturing and preserving context.
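
As an illustration of the “Rosetta Stone” idea, the sketch below maps terms from two hypothetical domain lexicons (a scheduling domain and a cost domain, with all field and domain names invented for the example) onto a shared concept while keeping each element's domain of origin, so that context travels with the data.

```python
# A minimal sketch of a lexicon "Rosetta Stone": map domain-specific terms to
# common concepts while preserving the context (domain of origin) of each
# data element. All names here are illustrative, not from any standard.

from dataclasses import dataclass

# Hypothetical crosswalk: (domain, domain term) -> common concept
LEXICON_CROSSWALK = {
    ("scheduling", "task_id"): "work_element",
    ("cost", "charge_number"): "work_element",
    ("scheduling", "finish_date"): "completion_date",
    ("cost", "period_of_performance_end"): "completion_date",
}

@dataclass
class DataElement:
    domain: str    # source domain (the context we want to preserve)
    term: str      # the domain's own name for the element
    value: object  # the recorded observation

def to_common_concept(element: DataElement):
    """Translate a domain term to the shared concept, keeping its context."""
    concept = LEXICON_CROSSWALK.get((element.domain, element.term), element.term)
    return (concept, element.value, element.domain)

if __name__ == "__main__":
    records = [
        DataElement("scheduling", "task_id", "A-100"),
        DataElement("cost", "charge_number", "A-100"),
    ]
    for r in records:
        print(to_common_concept(r))
    # Both records map to the same concept ("work_element") without losing
    # the fact that one came from the schedule and one from the cost data.
```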

For example, when I use the messaging app on my iPhone it attempts to anticipate what I want to write. I am given three choices of words if I want to use this shortcut. In most cases the iPhone guesses wrong, despite presenting three choices and having at its disposal (at least presumptively) a larger vocabulary than the writer. Oftentimes it seems to take control, assuming that I have misspelled or misidentified a word, and chooses the wrong one for me, so that my message becomes nonsense.

If one were to believe the hype surrounding AI, one would think that there is magic there but, as Arthur C. Clarke noted (known as Clarke’s Third Law): “Any sufficiently advanced technology is indistinguishable from magic.” Familiar with the new technologies as we are, we know that there is no magic there, and also that it is consistently wrong a good deal of the time. But many individuals come to rely upon the technology nonetheless.

Despite the gloss of something new, the long-established methods of epistemology, code-breaking, statistics, and calculus apply–as do standards of establishing fact and truth. Despite a large set of data, the iPhone is wrong because it does not understand–does not possess the knowledge–to know why it is wrong. As an aside, its dictionary is also missing a good many words.

A Segue and a Conclusion–I Still Haven’t Found What I’m Looking For: Why Data Integration?…and a Proposed Definition of the Bigness of Data

As with the question to Neil Armstrong, so with the question on data. And the answer is the same. When we look at any set of data under the particular structure of a domain, the information we derive provides us with a manner of looking at the world. In economic systems, businesses, and projects, that data provides us with a basis for interpretation, but oftentimes falls short of allowing us to effectively describe and understand what is happening.

Capturing interrelated data across domains allows us to look at the phenomena of these human systems from a different perspective, providing us with the opportunity to derive new knowledge. But in order to do this, we have to be open to this possibility. It also calls for us to, as I have hammered home in this blog, reset our definitions of what is being described.

For example, there are guides in project and program management that refer to statistical measures as “predictive analytics.” This further waters down the intent of the phrase. Measures of earned value are not predictive. They note trends and yield a single-point outcome. Absent further analysis and processing, the statistical fallacy of extrapolation can be baked into our analysis. The same applies to any index of performance.
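
For readers unfamiliar with the mechanics, the sketch below shows the kind of single-point extrapolation being cautioned against, using the common independent estimate-at-completion formula (EAC = BAC / CPI) with invented figures; nothing in it models why performance is what it is, or whether it will persist.

```python
# A minimal sketch of single-point extrapolation from an earned value index:
# the common independent estimate-at-completion (EAC = BAC / CPI) simply
# projects the cumulative cost performance index forward, assuming past
# performance continues unchanged. The figures below are invented.

def cost_performance_index(earned_value: float, actual_cost: float) -> float:
    """CPI = EV / AC: an index of past cost efficiency, not a prediction."""
    return earned_value / actual_cost

def naive_eac(budget_at_completion: float, cpi: float) -> float:
    """Single-point extrapolation: assumes the current CPI holds to the end."""
    return budget_at_completion / cpi

if __name__ == "__main__":
    ev, ac, bac = 400_000.0, 500_000.0, 1_000_000.0    # hypothetical figures
    cpi = cost_performance_index(ev, ac)                # 0.80
    print(f"CPI = {cpi:.2f}")
    print(f"Naive EAC = {naive_eac(bac, cpi):,.0f}")    # 1,250,000
    # Nothing here explains why performance is what it is, or whether the
    # conditions that produced it will persist -- the extrapolation fallacy.
```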

Furthermore, these indices and indicators–for that is all they are–do not provide knowledge, which requires a means of not only distinguishing between correlation and causation but also applying contextualization. All systems operate in a vector space. When we measure an economic or social system we are really measuring its behavior in the vector space that it inhabits. This vector space includes the way it is manifested in space-time: the equivalent of length, width, depth (that is, its relative position, significance, and size within information space), and time.
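
A deliberately simplified rendering of this metaphor, with the component names and the distance measure chosen purely for illustration, might treat each observation of a system as a vector in an information space:

```python
# An illustrative (and deliberately simplified) rendering of the metaphor
# above: each observation of a system is a vector with components for relative
# position, significance, size, and time. The component names and the distance
# measure are assumptions made for illustration only.

from dataclasses import dataclass
import math

@dataclass
class Observation:
    position: float      # relative position within the domain ("length")
    significance: float  # relative importance ("width")
    size: float          # relative magnitude ("depth")
    time: float          # when the measurement was taken

def distance(a: Observation, b: Observation) -> float:
    """Compare two observations only when they share the same context."""
    return math.sqrt(
        (a.position - b.position) ** 2
        + (a.significance - b.significance) ** 2
        + (a.size - b.size) ** 2
        + (a.time - b.time) ** 2
    )

if __name__ == "__main__":
    before = Observation(position=0.2, significance=0.5, size=0.3, time=1.0)
    after = Observation(position=0.4, significance=0.7, size=0.3, time=2.0)
    print(f"Change in the system's state: {distance(before, after):.3f}")
```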

This then provides us with a hint of a definition of what often goes by the name of “big data.” As noted in previous posts, the term was first used at NASA in 1997 by Cox and Ellsworth (not, as credited to John Mashey on Wikipedia with the dishonest qualifier “popularized”) and was simply a statement meaning “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze.”

This is a relative term given Moore’s Law. But we can begin to peel back a real definition of the “bigness” of data. It is important to do this because too many approaches to big data assume the data is flat and then apply probabilities and pattern recognition in ways that undermine both contextualization and knowledge. Thus…

The Bigness of Data (B) is a function (f ) of the entropy expended (S) to transform data into information, or to extract its information content.
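
Expressed in notation (a sketch only, since the text does not specify the functional form of f; Shannon’s entropy is offered here merely as one conventional way to quantify the information content of a data set X):

```latex
% B: the "bigness" of the data; S: the entropy expended to transform the data
% into information. The functional form f is left unspecified by the text.
\[
  B = f(S)
\]
% One conventional (assumed) measure of the information content to be extracted:
\[
  H(X) = -\sum_{x \in X} p(x)\,\log_2 p(x)
\]
```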

Information evolves. It evolves toward greater complexity just as life evolves toward greater complexity. The universe is built on coded bits of information that, taken together and combined in almost unimaginable ways, provide different forms of life and matter. Our limited ability to decode and understand this information–and our interactions within it–is important to us both individually and collectively.

Much entropy is already expended in the creation of the data that describes the activity being performed. Its context is part of its information content. Obliterating the context inherent in that information content renders all of that previously expended entropy valueless. Thus, in approaching any set of data, the inherent information content must be taken into account in order to avoid unnecessary (and erroneous) interpretations of the data.

More to follow in future posts.

Sunday Contemplation — Finding Wisdom — Daniel Dennett in “Darwin’s Dangerous Idea”

Daniel Dennett. Photo courtesy of Wikipedia.

“The Darwinian Revolution is both a scientific and a philosophical revolution, and neither revolution could have occurred without the other. As we shall see, it was the philosophical prejudices of the scientists, more than their lack of scientific evidence, that prevented them from seeing how the theory could actually work, but those philosophical prejudices that had to be overthrown were too deeply entrenched to be dislodged by mere philosophical brilliance. It took an irresistible parade of hard-won scientific facts to force thinkers to take seriously the weird new outlook that Darwin proposed…. If I were to give an award for the single best idea anyone has ever had, I’d give it to Darwin, ahead of Newton and Einstein and everyone else. In a single stroke, the idea of evolution by natural selection unifies the realm of life, meaning, and purpose with the realm of space and time, cause and effect, mechanism and physical law. But it is not just a wonderful scientific idea. It is a dangerous idea.”

Daniel Dennett (pictured above thanks to Wikipedia) is the Co-Director of the Center for Cognitive Studies and Austin B. Fletcher Professor of Philosophy at Tufts University.  He is also known as “Dawkins’ Bulldog”, for his pointed criticism of what he viewed as unnecessary revisions to Darwinian Theory by Stephen Jay Gould, who was also a previous subject of this blog, and others.  In popular culture he has also been numbered among the “Four Horsemen” of the so-called “New Atheism”.  His intellectual and academic achievements are many, and his insights into evolution, social systems, cognition, consciousness, free will, philosophy, and artificial intelligence are extremely influential.

Back in 1995, when I was a newly minted Commander in the United States Navy, I happened across an intriguing book in a Jacksonville, Florida bookshop during a temporary duty assignment.  The book was entitled Darwin’s Dangerous Idea: Evolution and the Meanings of Life.  I opened it that afternoon during a gentle early spring Florida day and found myself astounded and my mind liberated, as if chains which I had not previously noticed, but which had bound my mind, had been broken and released me, so great was the influence of the philosophical articulation of this “dangerous idea”.

Here, for the first time, was a book that took what we currently know about the biological sciences and placed it within the context of other scientific domains–and did so in a highly organized, articulate, and readable manner.  The achievement of the book was not so much in deriving new knowledge, but in presenting an exposition of the known state of the science and tracing its significance and impact–no mean achievement given the complexity of the subject matter and the depth and breadth of knowledge being covered.  The subject matter, of course, is highly controversial only because it addresses subjects that engender the most fear: the facts of human origins, development, nature, biological interconnectedness, and the inevitability of mortality.

Dennett divides his thesis into three parts: the method of developing the theory and its empirical proofs, its impact on the biological sciences, and its impact on other disciplines, especially regarding consciousness, philosophy, sociology, and morality.  He introduces and develops several concepts, virtually all of which have since become cornerstones in human inquiry, and not only among the biological sciences.

Among these are the concepts of design space, of natural selection behaving as an algorithm, of Darwinism acting as a “universal acid” that transforms the worldview of everything it touches, and of the mental concepts of skyhooks and cranes and of “just-so” stories–the skyhooks and “just-so” stories being fallacious and magical ways of thinking that have no underlying empirical foundation to explain natural phenomena.

The concept of the design space has troubled many, though not most, evolutionary biologists and physicists, only because Dennett posits a philosophical position in lieu of a mathematical one.  This does not necessarily undermine his thesis, simply because one must usually begin with a description of a thesis before one can determine whether it can be disproven.  Furthermore, Dennett is a philosopher of the analytical school, and so the scope of his work is designed from that perspective.

But there are examples that approach an analogue of design space in physics–those that visualize space-time and general relativity, as at this site.  It is not a stretch to understand that our reality–the design space that the earth inhabits among the many alternative design spaces that may exist relating to biological evolution–can eventually be mathematically formulated.  Given that our knowledge of comparative planetary and biological physics is still largely speculative and confined to cosmology, the analogy for now is sufficient and understandable.  It also gives a new cast to the concept of adaptation, away from the popular (and erroneous) concept of “survival of the fittest”, since fitness is based on the ability to adapt to environmental pressures and to find niches that may exist in that environment.  With our tracing of the effects of climate change on species, we will be witnessing firsthand the brutal concept of design space.

Going hand-in-hand with design space is the concept that Darwinian evolution through the agent of natural selection is an algorithmic process.  This understanding becomes “universal acid” that, according to Dennett, “eats through just about every traditional concept and leaves in its wake a revolutionized world-view.”

One can understand the objection of philosophers and practitioners of metaphysics to this concept, which many of them have characterized as nihilistic.  This, of course, is argument from analogy–a fallacious form of rhetoric.  The objection to the book through these arguments, regardless of the speciousness of their basis, is premature, and it is a charge to which Dennett effectively responds in his book Consciousness Explained.  It is in this volume that Dennett addresses the basis for the conscious self, “intentionality”, and the concept of free will (and its limitations)–what in the biological and complexity sciences is described as emergence.

What Dennett has done through describing the universal acid of Darwinian evolution is to describe a phenomenon: the explanatory reason for the rapid social change that we have witnessed and are witnessing, and the resulting reaction and backlash to it.  For example, the revolution engendered by the Human Genome Project not only confirmed our species’ place in the web of life on Earth and our evolutionary place among primates, but also the interconnections deriving from descent from common ancestors of the entire human species, exploding the concept of race and any claim to inherent superiority or inferiority of any cultural grouping of humans.

One can clearly see the threat this basic truth has to entrenched beliefs deriving from conservative philosophy, cultural tradition, metaphysics, religion, national borders, ethnic identity, and economic self-interest.

For it is apparent to me, given my reading not only of Dennett, but also that of both popularizers and the leading minds in the biological sciences that included Dawkins, Goodall, Margulis, Wilson, Watson, Venter, Crick, Sanger, and Gould; in physics from Hawking, Penrose, Weinberg, Guth, and Krauss; in mathematics from Wiles, Witten, and Diaconis; in astrophysics from Sandage, Sagan, and deGrasse Tyson; in climate science from Hansen and many others; and in the information sciences from Moore, Knuth, and Berners-Lee, that we are in the midst of another intellectual revolution.  This intellectual revolution far outstrips both the Renaissance and the Enlightenment as periods of human achievement and advancement, if only because of the widespread availability of education, literacy, healthcare, and technology, as well as human diversity, which both accelerates and expands many times over the impact of each increment in knowledge.

When one realizes that both of those earlier periods of scientific and intellectual advance engendered significant periods of social, political, and economic instability, upheaval, and conflict, then the reasons for many of the conflicts in our own times become clear.  It was apparent to me then–and even more apparent to me now–that there will be a great overturning of the institutional, legal, economic, social, political, and philosophic ideas and structures that now exist as a result.  We are already seeing the strains in many areas.  No doubt there are interests looking to see if they can capitalize on or exploit these new alignments.  But for those overarching power structures that exert control, conflict, backlash, and eventual resolution are inevitable.

In this way Fukuyama was wrong in the most basic sense in his thesis in The End of History and the Last Man to the extent that he misidentified ideologies as the driving force behind the future of human social organization.  What he missed in his social “science” (*) is the shift to the empirical sciences as the nexus of change.  The development of analytical philosophy (especially American Pragmatism) and more scientifically-based modeling in the social sciences are only the start, but one can make the argument that these ideas have been more influential in clearly demonstrating that history, in Fukuyama’s definition, is not over.

Among the first shots over the bow from science into the social sciences have come works from such diverse writers as Jared Diamond (Guns, Germs, and Steel: The Fates of Human Societies (1997)) and Sam Harris (The Moral Landscape: How Science Can Determine Human Values (2010)).  The next wave will, no doubt, be more intense and drive further resistance and conflict.

The imperative of science informing our other institutions is amply demonstrated by two facts.

  1. On March 11, 2016 an asteroid large enough to extinguish a good part of all life on earth came within 19,900 miles of our planet’s center.  This was not as close, however, as the one that passed on February 25 (8,900 miles).  There is no invisible shield or Goldilocks Zone to magically protect us.  The evidence of previous life-ending collisions is more apparent with each new high-resolution satellite image of our planet’s surface.  One day we will look up and see our end slowly but inevitably making its way toward us, unless we decide to take measures to prevent such a catastrophe.
  2. Despite the desire to deny that it’s happening, 2015 was the hottest year on record and 2016 thus far is surpassing it, providing further empirical evidence of the validity of global warming models.  In fact, the last four consecutive years fall within the hottest years on record (2014 was the previous hottest year).  The outlier was 2010, another previous high, which is hanging in at number 3 for now.  2013 is at number 4 and 2012 at number 8.  Note the general trend.  As Jared Diamond has convincingly demonstrated, the basis of conflict and societal collapse is usually rooted in population pressures exacerbated by resource scarcity.  We are just about to the point of no return, given the complexity of the systems involved, and can only mitigate the inevitable–but we must act now to do so.

What human civilization does not want to be is on the wrong side of history in how to deal with these challenges.  Existing human power structures and interests would like to keep the scientific community within the box of technology–and no doubt there are still scientists that are comfortable to stay within that box.

The fear regarding allowing science to move beyond the box of technology and general knowledge is of its misuse and misinterpretation, usually by non-scientists, such as through the reprehensible meme of Social Darwinism (which is neither social nor Darwinian).**  This fear is oftentimes transmitted by people with a stake in controlling the agenda or in interpreting what science has determined.  Science’s contingent nature is also a point of fear.  While few major theories are completely overturned as new knowledge is uncovered, the very nature of revision and adjustment to theory is frightening to people who depend on, at least, the illusion of continuity and hard truths.  Finally, science puts us in our place within the universe.  If there are millions of planets that can harbor some kind of life, and a sub-set of those that have the design space to allow for some kind of intelligent life (as we understand that concept), are we really so special after all?

But not only within the universe.  Within societies, if all humans have developed from a common set of ancestors, then our basic humanity is a shared one.  If the health and sustainability of an ecology is based on its biodiversity, then the implication for human societies is likewise found in diversity of thought and culture, eschewing tribalism and extreme social stratification.  If the universe is deterministic with only probability determining ultimate cause and effect, then how truly free is free will?  And what does this say about the circumstances in which each of us finds him or herself?

The question now is whether we embrace our fears, manipulated by demagogues and oligarchs, or embrace the future, before the future overwhelms and extinguishes us–and to do so in a manner that is consistent with our humanity and ethical reasoning.

 

Note:  Full disclosure.  As a senior officer concerned with questions of AI, cognition, and complex adaptive systems, I opened a short correspondence with Dr. Dennett about those subjects.  I also addressed what I viewed as his unfair criticism (being Dawkins’ Bulldog) of punctuated equilibrium, spandrels, and other minor concepts advanced by Stephen Jay Gould, offering a way that Gould’s concepts were well within Darwinian Theory, as well as being both interesting and explanatory.  Given that less complex adaptive systems that can be observed do display punctuated periods of rapid development–and also continue to have the vestiges of previous adaptations that no longer have a purpose–it seemed to me that larger systems must also do so, the punctuation being on a different time-scale, and that any adaptation cannot be precise given that biological organisms are imprecise.  He was most accommodating and patient, and this writer learned quite a bit in our short exchange.  My only regret was not to continue the conversation.  I do agree with Dr. Dennett (and others) on their criticism of non-overlapping magisteria (NOMA), as is apparent in this post.

Finding Wisdom — Stephen Jay Gould in “The Mismeasure of Man”

Stephen Jay Gould

Perhaps no thinker in the modern scientific community from the late 1970s into the new century pushed the boundaries of interpretation and thought regarding evolutionary biology and paleontology more significantly than Stephen Jay Gould.  An eminent scholar himself–among evolutionary biologists his technical work Ontogeny and Phylogeny (1977) is considered one of the most significant works in the field, and he is considered among the most important historians of science in the late 20th century–he was the foremost popularizer of science of his generation (with the possible exception of Richard Dawkins and Carl Sagan), using his position to advance scientific knowledge and critical thinking, and to attack pseudoscientific, racist, and magical thinking that misused and misrepresented scientific knowledge and methods.

His concepts of punctuated equilibrium, spandrels, and the Panglossian Paradigm pushed other evolutionary biologists in the field to rise to new heights in considering and defending their own applications of neo-Darwinian theory, prompting (sometimes heated) debate.  These ideas continue to be controversial in the evolutionary community with, it seems, most of the objections being based on the fear that they will be misused by non-scientists against evolution itself, and it is true that creationists and other pseudoscientists–aided and abetted by the scientifically illiterate popular press–misrepresented the so-called “Darwin Wars” as being more significant than they really were.  But many of his ideas were reconciled and resolved into a new synthesis within the science of evolution.  Thus, his insights, based as they were in the scientific method and within proven theory, epitomized the very subject that he popularized–that nothing is ever completely settled in science, that all areas of human understanding are open to inquiry and–perhaps–revision, even if slight.

Having established himself as a preeminent science historian, science popularizer, scholar in several fields, and occasional iconoclast, Gould focused his attention on an area that well into the late 20th century was rife with ideology, prejudice, and pseudoscience–the issue of human intelligence and its measurement.  As Darwin learned over a hundred years before, it is one thing to propose that natural selection is the agent of evolution; it is another to then demonstrate that the human species descended from other primate ancestors, and to show the manner in which sexual selection plays a role in human evolution: for some well-entrenched societal interests and specialists it is one step too far.  Gould’s work was attacked, but it has withstood these attacks and criticisms, and stands as a shining example of using critical thinking and analytical skills to strike down an artifact of popular culture and bad social science.

In The Mismeasure of Man, Gould begins his work by surveying the first scientific efforts at understanding human intelligence by researchers such as Louis Agassiz and Paul Broca, among others, who studied human capabilities through the now-defunct science of craniometry.  I was reminded, when I first picked up the book, of Carl Sagan’s collection of writings in the 1979 book, Broca’s Brain, in which some of the same observations are made.  What Gould demonstrates is that the racial and sexual bias in the selection of archetypes chosen by the researchers, in particular Samuel George Morton (1799-1851), provided them with the answers they wanted to find–that their methodology was biased, and therefore invalid, from the start.  In particular, the differentiation of the skulls of Caucasians (of a particular portion of Europe), Black people (without regard to ethnic or geographical differences), and Mongolians (Asian peoples without differentiation) into different human “species” lacked rigor and was biased in its definitions from the outset.

In order to be fair, a peer-reviewed paper challenged Gould’s assertion that Morton may have fudged his findings on cranial measurements, since the researcher used bird seed (or iron pellets, depending on the source) as the basis for measurement, and found, in a sample of some of the same skulls (combined with a survey from 1988), that Morton was largely accurate in his measures.  The research, however, was unable to undermine the remainder of Gould’s thesis while attempting to resurrect the integrity of Morton in light of his own, largely pre-scientific time.  I can understand the point made by Gould’s critics regarding Morton: that it is not necessarily constructive to apply modern methodological standards–or imply dishonesty–to those early pioneers whose work has led to modern scientific understanding.  But as an historian I also understand that when reading Gibbon on the Roman Empire we learn a great deal about the prejudices of 18th century British society–perhaps more than we learn of the Romans.  Gibbon and Morton, as with most people, were not consciously aware of their own biases–or that they were biases.  This is the reason for modern research and methodological standards in academic fields–and why human understanding is always “revisionist,” to use a supposed pejorative that I heard used by one particularly ignorant individual several years ago.  Gibbon showed the way of approaching and writing about history.  His work would not pass editorial review today, but the reason why he is so valued is that he is right in many of his observations and theses.  The same cannot be said for Morton, who seemed motivated by the politics of justifying black slavery, which is why Gould treats him so roughly, particularly given that some of Morton’s ideas still find comfort in many places in our own time.  In light of subsequent research, especially the Human Genome Project, Gould proves to be right which, after all, is the measure that counts.

But that is just the appetizer.  Gould then takes on the basis of IQ (intelligence quotient), g factor (the general intelligence factor), and the heritability of intelligence to imply human determinism, especially generalized among groups.  He traces the original application of IQ tests developed by Alfred Binet and Theodore Simon to the introduction of universal education in France and the need to identify children with learning disabilities and those who required remediation by grade and age group.  He then surveys how psychologist Lewis Terman of Stanford modified the test and transformed its purpose in order to attempt to find an objective basis for determining human intelligence.  In critiquing this transformation Gould provides examples of the more obviously (to modern eyes) biased questions on the test, and then effectively destroys the statistical basis for the correlations of the test in being able to determine any objective measure of g.  He demonstrates that the correlations established by the psychological profession to establish “g” are both statistically and logically questionable and that they commit the logical fallacy of reification–that is, they take an abstract measure and imbue it with a significance that it cannot possess as if it were an actual physical entity or “thing.”

Gould demonstrates that the variability of the measurements within groups, the clustering of results within the tests that identify distinct aptitudes, and the variability of results across time for the same individuals given changes in circumstances of material condition, education, and emotional maturity, render “g” an insignificant measure.  The coup de grace in the original edition is Gould’s analysis of the work of Cyril Burt, the oft-cited researcher of twin studies–a trope still often pulled out as the last resort by defenders of human determinism and IQ–who published fraudulent works asserting that IQ was highly heritable and not affected by environment.  That we still hear endless pontificating on “nature vs. nurture” debates, and that Stanford-Binet and other tests are still used as a basis for determining a measure of “intelligence,” owes more to societal bias, and to the still pseudo-scientific methodologies of much of the psychological profession, than to scientific and intellectual honesty.

The core of Gould’s critique is to effectively discredit the concept of biological determinism which he defines as “the abstraction of intelligence as a single entity, its location within the brain, its quantification as one number for each individual, and the use of these numbers to rank people in a single series of worthiness, invariably to find that oppressed and disadvantaged groups—races, classes, or sexes—are innately inferior and deserve their status.”

What Stephen Jay Gould demonstrates in The Mismeasure of Man most significantly then, I think, is that human beings–particularly those with wealth, power, and influence, or who are part of a societally favored group–demonstrate an overwhelming desire to differentiate themselves from others and will go to great lengths to do so to their own advantage.  This desire includes the misuse of science, whatever the cost to truth or integrity, in order to demonstrate that there is an organic or heritable basis for their favored position relative to others in society when, in reality, there are more complex–and perhaps more base and remedial–reasons.  Gould shows how public policy, educational focus, and discriminatory practices were influenced by the tests of immigrant and minority groups to deny them access to many of the benefits of the economic system and society.  Ideology was the driving factor in the application of these standardized tests, which served the purposes of societal and economic elites to convince disenfranchised groups that they deserved their inferior status.  The label of “science” provided these tainted judgments with just the right tinge of respectability that they needed to overcome skepticism and opposition.

A few years after the publication of Gould’s work, a new example of the last phenomenon described above emerged with the publication of the notorious The Bell Curve (1994) by Richard Herrnstein and Charles Murray–the poster child of the tradition, harking back to Herbert Spencer’s Social Statics, of elites funding self-serving pseudo-science and–another word will not do–bullshit.  While Spencer could be forgiven his errors given his time and scientific limitations, Herrnstein and Murray, who have little excuse, used often contradictory and poorly correlated (let alone causative) statistical methods to advance a race-based argument for biological determinism.  Once again, Gould, in the 1996 revision to his original work, dealt with these fallacies directly, demonstrating in detail the methodological errors in their work and the overreach inherent in their enterprise–another sad example of bias misusing knowledge as the intellectual basis to oppress other people and, perhaps more egregiously, to abandon coming to terms with the disastrous actions that American society has taken against one specific group of people because of the trivial difference of skin color.

With the yeoman work of Stephen Jay Gould to discredit pseudo-scientific ideas and the misuse of statistical methodology to pigeonhole and classify people–to misuse socio-biology and advance self-serving theories of human determinism–the world has been provided the example that even the best-financed and most entrenched elites cannot stop the advance of knowledge and information.  They will try–using ever more sophisticated methods of disinformation and advertising–but over time those efforts will be defeated.  It will happen because scientific projects like the Human Genome Project have already demonstrated that there is only one race–the human race–and that we are all tied together by common ancestors.  The advantages that we realize over each other at any point in time are ephemeral.  The knowledge regarding variability in the human species acknowledges differences in the heritable characteristics of individuals, but that knowledge implies nothing about our relative worth to one another, nor is it a moral judgment rendered from higher authority that justifies derision, stigma, ridicule, discrimination, or reduced circumstances.  It will happen because in our new age information, once transmitted, cannot be retracted–it is out there forever.  There is much wisdom here.  It is up to each of us to recognize it, and to inform our actions as a result of it.

Sunday Contemplation — Finding Wisdom — Albert Camus

Albert Camus

“The evil that is in the world almost always comes of ignorance, and good intentions may do as much harm as malevolence if they lack understanding.”

Albert Camus was a philosopher in the way Bertrand Russell was a philosopher.  Camus, whose fiction is among the greatest written in the 20th century, denied that he was a philosopher or that he was proposing a philosophical position.  Indeed, in reading his fiction and essays it is apparent that he places little value in modern philosophy, ideology, and religion because, ultimately, each promises a utopia that is unrealizable and that oftentimes ends in evil, even though the intentions of the proponents of those schools of thought may be good.  Out of these writings, however, he does construct an edifice for how we can live our lives in a universe that we learn is vaster and older than we ever imagined.  In this way he anticipates the current crop of scientific writers who are beginning to extend their interests to this same territory, in particular the so-called New Atheists: Sam Harris through such works as The Moral Landscape, as well as Daniel Dennett and Richard Dawkins, as in this talk:

But also other writings from various specialties such as Lewis, Amini, and Lannon in A General Theory of Love.  Or perhaps it is they who have continued his line of thought, though they may not be entirely aware of that fact.

For Camus–a member of the Resistance who lived first-hand through the fall, humiliation, and Vichy collaboration of his beloved France–life was an “absurd” proposition, since we live our mortal lives and ask ultimate questions in the face of a silent universe.  In his book-length essay The Myth of Sisyphus (1942; Eng. tr. 1955) he noted that we humans continue to ask such questions yet, like Sisyphus, find ourselves tumbling back down the hill.  Reason and deductive philosophical methods fail to answer these questions since they attempt to prove, using circular reasoning, the propositions that they assume to be true.

For me the essential wisdom to be garnered from Camus lies in the novels The Stranger (1942; Eng. trans., 1946), The Plague (1947; Eng. trans., 1948), and The Fall (1956; Eng. tr. 1957), along with the essays The Rebel (1951; Eng. tr. 1954) and the aforementioned The Myth of Sisyphus–though he hardly ever wrote anything that was not worth reading.  The wisdom derived from these works is not simply in the philosophical propositions that they explore but in their insight into the human condition.

In The Stranger, the main character Meursault, a French Algerian, describes his world in a detached and pathological manner.  He is what today we would recognize as a sociopath, a condition that may describe as many as one of every twenty-five people.  It is here that Camus explores the nature of evil.  The book opens with Meursault discussing, in a dry, almost passive voice, the death of his mother, which he learns of through a telegram.  He is asked to travel to a nursing home a distance away to make arrangements for her burial, which he does reluctantly.  He then returns home as quickly as he can to spend time with his girlfriend, for whom he expresses no feeling.  As we explore Meursault’s character we find that he does not care about anything, nor does he share empathy with his fellow human beings.  He decides eventually to kill another person as an intellectual exercise.  He wants to know: can he kill a stranger without anger?

When he is arrested for the crime Meursault barely tries to defend himself, explaining to the jurors that he feels nothing but annoyance at having to defend his actions.  As a result he is put to death for his crime.  The Stranger was first published in 1942 during the Nazi occupation of France.  It was during this time that Camus was editor of the Resistance newspaper Combat.  All around him was the horror of human cruelty given legitimacy by an invading force that killed without regret.  It is in this context that the novel’s flat tone is both shocking and intimate, given the monstrous human phenomenon it describes.  For Camus, evil is ignorant–pathology and solipsism being extreme forms of ignorance.  The character Meursault sounds much like Eichmann in his pleadings after his capture by the Israeli authorities, chronicled in Hannah Arendt‘s landmark book Eichmann in Jerusalem.  In her study of the man, Arendt posited that Eichmann was anything but an aberration; in her terminology, evil, it turns out, is banal.  In this same vein Camus’ Meursault is a very banal man, and the embodiment of his own country’s collaboration with fascism and the Holocaust, which caused people to do horrible things to their fellow human beings.

In The Plague, Camus’ masterpiece, scores of people are falling ill and dying in the Algerian city of Oran.  Despite the reality before them, the city’s leaders are unwilling to accept that it is bubonic plague.  As the disease runs out of control with fear running amok, the government finally takes action and places the city under quarantine.  The people of the city are now not only cut off from the outside world and their loved ones, but also cut off from social contact within the city.  Fear, isolation, and panic overtake the community.

As Camus develops his story, the people of Oran react in one of two ways to the plague: those who personalize the danger and regret their lives, and those who dedicate themselves to caring for the sick, despite the danger to their own health.  Among this latter group is Dr. Rieux and a few of his acquaintances.  Only after almost half of the city’s population dies does the community realize that all of them have a high probability of dying.  Accepting their own mortality, they develop a sense of unity and place the needs of the community as a whole above their own personal needs and desires.  This is a theme that Camus will revisit in later essays and literature.  Faced with the realization of one’s mortality in an indifferent universe, does one give up and die, pursue one’s own interests, or is there still another way to preserve the best of what makes us human?  Camus comes down strongly for finding such a way in the compassion, sympathy, and empathy felt among one’s fellow human beings, which speak to the needs of all of us.

In The Fall, probably Camus’ most controversial and complex novel, we follow the conversation between a former Parisian lawyer, Jean-Baptiste Clamence, and a fellow Parisian he meets in a seedy Amsterdam dockside bar named Mexico City.  The conversation is one-sided, told in the first person through the second person–an approach not unfamiliar to those who know the work of Joseph Conrad.  The story covers a period of five days in five separate locations, starting at the bar and ending in Clamence’s apartment.  Clamence describes himself as a “judge-penitent,” and it is not entirely clear what he means when his narrative begins, but it reveals itself as the story unfolds.

The novel unfolds in three main sections:  Clamence in Paris and his fall, Clamence in a prison camp during the Second World War, and Clamence’s acquisition of the painting “The Just Judges.”  Each of these sections poses a dilemma and explains Clamence’s self-description as “judge-penitent.”

In Paris, before his self-described fall, Clamence had been a well-respected lawyer.  He viewed himself as the defender of the downtrodden and actively sought out cases that bolstered his image in this way.  His actions were motivated not so much by altruism as by public approval and self-image.  Clamence’s fall, and his self-imposed exile to Amsterdam, is caused by his own lack of action when a woman falls to her death along the River Seine.  He passes the woman on his walk and sees that something is amiss.  Regardless, he presses on and hears a splash, though he does not see her fall.  He chooses not to go back and investigate, avoiding the choice of whether to place his own life in danger in saving the woman.  He tries putting the incident out of his mind and avoids reading the newspapers for fear that they may confirm that the woman did, indeed, jump–an act that would undermine his own self-image.

Then one day, he finds himself close to the same location along the river while in a self-congratulatory mood.  He hears laughter that seems to be coming from the water; when he turns, it appears most likely to have come from two lovers in the distance, though there is enough doubt in the narrative to suggest that it was generated by Clamence’s subconscious and that he himself uttered the laugh.  He is thus reminded of his cowardly behavior and the possibility of the woman’s death.  He is struck by the contradiction between his self-image and the reality of his motivations and actions.

Later Clamence’s “fine picture of himself” is literally shattered by a sucker-punch to the face from a motorcyclist with whom he gets into an argument for blocking a congested city street.  Dejected, and seeing himself for the first time for what he truly is, Clamence attempts to destroy the image he built of himself, living a life of debauchery and consorting with the worst elements of Paris.  Despite these attempts the myth of his public image is too strong, and he fails as a public penitent.

In the second part of the narrative, Clamence tells the story of his desire during the war to join the Resistance, but his fear of death is too much for him.  In fear he instead flees to North Africa with the intention of ending up in London.  I was reminded in reading this portion of the book of the Humphrey Bogart movie Casablanca, and came to realize that its narrative was very close to the experience of many Frenchmen during this time.  During his transit Clamence is arrested in Tunis, supposedly as a precautionary measure, and ends up in a German prison camp.  While in the camp he meets a veteran of the Spanish civil war, captured by a “Catholic general” and handed over to the Germans.  The man tells him that, supposedly as a result of the Church’s collaboration, he has lost his faith in Catholicism and posits that a new Pope is needed.  Only able to control the limited environment of their imprisonment, the inmates, at the behest of the Spanish veteran, elect Clamence the camp “Pope,” with wide latitude over the distribution of food, water, and work assignments.  At first diligent in his duties, Clamence one day abuses his power by drinking the water of a dying man.  For the second time we have the imagery of water.  In the first case Clamence refuses to immerse himself to save another.  In this case Clamence consumes the water and causes the death of another.

In the final sequence, the stolen Jan van Eyck panel entitled The Just Judges, from the fifteenth-century Ghent altarpiece The Adoration of the Lamb, hangs in a cupboard in Clamence’s apartment.  He explains that he acquired it from the bartender of the Mexico City who, in turn, had received it from the thief in return for a drink.  Because Clamence knew that the painting was being sought by the authorities, he extended a “kindness” by offering to hide the panel for its new owner.  The subject of the panel is the judges on their way to adore Jesus.  To Clamence the judges will never find him, since he cannot offer people the redemption that they seek.  Since Jesus’ teachings emphasized the avoidance of judging others, the Church subverted his message and turned him into the ultimate judge, separating him from his innocence as the Lamb.  It is here that he defines his role as judge-penitent.

Many critics have looked at The Fall as a break from the more optimistic and positive messages of The Plague, The Myth of Sisyphus, and The Rebel.  Instead, however, I believe that this work is the fullest rendering of the human condition that he wrote, exploring the themes that he always visited.  Unlike The Stranger, there is no final judgement that brings justice.  Unlike The Plague, there is no community to pull together.  Instead, in the atomistic post-World War II world we have only individuals who appear to be trustworthy and acting in the public interest, though the reality is starkly different.  What goes around does not always come around.  In this way The Fall is much like Mark Twain’s “The Story of the Bad Little Boy.”

The narrative structure goes a step further by insinuating the reader into Clamence’s world.  As the second person, we allow him to be what he is.  And, as such, we are co-conspirators in his actions and, by extension, in the world we allow to take place.  It is a book, along with its predecessors, that still speaks to our time.