Sunday Contemplation — Finding Wisdom: The Epimenides Paradox

The liar’s paradox, as it is often called, is a fitting subject for our time. For those not familiar with the paradox, it was introduced to me by the historian Gordon Prange when I was a young Navy enlisted man attending the University of Maryland. He introduced the paradox to me as a comedic rejoinder to the charge of a certain bias in history that he considered to be without merit. He stated it this way: “I heard from a Cretan that all Cretans are liars.”

This form of the liar’s paradox has many roots. It is discussed as a philosophical conundrum by Aristotle in ancient Greece as well as by Cicero in Rome. A version of it appears in the Christian New Testament, and it was a subject of study in Europe during the Middle Ages.

When I have introduced the paradox in a social setting and asked the uninitiated to resolve it, a long conversation usually ensues. The usual approach treats it as a bipolar proposition, accepting certain assumptions built into the construction of the sentence: if the Cretan is lying, then all Cretans tell the truth, which cannot be the case; but if the Cretan is telling the truth, then he is lying; yet he could not be telling the truth, since all Cretans lie…and the circular contradiction goes on ad infinitum.

But there is a solution to the paradox, and it requires thinking about the Cretan himself and breaking free of bipolar thinking, which we often call, colloquially, “thinking in black and white.”

The Solution

The assumption in the paradox is that the Cretan in question can speak for all Cretans. That assumption could be false: if not all Cretans are liars, then the Cretan in question is simply making a false statement. Furthermore, the Cretan making the assertion is not necessarily a liar–the individual could just be mistaken. We can test the “truthiness” of what the Cretan has said by testing other Cretans on a number of topics and seeing whether they are simply ignorant, uninformed, or truly liars on all things.

Furthermore, there is a difference between something being a lie and a not-lie. Baked into our thinking by absolutist philosophies, ideologies, and religions is black-and-white thinking that clouds our judgment. A lie must have intent and be directed to misinform, misdirect, or cloud a discussion. There are all kinds of lies and many forms of not-lies. Thus, the opposite of “all Cretans are liars” is not “all Cretans are honest” but simply that at least some Cretans are not.

Only if we accept the original assertion as true is this truly a paradox, and we need not accept it. If we show that Cretans do not lie all of the time, then we are not required to reach the high bar that “all Cretans are honest”; we need only show that the Cretan making the assertion has made a false statement or is, instead, the liar.
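By way of illustration (this is my own sketch, not part of the original argument), here is a minimal Python model that enumerates the possible worlds for a hypothetical two-Cretan island, treating each Cretan as either consistently honest or a consistent liar; the names, the island size, and the all-or-nothing model of honesty are simplifying assumptions.

```python
from itertools import product

# Minimal sketch: each Cretan is either consistently honest (True) or a
# consistent liar (False). The speaker asserts "all Cretans are liars."
# The names and the two-person island are illustrative assumptions.
CRETANS = ["epimenides", "another_cretan"]
SPEAKER = "epimenides"

def consistent(honest):
    """Is this assignment of honesty consistent with the speaker's claim?"""
    claim_is_true = not any(honest.values())   # "all Cretans are liars"
    # An honest speaker requires a true claim; a lying speaker requires a false one.
    return claim_is_true if honest[SPEAKER] else not claim_is_true

for values in product([True, False], repeat=len(CRETANS)):
    world = dict(zip(CRETANS, values))
    verdict = "consistent" if consistent(world) else "contradiction"
    print(world, "->", verdict)
```

The only consistent assignment is the one in which the speaker is a liar while at least one other Cretan is honest, which is precisely the resolution above: the claim is simply false, not paradoxical.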

In sum, the way to avoid falling into the thinking of the faulty or dishonest Cretan is not to accept the premises as they have been presented to us, but to use our ability to reason through the premises and to look at the world as it is as a “reality check.” The paradox is not truly a paradox, and the assertion is false.

(Note that I have explained this resolution without going into the philosophical details of the original syllogism, the mathematics, or an inquiry into the detailed assumptions. For a fuller discussion of liar’s paradoxes I recommend this link.)

Why Care About the Paradox?

We see versions of the paradox used all of the time. This includes the use of ad hominem attacks on people, that is, charges of guilt by association with an idea, a place, an ethnic group, or another person: “Person X is a liar (or X’s actions are suspect or cannot be trusted) because X adheres to Y idea, group, or place.” Oftentimes these attacks are joined with insulting or demeaning catchphrases and (especially racial or ethnic) slurs.

What we attribute to partisanship or prejudice or bias often rests on this underlying type of thinking. It is a simplification born of ignorance, and all simplifications are a form of evil in the world. This assertion was best articulated by Albert Camus in his novel The Plague:

“The evil that is in the world always comes of ignorance, and good intentions may do as much harm as malevolence, if they lack understanding. On the whole, men are more good than bad; that, however, isn’t the real point. But they are more or less ignorant, and it is this that we call vice or virtue; the most incorrigible vice being that of an ignorance that fancies it knows everything and therefore claims for itself the right to kill. The soul of the murderer is blind; and there can be no true goodness nor true love without the utmost clear-sightedness.”

Our own times are not much different in their challenges from what Camus faced during the rise of fascism in Europe, for fascism’s offspring have given rise to a new generation that has insinuated itself into people’s minds.

Aside from my expertise in technology and the military arts and sciences, the bulk of my formal academic education is as an historian and political scientist. The world is currently in the grip of a plague that eschews education and Camus’ clear-sightedness in favor of materialism, ethnic hatred, nativism, anti-intellectualism, and ideological propaganda.

History is replete with similar examples, both large and small, of this type of thinking, which should teach us that this is an aspect of human character wired into our brains that requires eternal vigilance to guard against. Such examples as the Spanish Inquisition, the Reformation and Counter-Reformation, the French Revolution, the defense of slavery in the American Civil War and the subsequent terror of Jim Crow, 18th and 19th century imperialism, apartheid after the Boer War, the disaster of the First World War, the Russian Revolutions, the history of anti-Jewish pogroms and the Holocaust, the rise of Fascism and Nazism, Stalinism, McCarthyism in the United States, Mao and China’s Cultural Revolution, Castro’s Cuba, Pinochet’s Chile, the Pathet Lao, and the current violence and intolerance born of religious fundamentalism–and the list can go on–teach us that our only salvation and survival as a species lie in our ability to overcome ignorance and self-delusion.

We come upon more pedestrian examples of this thinking all of the time. As Joseph Conrad wrote in Heart of Darkness, “The mind of man is capable of anything—because everything is in it, all the past as well as all the future.”

We must perform this vigilance first on ourselves–and it is a painful process, because it shatters the self-image that is necessary for us to continue from day to day: that narrative thread that connects the events of our existence and that guides our actions as best, and in as limited a way, as they can be guided, without falling into the abyss of nihilism. Only knowledge, and the attendant realization of the necessary components of human love, acceptance, empathy, sympathy, and community–that is, understanding the essential connections that make us human–can overcome the darkness that constantly threatens to envelop us. But there is something more.

The United States was born on the premise that the practical experiences of history and its excesses could be guarded against, and that such “checks and balances” would be woven, first, into the thread of its structure and, then, into the thinking of its people. This is the ideal, and it need not be said that, given that it was a construction of flawed men, despite their best efforts at education and enlightenment compared to the broad ignorance of their time, these ideals for many continued to be only that. This ideal is known as the democratic ideal.

Semantics Matter

It is one that is under attack as well. We often hear the argument against it dressed up in academic clothing as being “only semantics”: the difference between a republic and a democracy. But as I have illustrated regarding the Epimenides Paradox, semantics matter.

For the democratic ideal is about self-government, which was a revolutionary concept in the 18th century and remains one today, which is why it has been and continues to be under attack by authoritarians, oligarchs, dictators, and factions pushing their version of the truth as they define it. But it goes further than a mechanical process of government.

The best articulation of democracy in its American incarnation probably was written by the philosopher and educator John Dewey in his essay On Democracy. Democracy, says Dewey, is more than a special political form: it is a way of life, social and individual, that allows for the participation of every mature human being in forming the values that regulate society toward the twin goals of ensuring the general social welfare and full development of human beings as individuals.

While what we call intelligence may be distributed in unequal amounts, it is the democratic faith that it is sufficiently general so that each individual has something to contribute, whose value can be assessed only as it enters into the final pooled intelligence constituted by the contributions of all. Every authoritarian scheme, on the contrary, assumes that its value may be assessed by some prior principle, if not of family and birth or race and color or possession of material wealth, then by the position and rank a person occupies in the existing social scheme. The democratic faith in equality is the faith that each individual shall have the chance and opportunity to contribute whatever he is capable of contributing and that the value of his contribution be decided by its place and function in the organized total of similar contributions, not on the basis of prior status of any kind whatever.

In such a society there is no place for “I heard from a Cretan that all Cretans lie.” For democracy to work, however, it requires not only vigilance but a dedication to education that is, in turn, devoted to finding knowledge, however inconvenient or unpopular that knowledge may turn out to be. The danger has always been in lying to ourselves, and in allowing ourselves to be seduced by good liars.

Note: This post has been updated from the original for grammar and clarity.

Don’t Know Much… — Knowledge Discovery in Data

A short while ago I found myself in an odd venue where a question was posed about my being an educated individual, as if it were an accusation.  Yes, I replied, but then, after giving it some thought, I made some qualifications to my response.  Educated regarding what?

It seems that, despite a little more than a century of public education and the adoption of widespread advanced education in the United States, along with the resulting advent of widespread literacy, we haven’t entirely come to grips with what it means.  For the question of being an “educated person” has its roots in an outmoded concept–an artifact of the 18th and 19th centuries–where education was delineated, and availability determined, by class and profession.  Perhaps this is the basis for the large strain of anti-intellectualism and science denial in the society at large.

Virtually everyone today is educated in some way.  Being “educated” means nothing–it is a throwaway question, an affectation.  The question is whether the relevant education meets the needs of the subject being addressed.  This very topic is explored in an interesting discussion at Sam Harris’ blog, in a conversation he held with amateur historian Dan Carlin.

In reviewing my own education, it is obvious that there are large holes in what I understand about the world around me, some of them ridiculously (and frustratingly) prosaic.  This shouldn’t be surprising.  For even the most well-read person is ignorant about–well–virtually everything in some manner.  Wisdom is reached, I think, when you accept that there are a few things that you know for certain (or have a high probability and level of confidence in knowing), and that there is a host of things, constituting the entire library of knowledge and encompassing anything from a particular domain to the entire universe, which you do not know.

To sort out a well-read dilettante from someone who can largely be depended upon to speak with some authority on a topic, educational institutions, trade associations, trade unions, trade schools, governmental organizations, and professional organizations have established a system of credentials.  No system is entirely perfect, and I am reminded (even discounting fraud and incompetence) that half of all doctors and lawyers–two professions that have effectively insulated themselves from rigorous scrutiny and accountability to the level of almost being a protected class–graduate in the bottom half of their class.  Still, when we need medical care we can sort out a real brain surgeon from someone who once took a course in brain physiology (to borrow an example from Sam Harris in the same link above).

Furthermore, in the less potentially life-threatening disciplines we find more variation.  There are credentialed individuals who constantly get things wrong.  Among economists, for example, I am more likely to follow those who got the last financial crisis and housing market crash right (Joe Stiglitz, Dean Baker, Paul Krugman, and others), and those who have adjusted their models based on that experience (Brad DeLong, Mark Thoma, etc.), than those who have maintained an ideological conformity and continuity despite the evidence.  Science–both what are called the hard and the soft sciences–demands careful analysis and corroborating evidence to be tied to any assertions in its most formalized contexts.  Even well-accepted theories within a profession are contingent–open to new information and discovery that may modify, add to, or displace them.  Furthermore, we can find polymaths and self-taught individuals who have equaled or exceeded their credentialed peers.  In the end, the proof is in the pudding.

My point here is threefold.  First, in most cases we don’t know what we don’t know.  Second, complete certainty is not something that exists in this universe, except perhaps at death.  Third, we are now entering a world where new technologies allow us to discover new insights in accessing previously unavailable or previously opaque data.

One must look back at the revolution in information over the last fifty years and its resulting effect on knowledge to see what this means in our day-to-day existence.  When I was a small boy in school we largely relied on the published written word.  Books and periodicals were the major means of imparting information, aside from collocated collaborative working environments, the spoken word, and the old media of magazines, radio, and television.  Information was hard to come by–libraries were limited in their collections and there were centers of particular domain knowledge segmented by geography.  Furthermore, after the introduction of television, society had developed trusted sources and gatekeepers to keep the cranks and flimflam out.

Today, new media–including all forms of digitized information–has expanded and accelerated the means of transmitting information.  Unlike old media and books, new media (social networking included) has fewer gatekeepers: editors, fact checkers, domain experts, and credentialed trusted sources who ensure quality control, reliability, and fidelity of the information, and who provide context.  It’s the wild west of information, and those wooed by the voodoo of self-organization contribute to the high risk associated with relying on information provided through these sources.  Thus, organizations and individuals who wish to stay within the fact-based community have had to sort out reliable, trusted sources and, even in these cases, develop–for lack of a better shorthand–BS detectors.  There are two purposes to this exercise: to expand the use of the available data and leverage the speed afforded by new media, and to ensure that the data is reliable and can tell us something important about our subject of interest.

At the level of the enterprise, the sector, or the project management organization, we are similarly faced with a situation in which the scope of data that can be converted into information is rapidly expanding.  Unlike the larger information market, data at this microeconomic level is more controlled.  Given that such data suffers from problems of statistical significance, because it records isolated events or draws on small sample sizes, the challenge has been to derive importance from data where significance is sometimes minimal.
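To make the small-sample point concrete, here is a brief sketch (with entirely made-up numbers, not drawn from any particular project data) showing how the approximate 95 percent confidence interval around a sample mean narrows only with the square root of the sample size:

```python
import random
import statistics

# Sketch: a hypothetical performance metric; observe how the approximate 95%
# confidence interval around the sample mean narrows as the sample grows.
random.seed(42)
population = [random.gauss(100, 15) for _ in range(10_000)]

for n in (5, 30, 500):
    sample = random.sample(population, n)
    mean = statistics.mean(sample)
    sem = statistics.stdev(sample) / n ** 0.5   # standard error of the mean
    print(f"n={n:4d}  mean={mean:6.1f}  approx. 95% CI +/- {1.96 * sem:5.1f}")
```

With a handful of records the interval is roughly ten times wider than with five hundred, which is why importance at this level often has to be argued from context rather than from statistical significance alone.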

Furthermore, our business systems, because of the limitations of the selected technology, have been self-limiting.  I come across organizations all the time that cannot imagine the incorporation and integration of additional data sets, largely because the limitations of their chosen software solution have inculcated that approach–that belief–into the larger corporate culture.  We do not know what we do not know.

Unfortunately, it’s what you do not know that, more often than not, will play a significant role in your organization’s destiny, just as an individual who is more self-aware is better prepared to deal with the challenges that manifest themselves as risk and its resultant probabilities.  Organizations must become more aware and look at things differently, especially since so many of the more conventional means of determining risks and opportunities seem to be failing to keep up with the times, which are governed by the capabilities of new media.

This is the imperative of applying knowledge discovery in data at the organizational and enterprise level–and of shifting one’s worldview from focusing on the limitations of “tools” (how they paint a screen, whether data is displayed across the x or y axis, what shade of blue indicates good performance, how many keystrokes it takes to perform an operation, and all manner of glorified PowerPoint minutiae) to a focus on data: the ability of solutions to incorporate more data, more efficiently and more quickly, from a wider range of sources, and to process it more effectively, so that it is converted into information that can inform decision making at the most decisive moment.
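As a concrete, if deliberately simplified, sketch of that data-centric posture (the systems, field names, and thresholds below are hypothetical, not a reference to any particular product), the core move is to merge records from multiple sources on a shared key and derive a decision signal from the combined picture:

```python
from collections import defaultdict

# Hypothetical records from two separate business systems, keyed by project ID.
cost_system = [
    {"project": "P-001", "actual_cost": 120_000},
    {"project": "P-002", "actual_cost": 80_000},
]
schedule_system = [
    {"project": "P-001", "planned_cost": 100_000},
    {"project": "P-002", "planned_cost": 90_000},
]

# Merge the sources on the shared key rather than viewing each in its own "tool."
merged = defaultdict(dict)
for source in (cost_system, schedule_system):
    for record in source:
        merged[record["project"]].update(record)

# Convert the combined data into a simple decision-support signal.
for project, fields in sorted(merged.items()):
    variance = fields["actual_cost"] - fields["planned_cost"]
    status = "over plan" if variance > 0 else "within plan"
    print(f"{project}: cost variance {variance:+,} ({status})")
```

The particulars matter less than the posture: more sources, joined and processed quickly enough that the result can still inform the decision it was meant to support.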