The title is an ironic gesture to a disturbingly cheerful (some, like me, might say saccharine) tune by Bacharach and David, but my intention is to talk about what is less happily categorized as circular reasoning. This is one of those fallacies that has been recognized for so long that the medievals gave it a Latin name: petitio principii. It is also one of those painful failures of basic reasoning that reach beyond the narrow confines of formal logic, or introductory critical thinking classes. This is one of those monsters of bad thinking that embolden authoritarian-minded individuals and their enablers to shamelessly spout about “alternative facts” and other infantile drivel. You see, the problem with a circle, as well as with a mind that reasons in one, is that the circle is closed; inquiry, on the other hand, is (by necessity) open and ongoing.
I’ve talked before (several times, in fact) about what Altemeyer describes as the “compartmentalization” that occurs in authoritarian belief and ideology. One can scarcely dignify the latter as “thinking,” regardless of the degree of sophisticated cleverness employed in keeping those compartments airtight against all facts and logic. Authoritarian thinkers, following Hamlet’s example, keep their minds bounded in a nutshell and count themselves kings of infinite space, were it not that they have bad dreams. (Of course, Hamlet was being ironic, mocking his interlocutors, something the Mango Mussolini’s enthusiasts entirely fail to grasp.) The thing is, these people choose to be bounded by a nutshell, all the while imagining themselves in princely command of infinite space. Meanwhile, their bad dreams (the trailings of reality, dogging them despite their dogmatism) are the source of their willing embrace of Trumpian neo-fascism. For the nutshell – the “nut house” – in which they have bound their minds is a tightly closed circle that permits no entry from reality. Continue reading
It is certainly disturbing to see how many people prefer a convenient lie over a disquieting truth. But more importantly, we should make note of how many people will flee in abject terror to the warm, terroristic embrace of a convenient lie when confronted with an indisputable uncertainty, the unavoidable knowing that you do not know. I should get that tattooed somewhere … somewhere where no one will ever see it …
There is a formal structure to at least some kinds of disruptive uncertainty, and that structure is not all that hard to understand. I’ll mostly be discussing that logical structure, which often requires a kind of patience with inconsistency; at the end, I will turn to the psychology of those who embrace inconsistency without thought. What I wish to address here are kinds of inconsistency, most importantly noting that there are genuinely and importantly different kinds. I’ll mainly draw on investigations by Nicholas Rescher and Robert Brandom, coupled with developments by Jon Barwise and John Perry. Continue reading
Where an argument comes from is not supposed to be relevant to the logical credibility of the argument, and there are named fallacies that highlight just such errors. (I’m going to talk loosely here, at first, so take the immediately following with a grain of salt.) The genetic fallacy consists in treating where an argument comes from – its origins or “genesis” – as relevant to the cogency of that argument. A somewhat more specific version of the genetic fallacy is a variant on the argumentum ad hominem, known as the tu quoque fallacy. “Tu quoque” basically means “you too,” or “you’re another.” The idea with this latter is rejecting the advice or argument of a person on the grounds that that person is doing the very thing she or he is advising against.
However, such a rejection is clearly not only unfair, but unjustifiable. An alcoholic may not be able to stop drinking, but is certainly in a position to understand the evils of that drinking, and present cogent arguments against it. Similarly, the nicotine addict, slowly suffocating from emphysema, may not be physically or psychologically able to stop smoking, but said person is certainly well placed to understand the viciousness of the habit, and can offer perfectly cogent arguments against ever picking it up. But there are times when the source of a claim really is important, and needs to be taken into account when evaluating that claim. The probative value of evidence which we are not able to check ourselves often rests on the credibility of the source. The superficial treatment I presented above says that the source of a claim should not be given any weight, and that the argument should be evaluated by itself and on its own terms. But when we do not have complete control and/or mastery over those terms, then that source must also be taken into account. Continue reading
The “slippery slope” is the fallacy (if it is a fallacy – some might dispute that!) which says certain actions can never be taken because they lead to other actions, which make still other actions possible, and so on, leading finally to some catastrophic action that can no longer be denounced or argued against because of all the little steps that led up to it and gave it permission. It is a frequent traveling companion of those who would argue against any sort of incremental change to social institutions or the guarantee of civil rights. Thus, we’ve seen a great deal of slippery slope “reasoning” amongst conservatives denouncing marriage equality, with such claims being floated as, “If gays are allowed to marry, what is to prevent people from marrying farm animals, or young children?” (I’ll not link to any such claims; if the rock you’ve been hiding under these past several years has kept you shielded from such nonsense, I will not be the one responsible for bursting your bubble.)
What inspired me to write about this now was my recollection of how this fallacy relates to the famous sorites paradox (sorites, noun: so·ri·tes \sə-ˈrī-(ˌ)tēz\). The paradox (if it is a paradox) revolves around the question of how trivial actions, too small to have any consequence of their own, can nevertheless add up to massive and absolute distinctions. So, in a sense, slippery slope is going down the hill, while sorites is going up it. Continue reading
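The structure of the sorites is simple enough to sketch in a few lines of Python. (The 1,000-grain cutoff below is an arbitrary assumption of mine, chosen only to make the flip visible; the paradox, of course, is precisely that no such sharp cutoff seems defensible.)

```python
# Toy sorites: each step is individually negligible, yet the steps
# accumulate into a categorical change. Any sharp classifier flips
# at a one-grain difference, even though no single grain seems like
# it should matter.

def is_heap(grains: int) -> bool:
    return grains >= 1000   # an arbitrary sharp cutoff, for illustration

grains = 10_000             # clearly a heap to start
while is_heap(grains):
    grains -= 1             # remove one grain -- negligible on its own

# The classification flipped on a difference of exactly one grain:
print(grains)                              # 999
print(is_heap(grains + 1), is_heap(grains))  # True False
```

Grant the tolerance principle that one grain never matters, and the loop never terminates: everything down to zero grains remains a “heap.” Refuse it, and you must defend some particular cutoff. That dilemma is the paradox.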
So my last round of musing was on the subject of “emptiness.” Connected to that idea is the concept of “fullness,” of “plenum.” I suspect that one of the primary failures of contemporary metaphysics is a misunderstanding of which is really which: that is to say, what is really full, and what is really empty. Here again, Whitehead’s process metaphysics offers us important insights, because how we think of the “fullness” – of a thing, a region of space, or whatever – is directly correlated with what we believe to be genuinely real. I argued earlier against the naïve concept of “empty” space, pointing out that not only is that space (according to physics) a roiling froth of micro-events and virtual particles, but that it is also densely awash in relational connections to the rest of the universe. Adding to that earlier discussion, one could say that the space itself is a kind of “thing”: it is an event in its own right, a process of space relating itself to other spatial events. In this regard, Whitehead rejected the “material aether” that dominated astrophysical thought in the days between James Clerk Maxwell and Albert Einstein (the last quarter of the 19th century to the first decade or two of the 20th), and argued instead for an “aether of events” as the dominating characteristic of space.
Without assuming – indeed, explicitly denying – any absolute sense of either “emptiness” or “fullness,” what sorts of relative conditions might lead us to characterize one sort of collection as generally more full, and another as comparatively more empty? Well, for that we need a notion of what it is that fills, hence that which is not there when things are empty. My argument is that what “fills” are events and relations. Continue reading
A recent interview on NPR, in their “Short Answers to Big Questions” segment, went to special extremes to demonstrate how monumentally bad science journalism is these days. My discussion here will come in two parts, one short, and one a great deal more detailed. The short part will be a quick debunking of supposedly scientific claims from a conventionally scientific standpoint. In particular, statements are made in this interview with an absolute confidence that cannot possibly stand up to even the most basic grasp of physical science. The longer discussion will take up philosophical criticisms that run beyond most of contemporary science. This is because so much of that science has degenerated into pure model centrism, and consequently fails to ask any of the fundamental questions that need to be raised. The motivating notion behind all of this is the idea of “empty” space.
The offending NPR piece opened with a question about how empty a volume of space would be were there only (say) three atoms or molecules within a volume of about one cubic meter. After a few moments’ discussion about the volume of molecules of air in a cubic meter at sea level (a discussion that appears to contain an unimportant typographical error), the discussion moves out into space, into deep, deep space. The conversation leads to the following (slightly edited) highly problematic exchange:
“… if there are points in space with only three atoms per square meter, what fills in the rest? The answer is nothing …”

“… for a physicist, the absence of matter is nothing. I mean there is still space and time there, but you know, there – the absence of matter we consider to be a state of, you know, zero matter, zero energy density, is a way of putting it.”
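Before turning to the problems, a quick sanity check on the sea-level figure the interview gestures at. The ideal gas law gives the number density of air molecules directly (the standard pressure and the 15 °C surface temperature below are my assumed inputs, not the interview’s):

```python
# Number density of air molecules at sea level, from the ideal gas
# law n = P / (k_B * T). Assumed values: standard pressure, ~15 C.

K_B = 1.380649e-23   # Boltzmann constant, J/K
P = 101_325.0        # standard sea-level pressure, Pa
T = 288.15           # roughly 15 degrees C, in kelvin

n_sea_level = P / (K_B * T)   # molecules per cubic meter
n_deep_space = 3.0            # the interview's deep-space figure

print(f"sea level:  {n_sea_level:.2e} molecules/m^3")
print(f"deep space: {n_deep_space} molecules/m^3")
print(f"ratio:      {n_sea_level / n_deep_space:.1e}")
```

That comes out to roughly 2.5 × 10²⁵ molecules per cubic meter – some twenty-four orders of magnitude denser than the three-per-cubic-meter deep space case. The disparity is real; it is the leap from “very few molecules” to “nothing” that is the problem.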
The problems here come at two levels, one of fairly ordinary physics and the other at a deeper philosophical level. I’ll deal with both in turn. Continue reading
Kenneth Arrow is a well-known economist, logician, statistician, and political theorist. While his scholarly contributions are numerous, his best known was his first, published as a part of his dissertation. This is the above-titled “paradox of voting,” which is also referred to as his “impossibility theorem.” The latter is evidently the technically correct title. However, I learned about it as the paradox of voting, and that’s the title I’ll stick with here. For one thing, calling it the “paradox of voting” makes it clearer at the outset what the theorem is about, and suggests what is really at stake. Details of the impossibility theorem are readily found for no more effort than looking, so my intention here is to provide a non-technical gloss of the topic. Still, enough of what I say here is about basic logic (and not merely political screed) that I am satisfied the topic falls within my basic parameters for this blog.
The stakes here could scarcely be any higher, as they affect the very foundation of our nominally democratic system. Because of how our voting and electoral system is set up, we have a “winner take all” format that can (and often enough, does) allow a person to be elected even though that person did not receive a majority of the votes. Once you have more than two candidates (or more than two parties) involved in any particular election, it is no longer possible for the outcome to faithfully represent the distribution of voters’ preferences. That is the somewhat fancy way of saying things. The simpler way of saying it is that the more widely detested candidate can win. Continue reading
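The “more widely detested candidate wins” scenario is easy to make concrete. A minimal sketch in Python, with vote totals invented purely for illustration:

```python
from collections import Counter

# Invented numbers: under plurality ("winner take all") voting,
# candidate A wins with 40% of first-place votes even though the
# other 60% of voters rank A dead last.

ballots = (
    [["A", "B", "C"]] * 40 +   # 40 voters: A first
    [["B", "C", "A"]] * 35 +   # 35 voters: A last
    [["C", "B", "A"]] * 25     # 25 voters: A last
)

first_places = Counter(ballot[0] for ballot in ballots)
winner = max(first_places, key=first_places.get)

rank_winner_last = sum(1 for b in ballots if b[-1] == winner)
print(winner)                                        # A
print(rank_winner_last, "of", len(ballots), "voters rank the winner last")

# Head to head, B beats A 60-40 -- yet A takes the election.
prefer_b_to_a = sum(1 for b in ballots if b.index("B") < b.index("A"))
print("B preferred to A by", prefer_b_to_a, "voters")
```

This toy election is not Arrow’s theorem itself, of course; it only exhibits the simpler point above. Arrow’s result is the far stronger claim that no rank-order voting rule can avoid such pathologies while satisfying a handful of seemingly innocuous fairness conditions.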
An oddity about philosophers, and especially logicians, is that when they talk about “quantity” they are not talking about numbers, or numerical counts. Rather, they are talking about the ways things can be gathered together (or singled out) using words like “all” or “some.” These ideas are called “quantifiers.” I want to do three things (briefly, as always) here: say a little about the “basic” quantifiers (“all” and “some”); say a little about how they get dropped from common discourse and argument – whether from laziness or deliberate obfuscation – leading to much gratuitous confusion; and finally, say something about quantifiers that typically do not make it onto philosophers’ or logicians’ lists, yet are at least as common in ordinary discourse and argument as the “principal” two. My purpose here (as always) is not to lead you onto the path of righteous proof-making, but simply to alert the reader to the importance of these operators, so that they might not slip by quite so stealthily in the future.
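For readers who program, the two basic quantifiers map directly onto familiar constructs, and even a quantifier like “most” – the sort that rarely makes the logicians’ lists – is easy to state precisely. A sketch in Python (the `most` helper is my own illustration, not a standard function):

```python
# "All" and "some" correspond to Python's built-in all() and any().
numbers = [2, 4, 6, 7, 10]

print(all(n % 2 == 0 for n in numbers))   # "all are even"  -> False
print(any(n % 2 == 0 for n in numbers))   # "some are even" -> True

def most(iterable, predicate):
    """The quantifier "most": strictly more than half satisfy it."""
    items = list(iterable)
    satisfied = sum(1 for x in items if predicate(x))
    return 2 * satisfied > len(items)

print(most(numbers, lambda n: n % 2 == 0))  # "most are even" -> True
```

Note how the three claims come apart on the same collection: “all are even” is false, while “some are even” and “most are even” are both true. Dropping the quantifier and saying just “the numbers are even” leaves it entirely unclear which of these three very different claims is being made.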
The second greatest sin in logic is to allow things to pass implicitly; the greatest sin is to block the road of inquiry, which is one of the things that happens when concepts are allowed to pass implicitly. Allowing things to remain implicit means that vague statements are permitted, by innuendo, to become concrete, thus leading us astray (blocking inquiry) from the directly stated vagueness. Sometimes things really are ambiguous, and they must be allowed to stay that way until real data, rather than jumping to conclusions, enables us to clear up the ambiguity. That, or recognize that the ambiguity is not – or, at least, not yet – cleared. Continue reading