Thank You For Your Service


Like many people who have worn the uniform, that phrase makes me uncomfortable.

Uncomfortable, mind you; not angry or upset as it does for many Vietnam (and these days, I suspect, Afghanistan) veterans. Just uncomfortable.

Because, you see, I did not swear the oath, I did not don the pickle suit, for you. I thought I was doing it for “me,” though 45+ years after the fact I recognize I scarcely understood at the time what that meant. I strongly suspect that even those adorable naifs who are certain they are acting purely out of love for God and Country (who nominally ARE doing it for you) were and are every bit as clueless about what they were saying as I was; even those remarkable few who, after however many years, are even more certain now than they were then that they were/are acting for God and Country. Because whatever the character of their certainty then, it is most certainly not the conviction they live by now.

Regardless of what they might believe at the time, nobody really understands what they are committing to when they take that oath. And it feels really awkward being congratulated for having put on a blindfold and then run off a cliff, when you don’t even know IF the cliff has a bottom, much less where that bottom might be.

Two things you should understand here. The first is a matter of objective fact. And that is the difference between Veterans Day and Memorial Day.i It is a really easy difference to understand, which is why it is so sad that so many people do not understand it.

  • Veterans Day is for those who came home.
  • Memorial Day is for those who did not.

Which is part of the awkwardness (for me) of when people say, “thank you for your service.” It is a little like saying, “thank you for not taking up space in Arlington.” Because I didn’t do it for you. I did it for me. But I didn’t know at the time what that meant.

I was in the US Army from 1975 until 1978. For context, Saigon fell in April of 1975, and I went active (into Basic training) in June. From ‘76 until I rotated out in June of ‘78 I was stationed on what was, at the time, the East German border, assigned to an IHAWK anti-aircraft missile battery. The closest I ever came to combat was cocking snooks at the Russians, some 12 or so klicks to the east. But for all of that, I did take my duties seriously. Because – and I didn’t really understand this at the time (I’m saying nobody ever does) – swearing the oath changed me. Some 35 years or so after I raised my right hand, I came to understand that I still consider myself bound by that oath. In particular, the part where I swore

to protect and defend the Constitution of the United States against all enemies foreign and domestic.

That’s some powerful shit right there. In particular, it means that Fascist animals like Donald Trump are persons I am oath-bound to oppose. Because for these people, the Constitution is nothing more than toilet paper to wipe their butts on. But even as my entire body shifted at the time of the saying of those words, I am still learning what they mean for me. For me, mind you, not you. And I’m still learning what that means.

For example, I have, for some years past (though hardly forever), taken up carrying a copy of the Constitution on my person at all times. This habit was triggered by the TV show The West Wing, where they consistently referred to it as, “the Owners’ manual.” But the reason that show allowed me to realize that doing so was important was precisely because it reminded me that my oath was to the Constitution and not, for example, to the flag. Don’t ever let anyone fool you on this point: the flag is a rag. Nobody ever died “defending the flag,” except for sorry-assed buffoons who were too illiterate to pay any attention to the oath that they actually swore. Refulgent in mythological imagery – yet devoid of any cognitive content – the flag is something rightwing fascists go into apoplectic histrionics over. It is not an accident that they only mention the Constitution as though it were itself nothing more than another flag to wave.

So this evening I had a very nice meal at O’Charley’s, which has a very generous offer of a free entree (and the local one included the first beer) for veterans on this Veterans Day. I find being surrounded by people in a moderate state of noisiness, who are otherwise uninterested in bothering me, to be an excellent context for reflection. Having an external world to tune out makes it easier to concentrate on my thoughts within. (I basically wrote my dissertation with Metallica on a loop, so … yeah.) I frankly thought it was more appropriate to tell the wait staff and cooks, “thank you for your service,” than for anyone to say as much to me. But it was an opportunity to spend some time in my own thoughts, with my body quieted by an environment that included a good meal and non-intrusive environs (non-intrusive in their presence rather than their frantically demanding absence).

And so I’m going to leave this somewhat less than ideally connected stream of consciousness with this one final observation.

When I was in the Army, we never even observed (much less “celebrated”) either Memorial Day or Veterans Day. Maybe that has changed since I was in uniform. But back then we never did, and it was only today, 45+ years later, that I made that connection.

And it seems right.

To “celebrate” Memorial Day, for someone in uniform, is to make a mockery of those who have given “their last full measure.” And to “celebrate” Veterans Day is like dancing up and down shouting “Yay me!”

The wrongness just doesn’t get any wronger than that. And maybe that’s why having people say, “thank you for your service,” just feels uncomfortable. I didn’t do it for you, even if I came to discover that I did it for my country and my Constitution.

So I’m not going to get angry, I’m not going to be confrontational, I’m not going to be upset.

But at the same time, I wouldn’t mind if y’all just stopped doing that.

– – – – – – – – – –

i These are, of course, the US holidays.

Biggest Mistake



“Biggest mistake of my life.”

“Worst mistake I ever made.”

I can’t speak for other cultures, but phrases such as the above and others akin to them are fairly commonplace in American conversation, particularly when the topic involves the foolish choices made when we were young. While often accompanied by an eye roll and a shake of the head as signs of regret, there is just as often a tinge of wistfulness as well, a longing for a return to that kind of vivid recklessness and the electrifying sense of being alive that was at its core.i There is a certain legitimacy to that longing – even, and even especially, for the mistakes – at the metaphysical level. For every act of creation is, in an important sense, an error, a mistake, a “failure” to follow the “correct” path. So it is worth a moment to take a look at such things.

Before going any further, I want to dismiss one kind of mistake that is grotesque in its calculated refusal of any possibility of creativity. That is the kind of action “celebrated” by the despicable Jackass films and shows. These aren’t errors of any kind. They are acts of willful stupidity pandering to the lowest element of human character, “entertainment” predicated on laughing derisively at others for pulling absurdist stunts devoid of any talent or art. These programs are simply an extension of the “Good Ol’ Boy’s last words” jokes.ii There is nothing interesting or amusing about such behavior or the people who wallow in it.

Complexity – It Ain’t Simple (part 2 of 2)



The guiding motto in the life of every natural philosopher should be, “Seek simplicity and distrust it.”

– Alfred North Whitehead, The Concept of Nature (end of chapter VII.)

Ultimately, the only way we know how to measure the complexity of some process or phenomenon – beyond excruciatingly vague and unhelpful statements like, “this is really complicated” – is by measuring how hard it is to solve the mathematical equations used to characterize the problem. All the rest, even when palpably, indisputably true, is just hand-waving. Sometimes hand-waving makes us feel better, because we need to burn off the energy pent up in our frustration. But it never really tells us anything. On the other hand, we really do have some effective means of measuring how hard it is to solve some mathematical equation or other, and we’ve refined such measures significantly over the past fifty years because such measures tell us a great deal about what we can and cannot do with our beloved computers (which includes all of your portable and handheld devices, in case you weren’t sure.)

Some problems simply cannot be solved. This even despite the fact that the problems in question seem perfectly reasonable ones that are well and clearly formulated. (Actually, being well formulated makes it easier to demonstrate when a problem cannot be solved.) Some problems can be solved, albeit with certain qualifications, while still others are “simply” and demonstrably solvable.i However, saying that a problem is “solvable” – even in the pure and “simple” sense (notice how I keep scare-quoting that word) – doesn’t mean that it can be solved in any useful or practical sense. If the actual computation of a solution ultimately demands more time &/or computer memory space than exists or is possible within the physical universe, then it is unclear how we mere mortals benefit from this theoretical solvability.ii It is these latter considerations that bring us into the realm of computational complexity.
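The gap between “solvable in theory” and “solvable in practice” can be made concrete with a back-of-the-envelope calculation. The sketch below is my own illustration (the billion-checks-per-second rate is an assumed, optimistic figure, not a claim about any particular machine): brute-forcing a yes/no property over all subsets of an n-element set takes 2^n checks – a perfectly well-defined, guaranteed-to-terminate procedure, and yet one that rapidly outlives the universe.

```python
# Illustrative sketch: a "solvable" brute-force procedure whose running
# time explodes past any physical resource. Both constants below are
# assumptions chosen for illustration.

CHECKS_PER_SECOND = 10**9          # an optimistic billion checks per second
AGE_OF_UNIVERSE_SECONDS = 4.35e17  # roughly 13.8 billion years

def brute_force_seconds(n: int) -> float:
    """Seconds needed to examine all 2**n subsets of an n-element set."""
    return 2**n / CHECKS_PER_SECOND

# n = 30 is easy work; n = 100 outlives the universe thousands of times over.
print(brute_force_seconds(30))   # about one second
print(brute_force_seconds(100) / AGE_OF_UNIVERSE_SECONDS)
```

No cleverness in hardware rescues this: doubling the speed of the machine merely moves the wall from n to n+1.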

Complexity – It Ain’t Simple (part 1 of 2)



Some sixty-one years ago, the American philosopher Willard Van Orman Quine wrote a famous essay, “On Simple Theories of a Complex World.” Actually, referring to this as a “famous essay” is a tad redundant, since Quine is one of those people who only ever wrote famous essays. But setting that observation (bordering on sour grapes) aside, Quine goes on to observe the difficulty in saying just what does qualify as simplicity. He further observes the legitimate psychological and formal reasons why theory builders so ardently crave simple theories: the simpler the theory, the more readily it can be employed in our various cognitive activities. Of course, too simple a theory leaves us with no purchase on the world whatsoever. “God willed it” is about as simple a theory as you can come up with, but it is also as singularly useless a theory as anyone could ever imagine; it provides absolutely no insight, a complete absence of predictive power, and only an illusion of emotional comfort for those readily distracted by vacuous hand waving.

A “Rube Goldberg” machine.

Quine was writing more than a decade before the emergence of computational complexity as a sub-field of abstract Computer Science, in which upper and lower bounds for kinds of complexity (and thus, conversely, forms of simplicity) were even formulated. But we do now have a variety of ways to address Quine’s concerns about how to characterize complexity and simplicity. I’ll say more about this in a moment. What I want to start with is a more controversial proposition: namely, Quine got it backwards. In a very real sense, it is the world that is fundamentally simple and our theories that are complex.

Viral Philosophy



A couple of different articles recently have spoken to the need for the humanities in general, and philosophy in particular, to become a more active voice in contemporary matters, particularly with regard to the COVID-19 pandemic. One article (which I’ll get to below the fold) was especially critical of the “failure of philosophy,” singling out as the basis of this sweeping claim a famous European scholar’s decision to become publicly and egregiously stupid. A second article (which I’ll get to after the first) is not focused on philosophy so much as the humanities in general, and even here the author (a former Chancellor of UC Berkeley) is more concerned with the social sciences than with the humanities as such. This second piece brings us back to C. P. Snow’s famous lament about the “two cultures”, and argues that the problems Snow argued about have only gotten worse, even as “the players” have in many respects reversed positions.i

A cute picture of my cat so that this post will go viral and reach tens of people …

My concern here will be with philosophy rather than the humanities writ large, and specifically the impact that philosophy and philosophers can and ought to have upon the world. This latter topic falls under the general heading of what is called “public philosophy.” This is an instance of “what is old is new again;” in terms of the contemporary academy, public philosophy is a fairly new idea (one which many academic philosophers openly object to.) In terms of the history of philosophy, it is as old as the topic itself. I’ll not engage the debate about whether or not one should engage in public philosophy here, since its need is so manifestly obvious that such “debate” is as silly as arguing over whether or not we should breathe. Rather, I wish to talk about the ways (and possibly the “ifs”) of how public philosophy has failed us in these last 18 months of global pandemic.

A Gary Story



Sometimes, the good guys win. This was brought to mind by a recent story at OpenCulture that tells of Jocelyn Bell Burnell. She is the discoverer of pulsars, in case you didn’t know (which is likely.) Because her – and, of course, it is always HER – male advisor took the credit and was awarded the Nobel prize for it. You can read the original OC story HERE.

However, not all such stories are as infuriating (and even Bell Burnell’s is far from being the worst example from a seemingly endless list of women being denied earned credit.) For example, there is this one that I can attest to as a witness: I was there and I knew the people involved, and I saw it come to light in real time. So, this being my blog, it is my right and privilege to deviate from my normal focus on philosophical topics to tell a personal anecdote. This one is from about 40 years ago, back when I was well and thoroughly ensconced in the computer and high-tech industry as a professional technician. So permit me to tell you a Gary Story.

God’s Name



I see yet another story bemoaning the death of a willfully stupid fool who not only denied the reality of the COVID-19 virus (SARS-CoV-2) but also declared that he would place himself “in God’s hands”, both to avoid catching the disease, and then survive it once caught. Well, God appears to have been too busy trying to wipe a grease stain off those hands to attend to this gentleman’s demands, because said individual got sick, suffered, and died gasping for air while intubated.

I’m going to step outside my usual zone of operation and address a few words at those staggeringly vain individuals who imagine it is their unique privilege to tell God what to do. Now, as a proper Whiteheadian, I do believe that the term “God” has minimal reference and conceptual content. Not even remotely enough to form the kind of center of meaning that one might go to church to celebrate. For that, one must move well beyond Whitehead and into the process theology that owes its source to Charles Hartshorne. Much of this latter, though not all by any means, is also rooted in various interpretations of Christianity. And while even this is beyond the scope of my primary interests, I’m actually going to address my remarks to the vastly more conservative field that tends to identify as evangelical or fundamentalist. (While there can be overlaps, the two groups are NOT the same.i) I’ll spare you any fatuous declarations as to either the reality of these people’s God, the truth of their concept(s), or the validity of the Bible. In fact, I’ll be taking these things as given. Rather, what I want to show is that a certain class of behavior that they publicize as evidence of their devout faith is, by their own standards, a gross and indefensible sin. It is not hard to show.

The Infinite



I don’t anticipate any explicitly Whiteheadian considerations this time around, but all my thoughts are informed by my Whitehead scholarship, so you never know. What I want to talk about here is the idea of infinity. I say “idea,” rather than “concept,” because even within the relatively constrained bounds of formal mathematics infinity is not one thing. Outside the bounds of mathematics matters are significantly worse, since little or no effort is made to constrain such talk, or even render it potentially intelligible, with formally legitimate techniques.

Speaking of “outside the bounds,” the ancient Greek word for the infinite is “apeiron” (ἄπειρον), which translates as “unlimited” – the “a” being the negation (“un”) and “peiron” meaning limited or bounded. Clever as they were, the Greeks lacked our additional 2,300 years of mathematical study, so the idea that one could have something infinite (unbounded) – for example, the perimeter of a geometrical figure, a length exceeding any possible measure, in practice or in the ideal – that is nevertheless bounded by an easily measured finite figure (a circle, for example) would never have occurred to them.i But the figure above, the Koch snowflake, is precisely such a figure. (Details can be found HERE. As is my wont, I skip the technical details, which would take up more text than this blog post.)
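For the numerically inclined, the snowflake’s paradox can be checked in a few lines of Python. This is my own sketch of the standard construction, not anything from the linked details: each refinement replaces every segment with four segments a third as long, so the perimeter is multiplied by 4/3 at every step, while the enclosed area converges to a finite limit (8/5 of the starting triangle’s area).

```python
import math

def koch_perimeter(side: float, iterations: int) -> float:
    """Perimeter after the given number of refinements: 3*side*(4/3)**k."""
    return 3 * side * (4 / 3) ** iterations

def koch_area(side: float, iterations: int) -> float:
    """Enclosed area after the given number of refinements."""
    triangle = math.sqrt(3) / 4 * side**2
    area = triangle
    # Each step adds one small equilateral triangle on every existing segment.
    segments, seg_len = 3, side
    for _ in range(iterations):
        area += segments * math.sqrt(3) / 4 * (seg_len / 3) ** 2
        segments *= 4
        seg_len /= 3
    return area

# Perimeter grows without bound; area settles toward 8/5 of the triangle.
for k in (0, 5, 50):
    print(k, koch_perimeter(1, k), koch_area(1, k))
```

The perimeter after fifty refinements already runs to millions of unit lengths, while the whole figure never escapes a small bounding circle – unbounded length, finite extent.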

Computation, Complexity, and Why is The Rum Always Gone? (2)



Some tasks, processes, “computations,” are too difficult to do in any practical context. Some are so intrinsically hard that, even while they don’t seem especially difficult, God herself could not do them. The first is the problem of computational complexity, the other of computability/solvability. The former, complexity, emerged from the latter, computability, because the problem of computability was more obvious to mathematicians who’d never seen, much less actually used, a computer. But after Alan Turing presented his own abstract model of a computing “machine” (the “Turing Machine,” or TM) to prove the existence of unsolvable mathematical problems, the difference between what could be solved in theory (computability) and what could be solved in practice (complexity) came into view, and methods were developed to investigate the latter as well as the former. This is all by way of summary of, and pointing forward from, the previous post.

Mechanical Turing Machine

There are theoretical &/or partial workarounds, ways of tricking out the game, for both complexity and computability. For complexity, it is unclear whether the trick can be realized in practice. For computability, it is unclear whether the trick (which is only a partial trick, really) is even physically possible. Still, I’m going to talk a little about both – in the preceding order – and finish with some comments on how these theoretical considerations can be manifested in our considerations of what does and what does not constitute legitimate scientific inquiry, and a few comments closing the circle on analysis versus ontology.
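The unsolvability result mentioned above – Turing’s proof that no program can decide halting – rests on a diagonal trick that can be sketched in a few lines of Python. The function names (`halts`, `make_contrary`) are mine, purely for illustration; the point is the argument, not the code.

```python
# Diagonalization sketch: suppose someone hands us a total, correct
# function halts(f) reporting whether calling f() eventually returns.
# We can then build a program that defeats it.

def make_contrary(halts):
    """Construct a program on which the claimed halting-decider must fail."""
    def contrary():
        if halts(contrary):   # ask the oracle about this very program...
            while True:       # ...and if it says "halts," loop forever
                pass
        return None           # ...and if it says "loops," halt at once
    return contrary

# Whatever halts(contrary) answers, the answer is wrong. So no total,
# correct halts() can exist -- not for lack of cleverness, but provably.
# (Demonstration with a fake oracle that always answers "loops":)
contrary = make_contrary(lambda f: False)
print(contrary())
```

Note that running `contrary` against a fake oracle that answers “halts” would loop forever, which is exactly the contradiction the construction is designed to exhibit.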

Computation, Complexity, and Why is The Rum Always Gone? (1)

Were it ever the case that there was another person as peculiar as myself, who would study topics like Whitehead’s philosophy of process and theory of computation at the same time (over a period of decades), such a singular individual might speculate about the connection between the theory of computation and Whitehead’s process of emergent actual occasions. The latter bears some real analogies to a real, completed computation: the data (Whitehead actually uses that term) that combine via a process of integration into the holistic completion of an occasion/computation have a variety of structural similarities. This is made more interesting by the fact that Whitehead was writing long before theoretical concepts of computation emerged in anything like a developed form in Alan Turing’s work in the mid-to-late 1930s.

An example of the Nazi “Enigma” machine.

The analogy fails catastrophically, of course, after even a little examination. The theory of computation offers nothing in the way of insight into the continuum of possibility; it is hopelessly finite in every character; it does not even imagine a difference between analysis and ontology. Whitehead’s process philosophy transcends all of these distinctions. But – and this is key – that is because Whitehead looks at both analysis and ontology, and notes the distinction. The theory of computation only looks at analysis. Still, while it goes no further, as far as it does go is broadly applicable to any activity where analysis is involved. So that is what I want to talk about here. As always, I’ll avoid technical details; working through even a trivially simple computation in pure, “Turing Machine” (TM) form is an exercise in tedious details that would stress even the most detail-oriented individual to the breaking point. Books on theoretical computation, and computational complexity, are so readily available for the curious that I’ll not even trouble to make a list (which could, by itself, consume the 1500 words I otherwise try to limit myself to.) But neither will I say anything that I can’t cite multiple sources to justify.