It’s Yogi Berra time: déjà vu all over again. Even with an almost exclusively rational flow of information from vetted sources, especially the newsletter of Your Local Epidemiologist, “written by Dr. Katelyn Jetelina, MPH PhD—an epidemiologist, biostatistician, professor, researcher, wife, and mom of two little girls. During the day she has a research lab and teaches graduate-level courses, but at night she writes this newsletter. Her main goal is to ‘translate’ the ever-evolving public health science so that people will be well-equipped to make evidence-based decisions, rather than decisions based in fear,” I have twinges of the same anxiety I felt in the first fear-filled, largely uninformed days of the outbreak of Covid in earnest, when Trump was still President. I mention the latter condition not only to place my reference in time, but because he so conditioned the nature of the response, and the state of mind of virtually everyone (which is to say, everyone) whose life could potentially be affected adversely, at some undetermined time, by the emerging scourge and the pandemic it brought to the world in its advance waves of infection.
As Dr. Jetelina suggests, we are at the mercy of time as the first signs of the effects of Omicron emerge. The variant sprang seemingly out of nowhere, though it was first reported, as it continues to be most reliably, by the health professionals monitoring these things in South Africa. The preliminary, very cautious take on the very first, and largely inconclusive, results is that the best response may be to revert somewhat to the near hibernation mode of sequestration we entered (some of us a great deal more devotedly and aggressively than others) when lockdown began in the United States in the first days of March 2020. One thing already trends so powerfully that it is not likely to be moderated by more data: previous infection by any other strain of Covid does not protect those unfortunate enough to come down with, or more pertinently to be reinfected by, the Omicron strain.
More children than adults, being the most likely never to have been vaccinated, are being infected with Omicron (one likely, certainly possible, outcome of the evidence that reinfection will occur among adults, who then pass the virus unknowingly to their unprotected children). What is not at all clear, and what represents the currently sought gold standard of a statistical index on the highly transmissible and rapidly spreading new strain, is to what extent vaccination protects either against infection altogether or, at least, against highly threatening levels of infection in an individual. It has been clear for some time that all forms of vaccine, but especially the Pfizer, the Moderna, and the Johnson & Johnson, particularly with a booster, help to mitigate the effects of infection and keep the symptoms mild and manageable.
So what accompanies this renewed wave of a very familiar anxiety is a renewed hope that time will eventually reveal the efficacy of these vaccines. Secondarily, of course, there is the hope that work already begun in the laboratories of the developers and vendors of the vaccines will quickly produce a booster formulation that works against Omicron. More than ever before, there is a compelling need for far greater preventative measures on the part of the public, not just health professionals, and a far greater commitment to the measures advocated since before the start of the continuing pandemic (now about to enter its third year): the now familiar (whether followed or not) protocols of prophylaxis and social distancing, including, but not limited to, avoiding congested spaces, especially indoors, keeping a safe distance from others, going masked in the presence of others, and frequent hand washing, especially after contact with other individuals or with objects likely handled by them.
For those of us who have only gradually and very cautiously moderated the curtailment of such measures, and only under certain circumstances, a clamping down now seems inevitable if there is any hope of avoiding a devastating resurgence of infections and of the need for hospitalization for the most severely affected. It would not be the first surge, for sure, but it could be, at this stage of not knowing, one of the worst of the three or four major surges that have crippled the economies of a number of countries and their populations.
Suddenly, as I read and heard (on the air) the prevailing stories centering on the persistent anti-vax movement in the news, I had two insights. I demur from calling them epiphanies, but they feel a little like that.
For one, it is clear that there is bipartisan approval of the proven and sometimes seemingly miraculous monoclonal antibody treatment as a desired clinical strategy for mitigating the severity of a Covid infection should it occur. Even people among the preponderant majority of unvaccinated infected patients who still actively oppose the very idea of vaccination seek the treatment, if not demand it, should it be medically warranted.
Combine this with the alternative lunatic list of preventative/cure therapies, going back to the former president’s endorsement of an array of treatments, including anti-malarial medication and common household bleach, and culminating recently in the dangerous embrace of the equine version of a deworming medication, and you have a clear pattern of behavior that suggests the basis of a hypothesis. Philosophically, those among us susceptible to these notions are clearly better disposed to a serious belief in the efficacy of clinical methodologies, applied after contracting the disease with sufficient virulence as to require active medical intervention, despite an equally serious, if not stronger, belief in the riskiness of equally well-proven, scientifically valid preventatives in the form of vaccines.
Obviously, there are some among us (an alarmingly large number, actually) who are willing to swallow (literally) any strange, not to mention dangerous, shit, because we somehow have become disposed to believe in cures. But we persist (by “we,” I mean they) in having no faith whatsoever in the efficacy of prevention, which would circumvent the likelihood of ever requiring intervention, least of all medical.
It’s a strange qualified endorsement of science. The qualification is, it has definite limits.
The old adage is that an ounce of prevention is worth a pound of cure. Clearly, that has flipped, and in monetary terms at least (not to mention efficacy as measured scientifically), the order of magnitude has multiplied, possibly astronomically. Have you looked recently at the cost of two doses of the Pfizer vaccine (to name just one) versus the cost of a single course of monoclonal antibodies? Let me help refresh your recollection. As with the vaccine, the cost of the monoclonal antibody medication itself is absorbed by the government; any cost associated with the antibodies as a treatment is for the services of the medical facility administering the infusion, and the uninsured cost can run to the tens of thousands of dollars, as much as $20,000. The vaccine is administered for free (as a benchmark, though, the agency that oversees Medicare expenditures allows $40 for each administration of the vaccine: each time a shot is given, $40 is paid by Medicare to the provider). The vaccine itself has a book value of one cent. The recipient pays nothing.
Proof (and mathematics) means nothing. Philosophically, many of us are better disposed to believe in a cure than to believe in putting the odds in our favor against getting sick in the first place.
What might this mean? Even if I thought I knew or could figure it out, thinking out loud, I am sure there is not enough space here on this platform to explain it – my intuition tells me it would require a book, if not a series of peer reviewed studies. And then a book, to explain the studies.
So, that’s one insight. The other, related to the resistance to vaccination, raises questions about the nature of the philosophical foundation on which some of us build the rules of conduct of our lives…
Increasingly, it appears that any acceptable argument, legally, for refusing to comply with a vaccination mandate would be on the basis of religious belief.
What I learned today, from a news podcast focusing on the current status of this movement, is that, for one example, there is nothing, by preliminary general agreement, in Roman Catholic doctrine that constitutes a valid justification for resistance and refusal. However, an argument, presumably a legal one, is being formulated that, on an individual basis, an otherwise sincere believer in the Roman Catholic faith might, in an act of individual interpretation of some aspect of doctrine, refuse to be vaccinated. Indeed, to make things more complicated, and certainly to add to the complexity of sorting out any valid interpretation of behaviors that lead eventually (if they do) to legal confrontations, the Vatican has issued what quickly became a controversial endorsement of vaccination as a preventative measure. Catholics still have the right to make the case to their employers for opting out, that is, under the usual interpretation of Title VII of the Civil Rights Act of 1964.
If, in fact, as has been reported, there are no major religions that specifically interdict vaccination (though it is in dispute, at least in the arena of internet debate, whether Islam should be included), there are, I further learned, bespoke religious solutions emerging, including new religions, like the one started by a 20-something self-styled “pop star cult leader from space” named, she says, Unicole Unicorn. According to the cult leader, “your personal reality is reality,” and Unicult is a legitimate religion. Hence, if your personal reality within the context of this specific belief does not permit you to get vaccinated, you must be exempt.
The question comes up, even in the face of the stipulations of the civil rights of all citizens, because the Constitution, from the time it was first framed by the founders, deliberately avoids defining what constitutes a religion. Hence, the Constitution does not spell out what can be done about the “religious” claims of Unicult, because there is no definition of religion in the context of one’s freedom to believe as one chooses. Ms. Unicorn assures her listeners and adherents on TikTok that Unicult will provide the necessary documentation attesting to the religious exemption of any believer who refuses to be vaccinated.
The deliberate decision not to define religion stems from the prior decision to avoid any chance of any one religion becoming the sole defining authority on matters of belief, morality or religious practice. It also avoids setting up the courts as a platform for adjudicating what constitutes valid religious belief. In the legal realm framed by the Constitution, a religion cooked up yesterday by your neighbor has the same validity in law as some ancient established system of belief, like Judaism, or Islam, or Buddhism, etc.
Aside from the difficulty of getting the courts to adjudicate a matter, “what is a religion, officially?”, that is by definition not defined by the foundation of our legal system, there is the problem of whom to put in the position of making such judgments, especially in the case of laws that permit so-called religious exemptions. In many instances, as has happened already in the matter of vaccination, the decision is left up to the employer (which is why the mandate in President Biden’s order covers employers of more than 100 employees).
Now, what I am leading up to is the substance of my other insight – though it should really be expressed merely as the discovery of a conundrum regarding this matter. It’s a conundrum that has not only legal, but historical, ramifications.
While listening to the “On the Media” segment from its September 17 podcast, entitled “Why the Constitution Does Not Define Religion,” which is the basis of some of the foregoing commentary, I suddenly found myself thinking about another thorny issue, also largely an ethical problem, having to do with an individual’s right to act freely even in the face of unjust mandates: unjust on the basis of essential and fundamental questions of what does and does not accrue harm to others as a result of an individual’s actions. I found myself thrust back in my mind to the 60s, when we were in the throes, as a nation, of conflicts among various segments of the population concerning a war, a literal war, in which our national defense forces were engaged overseas against armed forces defending their own countries on their own soil. I am, of course, talking about the Vietnam War (or, if you prefer, as they refer to it historically in Vietnam, the American War).
The question in the 60s was about adherence to the mandates of one’s obligations as a male citizen over the age of 18 in submitting oneself to the requirements of Selective Service. That is, should one, if so ordered, enter the armed forces as a conscript, allow oneself to be trained for combat, and then serve actively as a participant in live action that could culminate in death or injury to a human being, including oneself?
The only out, save for legitimate medical exemptions that would constitute unfitness for active duty, was what was commonly known and referred to as conscientious objection.
For the sake of this particular discussion, let us consider what conscientious objection constitutes insofar as it may be understood as a religious basis for exemption. If one practiced a religion that interdicted doing harm to others, and could prove such practice under what is now understood as a condition of “sincerity,” one could claim a conscientious objection to participating in any combat role (and, indeed, it is arguable that all Judeo-Christian doctrine has a basis in the rule that one should not do to others what one would prefer others not do to oneself).
Now, the reason I bring all this up is because, as I recall, any adjudication as to not only the sincerity of one’s stated religious (or moral or ethical) belief and adherence, but also the application of such belief as being in conflict with the requirements of active service in a combat role, was performed by a local board of civilian volunteers. In the case of a conscript claiming a conscientious objection to serving in combat, or, in the extreme, to serving in the military in any capacity, he had to appear in person before the board and make his case convincingly that his moral, ethical, or religious beliefs were not only sincere, but strongly felt enough to constitute grounds for reclassification (if otherwise found fit for duty).
Let’s just say, it was devilish hard to “prove” a legitimate conscientious objection that was not on political grounds. There were not many. There may have been a greater number who ended up “migrating” to another country, usually Canada. Some ended up in prison, presumably for an insufficiency of sincerity.
Again, without getting into the vast and possibly limitless bounds of salient matters for discussion, whether the issue is getting vaccinated or being issued a weapon and live ammunition and having the direction of the enemy pointed out, and especially if one understands that all salient pathways inevitably lead to differences in political viewpoint, I have only something simple (or seemingly simple) to ask.
The difference in these matters seems obvious to me. In the case of vaccination, compliance with the law, as defined by the mandate, represents a relatively small risk to oneself (measured by scientifically valid statistical methods) while ensuring increased safeguards for the entire populace. In the case of serving in combat, non-compliance with the laws of conscription represents a significant risk to oneself, in terms of punishment and redress as meted out by agents of the state, while ensuring that the individual will do no harm whatsoever to anyone else by not participating in acts of violence.
Yet, in the case of the latter, we were content through the course of most of the Vietnam War to leave decisions up to civilians not otherwise prepared, never mind trained or in regular practice, to make the required judgments about profound and difficult ethical reservations, within a context in which, at best, an ambiguous and uncertain good derived from forcing adherence despite those reservations.
In the case of the former, it is clear, even as the plague rages on, and people continue to be infected, to require hospitalization, and to die from Covid at rates in some parts of the country almost as high as they have ever been in 18 months of pandemic, with no realistic end in sight, that we will leave the questions, finally, of the ethical obligations of the objectors to the courts, which might ultimately (and who knows when?) relinquish any claim to definition or jurisdiction. And we will be left, once again, with a judicially held point of view, as obvious as so many other such observations on other aspects of the conduct of our lives in this country, that any hope of resolution is in the hands of the legislature. We should all live so long.
What I wonder is, have we progressed since the 1960s? Or should what has changed be called something else? Or, and this is a safe bet, has nothing changed really?
The gymnasium at Montgomery County Community College set up as a vaccination center, winter 2021.
“[Yesterday, President] Biden announced that by April 19, more than 90% of Americans over the age of 16 will be eligible for a vaccine and will live within five miles of a vaccination site, including 40,000 pharmacies.” —Heather Cox Richardson, Letters from an American, 30 March 2021
This is great, but it goes no distance at all in explaining or justifying why, as of yesterday, the only sure way to get an appointment for a vaccination shot (no choice as to the location of the vaccination site, one of three major ones, and no choice as to date, never mind the type of vaccine that would be administered) was to be sure you were registered some time in January or February—and, as I write, in two days it will be, indeed, April. I am talking about those of us, and there are two of us in our household, who qualify by age or other ineluctable factors for being in the first wave of vaccination registration in our county. Pennsylvania, where that county is located, elected to do it this way, regardless of the county administration’s ability to handle the logistics, and regardless of the population size and distribution. It took the state two full months after vaccinations began in earnest to allow distribution by third-party providers equipped and qualified to administer vaccines in a safe and medically approved manner (like pharmacies and hospitals and medical centers). Distribution was slow and unreliable regardless, and it took the state a full six to eight weeks to ramp up.
I signed up, that is, I registered to be on a list for an appointment in a county-sanctioned vaccination center, on the last day of the deadline period for the initial round of vaccinations to be given. I received an acknowledgment of my registration immediately, and this message also was the opening salvo of weekly “updates” from the county of how the process was going. The messages were lengthy and detailed, but the bottom line report was, “slow and steady.” I received the information and identification materials I would need to appear for my first shot on a specified date, which turned out to be some seven weeks after the registration date deadline I had met. They did tell us it would take as many as ten weeks for this process, so I guess I am not supposed to complain about efficiency, never mind other indices of governmental administrative competence and performance.
I was told where to appear on that particular date, and I was given a choice of a time of day, with remaining slots for five minute intervals of each hour indicated.
The original two sites selected by the county, since expanded to three, are located in county-owned or county-run venues (the original two are in a county administrative building in the county seat, a town about 15 miles from my home, and on the county community college campus, about the same distance away). Albeit true that I live on the fringe of the county, immediately adjacent (our house is some 1,200 feet or so from the county line) to the most densely populated county in the state, that of Philadelphia, the largest city in the state and the fifth largest metropolitan area in the country, we are clearly not in the heart of things county.
I’ll interject here that Philadelphia has a minority population of just shy of 40% out of two and a half million residents, most of them African-American, along with somewhat smaller minorities of other people of color. You couldn’t tell that by walking the streets of our small town, literally yards away from West Philadelphia, a neighborhood which is even more densely populated by minority residents. We appear to be, because we are, a predominantly white middle-class suburb, which happens to be in a small set of zip codes in this county that themselves happen to be in the top decile of the wealthiest zip codes in the entire nation.
We see a lot of black faces, but that’s because most of the jobs that predominate in the public service markets (retail, fast food, groceries and beverages, etc.) are filled by people of color. We are close to the division between city and suburb, and we offer easier access to most of the kinds of businesses the public frequents, including eating establishments, than one would find in the city (a condition counterintuitive to what you’d expect, given the differences in density of private homes per square mile, especially in our suburb and those immediately adjacent, which are, except for political demarcations by precinct and ward and other jurisdictional determinants, identical to the adjoining city neighborhoods). The ease of access and the density of choices for where to get a prescription filled, or an order of hamburger and fries, or fried chicken, or a cheesesteak (the staple of the local culinary vernacular) is a function of the predominant mode of transportation for accomplishing any task more complicated than attending your immediate neighbor’s backyard cocktail party: the automobile.
So, day and night, our shops and restaurants are patronized by the residents of the city’s neighborhoods in far greater numbers than by actual residents of our own. The African-American, Latinx, and other minority populations of color find these common amenities and categories of purveyors of merchandise and refreshment nowhere closer than within the borders of this county, Montgomery County, and the one immediately adjacent, Delaware County. Delaware County, though, is more middle class and lower, and extends south to the state border with (as you might guess) the President’s home state.
What all this has to do with vaccinations is this. The sites chosen for vaccination administration centers are deep in the interior of Montgomery County, indeed are actually closer to the opposite boundaries of the county to the northeast and northwest than we are (given our proximity to the City of Philadelphia). Yet the members of our household given leave and registered to be vaccinated had to travel, because of the convoluted geographic routes to those venues, the better part of an hour to get to a point only 15 miles away by car. We are actually closer to sites, including medical care annex sites of the major medical center in downtown Philadelphia where we are both treated (these annexes and the medical center itself are between four and six miles away, on local streets and thoroughfares), where the vaccines are administered. However, because we are neither residents nor do we work in either Delaware or Philadelphia Counties, we are constrained by the regulations stipulated by the state government from having the vaccine administered to us in these nearby locations.
Such constraints are irrespective of supply and efficiency of administration in any one of these counties. It just happens that Montgomery County, which has perpetually been rated as having a “very high risk” of Covid infection by the New York Times’ monitoring of all counties in the country, apparently also has one of the worst records for rate of vaccination of county residents.
In turn, I’ll ask rhetorically, what does this have to do with the minority population density of Philadelphia and adjacent counties? I’ll answer simply by observing (and admittedly these are purely personal and anecdotal observations) that both I and the other member of the household who qualified for vaccination, and who finally received, at least so far, the first of two shots of the Pfizer/BioNTech vaccine, assessed the unrelenting crowd of fellow citizens being processed and vaccinated in two different vaccination centers as far and away predominantly white and (somehow a salient if merely collateral observation), as a whole, individually significantly overweight.
Greek icon of The Second Coming, ca. 1700, public domain
As will surprise no one who knows me, I am firmly of the camp that says when all is, to tap one of the larger clichés, said and done, all of what we spend so much time contemplating, analyzing, and sending one another alerts to heed (replete with links to videos, audios, articles, tweets, re-tweets, comments, and every permutation of the mechanisms afforded by technology to transmit and preserve utterances of the moment) is about language. It’s about what differentiates us from the other apes, and most of the other mammals: verbal communication (I said most because there are other vertebrates, at least, who do vocalize, and about whom we are discovering that there are underlying structures, with rules and, well, essentially, phonemes, which cannot necessarily and strictly be called verbal but are certainly the cognitive equivalent: we have a larynx; other creatures have other physiological structures to emit sounds).
So, it’s not so surprising, after all the tallying of the various categories and intensities of false utterances by our president, not to mention those of the cadre supporting him, sometimes with more lies, sometimes with ingenious if tortuously convoluted assemblages of words that don’t exactly (just shy of forensically) constitute more mendacity, but which work just as well: a plausible but unprovable construction of words in a seemingly comprehensible assemblage that serves to settle the senses. Or it so confounds the senses (at their most corrosive, they confound all the senses at once… what’s that smell?) as to demand the respite of a self-imposed mental abandonment, as in “fuggedaboudit,” because it’s too painful to try to deconstruct into reason. We are left with pinning a sentence on the chief perpetrator of the obliteration of all well-being into a state of chaos and woe with no more evidence than his own words. And with spinning tales, in all genres of formal and informal rhetoric: essays, documentaries, texts (and equilibrating and neutralizing counter-texts), and doubtless what will be a long, possibly unending stream of creative formulations: fictions, certainly, but inevitably metafictions and speculative fictions, and the whole spate of formal ironic counterpoint, satires and parodies, not the truth, but not really ever untrue.
However, for now, until the current major engines of substantive content: the movies and series and mini-series, the blockbusters, and streams, and likely even TikToks, not to mention the book-length treatments, the one-offs, the tell-alls, the multi-volume compendious and comprehensive authoritative scholarly accounts, with all the apparatus providing the mass and weight of relentlessly factual gravity, for indisputable credence, begin to grind out, as a sub-industry in and of itself, we must content ourselves with the mainly moralizing, alternatively finger-pointing or hand-wringing, “who-could-have-known” and “didn’t-I-tell-you-so” opinion mongering from the hordes of usual suspects, and the inexhaustible supply of others who, absent a platform, simply construct their own – with instant credibility, because what is a network and connectivity for, but self-anointment?
What inspires this, my own not so extraordinary meditation on the power and the meaning of words and, by extension, without stretching the pertinence, on the meaning of language constructed of verbal forms, is a piece by a duo of senior New York Times reporters, a not so extraordinary assessment published on the eve of Trump’s departure from the center ring of the political circus that has been his tenure in office. Two things strike me about this account, which comes from a necessarily salient voice: who other than Maggie Haberman has served, sometimes precariously, as an avatar of the phenomenon that occurred in full view, and yet in strict terms of mindful probity as it happened, namely the transformation of what had been undeniably the closest thing to a reliable source of truth (“source,” from the French source, a spring, spurting unimpeded, pure, uncontaminated, always refreshing and, if need be, restorative), that is, in English, that is, in the United States, The New York Times? Truth, like the Times, in ways that will require clever analysis indeed to disentangle (somewhat like a re-enactment of the discovery of DNA, though not of the code of life itself, but of something as inchoate, it turns out, as the code of unassailable truth, if sometimes requiring a correction or retraction or reconstruction), has become the neglected step-child, ragged and dirty, unkempt and maybe even sniveling a bit, of belief: the beast that, it was thought, had been tamed, but was the veritable rough beast slouching toward Bethlehem.
Aside: You can’t talk about this stuff without an allusion to the poem “The Second Coming,” named for the imminence that has been the core of the doom anticipated throughout the 20th century, and that prefigured the further spiritual decay of this present time. And what do we fear most but that, for all our sense that it can’t get any worse than this, it can: the state of mind that prevailed through the course of the major wars that have dominated our global history since September 1, 1939 (another reverberative poetic touchstone) and continue to do so, not to mention the concomitant and co-extensive reign of terror, which has proven to be a weird melange of sharp shocks arriving without warning, leaving ever greater masses of rubble and toxic clouds in the wake of explosive events, mixed with the prevailing atmosphere of doom encased in the ruling rhetorics of state policy (buttressed with stockpiles of the apparatus of true universal annihilation).
But I was saying… two things, almost unnoticeable, surely innocuous, as are most banal verbal markers – surely meant to be no more than declarative, and possibly at least orienting, if not definitive. First, the NYTimes calls this article not reporting, and not opinion, but a “political memo.” A “memo”? I know who it’s from. To whom is it addressed, though? For whom is it meant?
Second, buried in there is the very briefest phrase, applicable to the man himself, “functional self-delusion,” which I suddenly (even in the moment; nothing stealthy going on here) understood to be part of some new taxonomy about the behavior of paper tyrants (like Donald J. Trump), the kind that he invented, the first of its sort, seemingly familiar, but really never seen before, because of one fact (if it must be reducible to that, the form we Americans have come to prefer for our truths to go down, especially in the absence of sugar in the spoon), and that is, this tyrant had his finger amazingly, and unbelievably for four anxiety-dominated years, on the nuclear button.
So, I have to ask, what is “functional self-delusion?” Whose self-delusion? What, if there was something functioning, exactly was functioning? If this was “functional,” what’s dysfunction like?
And do we really want to know? Don’t waste any words telling me.
I just read the Jill Lepore essay in the January 18 issue of The New Yorker, “What’s Wrong with the Way We Work.” It’s yet another semi-sweeping assessment of the sort I get to see and choose to read periodically about what has decayed about the relationship of Americans – ordinary Americans, the 99% – to work. The conclusion I reach is always the same.
We are doomed. Increasingly, moment to moment, day by day, and it’s been a fate imposed for some time now, at least a half century. It would appear from the way Ms. Lepore has structured the factual underpinnings of her thesis that 1970 or so is a watershed in the tipping of a delicate balance between the rewards to management and owners of business off the sweat of their workers – whose wages, and mixed ragged assortment of benefits, they paid – and the just compensation that the workers received in this transaction, which permitted them to feel they were supporting themselves, not just by way of scant and necessary sustenance, but with sufficient surplus to form a basis for feeling they were thriving, or at least leading productive and satisfying lives. I’ve avoided the word “meaningful” for the reasons Lepore examines in her history of the radical deconstruction of the relationship of work to the sense of the quality of life enjoyed by the people – that is, the preponderance of the working population – who do the actual work. As here:
“Meaningful work” is an expression that had barely appeared in the English language before the early nineteen-seventies, as McCallum observes. “Once upon a time, it was assumed, to put it bluntly, that work sucked,” Sarah Jaffe writes in “Work Won’t Love You Back: How Devotion to Our Jobs Keeps Us Exploited, Exhausted, and Alone” (Bold Type). That started to change in the nineteen-seventies, both McCallum and Jaffe argue, when, in their telling, managers began informing workers that they should expect to discover life’s purpose in work. “With dollar-compensation no longer the overwhelmingly most important factor in job motivation,” the chairman of the New York Stock Exchange wrote, “management must develop a better understanding of the more elusive, less tangible factors that add up to ‘job satisfaction.’ ” After a while, everyone was supposed to love work.
That is, there was a shift in the basis of perception, and the value of work was transmuted into an assessment of how meaningful the work was to the person who performed it, with the suggestion that such a value transcended and superseded the actual emolument in material forms, such as wages and benefits to which a dollar value could be attached.
In other words, we are doomed because somehow a great grift was performed whereby the American worker was not merely in some blunt, if not brutish, way traduced, but subtly and slowly, to most people imperceptibly in real time, induced to accept – not to believe necessarily, but to accept as an ineluctable quality of the nature of work in the larger fabric of their day-to-day existence – an abstraction, hardly provable, and always elusive, dependent as it was on a too-often fleeting and evanescent sense of their internal state of well-being, as a substitute for the hard material reality of adequate compensation in the form of sufficient coin of the realm to meet their needs for subsistence, plus something else, also usually in the form of abstractions, that allowed them to feel that life is “worth living.”
We are doomed now, because we have systematically, if obliviously (which is a polite way of saying willfully unheeding of what is as plain as the starkest quotidian realities, like whether the sun is shining, or the color of the sky overhead during daylight hours) – and probably not for the sake of plausible deniability, because there clearly are no penalties – tolerated the omissions, transgressions, and impositions put in place, each another brick in the wall, a small brick, always, but many of them, relentlessly and unceasingly laid, which resulted in a barrier to the kind of former life enjoyed by workers, who had secure jobs, with regular and predictable hours, and whose wages were not some egregiously, monstrously tiny fraction of the compensation of their bosses. One of the more repugnant testimonies, provided involuntarily, is a quote by Lepore of the CEO of Dunkin’ Donuts, whose compensation was doubled to over 10 million dollars a year, yet who called the proposed rise in the minimum wage for salaried and hourly workers in the organization to $15 an hour “outrageous” (easy for him to say, computing as it does, absent any other benefits, to an annual wage of just over $31,000 – that is, 3/10 of one percent of his income for that same year).
These would seem to be inequities that will be hard, even over a long period of time, to bridge to a condition that approaches what reasonable human beings – who might still posit some faith in the economics of capitalism in a true democracy – would call egalitarian. Not without punitive sanctioned measures (doubtless insupportable by the current crop of legislators, who would have to craft the political and legal and economic apparatus necessary to effect such a change, even incrementally) to bring down the highest allowable income level of American executives (in the way certain other Western democracies, especially the Scandinavian countries, have instituted), even while raising minimum salaries and other necessary paid benefits, like sick leave, universal health insurance, parental leave, and job stability (though I’m not sure what this would mean in a way that is conceivable in an economy now largely based on service-related jobs, within a management apparatus designed to provide predictable just-in-time efficiencies while also optimizing the level of profit to be derived – that is, in a labor market that has been gutted of any structure that supports the needs of the workers, except in the form of what we now glibly, if not merely unthinkingly – see notes on “willfully unheeding” above – refer to as a “gig economy”). It has long seemed to me that the more apt term would be a gag economy. In every sense: it’s a joke of universal proportions, and it’s designed to keep workers in a state of perpetually feeling like they’re just short of being choked to death.
It’s close to, but not quite, the time to press the start-button of the internal clock we’re all blessed with. It’s a memory timer, a special one. It doesn’t make it harder or easier to remember things. It’s a device that measures the time it takes for us to lose our sensory experience at the moment of an occurrence – be it a thought, a visceral reaction, or a conditioned response to some action or turn of events. I mean the time it takes for it to be more and more difficult to recall, never mind feel with the same spontaneous immediacy, just how bad it was.
There’s a general wisdom afoot, a not surprising one given the hegemony of domestic media in attracting and shaping our attention, that this is a particularly American phenomenon. It might not be, but merely the result of our great self-absorption… that, and the media having learned we aren’t very much interested in news about the rest of the world, real news.
The de facto result, if it’s not actually verifiable – or, as John Oliver likes to say, “objectively” true – is that, for us in the U.S., what else matters? It’s as if Europeans, say, for whatever reason, or the Koreans, or the Armenians, are much better at keeping alive, in all their sharp intensity, their affective spikiness, the outrages visited upon them. But especially at remembering the perpetrators and the depths of their perfidy and cruelty.
Even in very recent history, the effects of this peculiar kind of mnemonic anesthesia become manifest – more precisely, become touchstones of how effectively and swiftly the anodyne process takes hold. There’s Nixon, of course, and the course of his historical reconstruction: how, merely 20 years after he left office in disgrace, and within a whisker of becoming our first president to be criminally indicted, he would be eulogized by another president, Bill Clinton. In 1994, at Nixon’s funeral, Clinton said, and this was only halfway through the largely laudatory remarks, “He gave of himself with intelligence and energy and devotion to duty, and his entire country owes him a debt of gratitude for that service.”
Only six years after that, we voted into office the man whose tenure and whose conduct as the chief executive did, indeed, seem to eclipse the level of Nixonian transgression – in tenor, in inhumanity, in criminality. Then, we were barely more than a single administration away from seeing the back of George W. Bush, to the collective relief of a great many people, including not a few of those naturally disposed to look favorably on his politics and his policies before a new standard emerged. Trump was barely in office when what we saw with a rapidly diminishing view through the rear view mirror began to look like a poignant recollection of a better time in the context of what was suddenly a monstrous and – a new quality – inescapable present; for the first time, an omnipresent and pervasive presidency.
And once again, our standards for imagining the bottom of what had seemed, in earlier, now nostalgic times – almost with that romantic quality of the long ago, that time we reserved for a sense of yesteryear, a fairy-tale quality never to be recaptured – were transformed. There was that jocular meme, “Miss me yet?” – only one of the many artifacts that seemed to sprout spontaneously, like plants in a desert that hadn’t seen rain in a century. Suddenly (it seemed sudden anyway), George W. Bush wasn’t so bad after all. How many of us have heard that, and how many times? And all it took was eight years.
Detail from Hieronymus Bosch, “The Garden of Earthly Delights,” at the Museo del Prado
The germ of this thought comes from listening to an interview on Fresh Air yesterday, Thursday, November 12, recorded the day before, which would have been Wednesday, a week and a day since the still officially unresolved election, and at least two news cycles previous. I point it out using this commonplace gauge of cultural progress because it is also still current (or why would Terry risk the embarrassment of being out of touch?). To wit, I notice in both the New York Times and the Washington Post that President Trump – his aides are alleged to say – has no plan; he is merely getting himself however he can from news cycle to news cycle.
White House Memo
Trump Floats Improbable Survival Scenarios as He Ponders His Future
There is no grand strategy. President Trump is simply trying to survive from one news cycle to the next.
The thought flits through my head that, maybe, he has at long last legitimately found his own bit of revelation and, as an endgame, turned to religion and a faith in miracles.
But nah. I can’t help but grab the seat of my pants and what’s left to palpate of my shrinking gluteal mass, and deduce from the condition of my hind parts that it’s the same old shit, just a different day. But it’s the implications of the ghoulish contemplations and deliberations on the possible, the probable, the unthinkable, and the preposterous that nag at me. It’s like a constant frigid flow of air from the left, a polar express of glacial horror originating from somewhere “between the pit of man’s fears and the summit of his knowledge.” Yet it keeps nagging at me that I should just give in, and allow the temperature in my core to keep dropping, to the zone of absolutely no hope. It’s tempting, but I resist.
On Fresh Air, Terry’s guest was a dude touting what is now considered durable, if not estimable, cred. His name is Garrett Graff, and he is the very model of the cyber-age journalist: former editor of Politico, a contributor to Wired, and the author of at least three books – one on Robert Mueller’s tenure as head of the FBI, a history of the bunkers built in secret to protect government leaders in case of nuclear attack, and an oral history of September 11 (which I am only guessing does not include President Trump’s notorious lies about witnessing people, who he averred were Muslims, dancing on rooftops and cheering from across the Hudson in Hoboken as the twin towers burned and finally tumbled).
The topic of their conversation is entitled, on the Fresh Air home page, “Journalist Details ‘Potential Mischief’ of Trump’s Remaining Weeks in Office.” It consisted, in my hearing of it, of admittedly speculative projections of the potential outcomes of the various “moves” and actions taken by the president in the past few days, and weeks, and even going back months – with the unstated implication that every measure, every step, every vindictive or mean-spirited or sheer lunatic act was performed with forethought, and, conceivably… not saying it’s so, but this is how autocrats, authoritarians, totalitarians, dictators do things, have done things…
And I realized, not a new thought for me, but a refreshed set of impressions, that this is how a certain quarter among the news media has been reporting and commentating on the Trump presidency all along. To me, it constitutes a really unsettling superset of the stuff of dread-scrolling. For now I call it Paranoia Porn.
It amounts to imagining the worst outcomes of a regime that resists owning the qualities ascribed to it, beyond the malevolence and hatefulness embedded in the spirit of its worst aimless deconstruction of certain entities and systems necessary to the conduct of governance in the United States. These stories and conversations, these interviews and analyses – the stuff of a whole industry of media content engineering and manufacturing that has kept itself going, and not just going but thriving, with the major companies, like The Times, reporting record levels of revenue and profits – are, in my view, the final throes of examining minutely what Trump has been doing, which is then fantasized about by the far-left media in the way of speculative horror scenarios based on incredibly complex conspiracies involving setting up a shadow government in the hollowed-out shell of the existing legitimate infrastructure, which has performed the business of government for the entire history of the republic.
In fact, as far as I can tell – and as far as anyone, from the lowliest whistle-blower to Carl Bernstein, from Mattis to Bolton, from Comey to Scaramucci, has been willing to make public, frankly and truthfully (by their own recognizance), having reported on every conceivable twist and turn, every u-turn and wrong turn, every impulse and miscue – there is only evidence of one large truth. Trump has proven repeatedly and consistently the incompetence and shallowness and shortsightedness of nearly every one of his more far-reaching initiatives; and in four years, and continuing into this period of interregnum, when his aides tell the media that he has no endgame intended as a culmination of his current chaotically disruptive machinations, he has never betrayed the possession of anything resembling a strategy or plan.
Approximate Reading Time: 5 minutes
Of course he won’t go quietly
photo by Albert H. Teich/effects added by Howard Dinin
I’m thinking as we all, in some corner of our consciousness, fidget and distract ourselves awaiting an outcome, and suffer the condition of Tiresias in The Waste Land, not so much throbbing between two lives, as vibrating between what I’ll call two civic states of being. Is it the end of the beginning, or the beginning of the end?
The more the suddenly positivist liberal media, and especially the commentariat – that overpopulated sub-state of what it fancies itself to be, part of the fourth estate – merely anticipate what they seem to think is a foregone outcome, the more I feel the hum of true uncertainty. Joy is in the air, and after a long term – it’s closing in on four years, after all – of pissing and moaning and talking about the inconceivably further decaying state of civilization, all embodied in one clinically obese semi-failed real estate developer with a knack for expropriating the attention of every person, including anyone not immediately in his presence. Optimism, can you believe, from the baleful doomsayers. This despite being bitten in their hindquarters innumerable times by a perversely indifferent set of facts, in this case numbers of votes to be counted. And yet, and yet… that delirious outcome of which we are on the brink – suddenly we’re a happy few, a band of brothers, whereas yesterday they were all too ready to tell us what’s wrong with us – is an outcome still soberly measurable against some calculable margin of statistical deviance… and not yet generally concluded.
And by the inherent permission accorded by an assumed happy and propitious resolution (however tiny, and therefore ambiguous, the margin), the collective wonder turns to a focus on how the incumbent, presumably, and in what I’ll cling to calling a presumptive way, is expected to make his exit. He has promised, even well before – weeks before – the polls were scheduled to open, and briefly he gave us pause to think that he could even alter the implacability of that received fact, the immutability of Election Day, as defined in the Constitution – call it off, delay it, schedule it for next year, or for the release date of the vaccine. Can he do that? He seems to think he can do anything. He can’t do that! Well, of course not… but isn’t it pretty to think so, with echoes of his innate impotence in virtually all matters in which, in fantasy, in his wishes, he wields power impervious to the most refractory resistance – that he will contest whatever there is to contest, having established, at least for his own nefarious rhetorical purposes, that not only was there a fraud of historic proportions afoot, but that it had already started, weeks ago remember, and all ballots save those cast, defiantly in the face of a raging monstrously contagious viral epidemic, by voters in person with proper identification, were bogus and void. Not just suspect and uncountable. Strip away the franchise that was born with the Republic, and never abrogated or delayed, not once in our history – except temporarily in the 11 renegade southern states; and the Union would magnanimously and unquestioningly have granted them continued voter status, if they would just, at the same time, put their muskets and rifles down, and let those people go…
He would not even answer the question about whether he would comply with the protocol of an orderly and non-disruptive transition of administrations as a new one took power from his – his non-responsiveness not to be interpreted as the globally accepted legal maxim, ‘qui tacet consentire videtur’ – silence implies consent – but really more in keeping with the rules of the game of stud poker, as he chooses, in anticipation, to keep his hole card face down for as long as possible. And of course, there were those of us who have expected the worst from him, even without provocation, because we had taken the measure of his character, without pausing to analyze the sum of his life of grifts – not only weighing the comical grandiosity of the rewards when they succeeded, but also assessing the abject ignominy of the intentionally circumspect, if not downright concealed, and ultimately uncountable, failures, including also the repeated acts of salacious indulgence that were the chief excrescence of his innate, his almost genetically determined, vulgarity. And those of us who did fully expected he would make his longed-for extrication from the seat of power ugly – really ugly and gut-wrenching – and difficult (Herculean), and, if possible, violent, in a series of final acts of his particular style of scorched-earth deconstruction of the social and civic order, gilded over, like a chandelier of base metal left hanging among the ruins by a single strand of tarnished wire.
And so, it may surprise you to hear me agree, of course he will make it as bad as he can, not because he is vindictive and vengeful, though he is, not because he is a pugnacious bully, though he is, but because that is his nature. To be loud and attention-seeking, and monotonically in the mode of self-aggrandizement. In short, it’s the manner in which he does everything. It is the template for the caricature of himself he presents to a credulous world, hungry for the cheap-seats version – two-button sharkskin suit and all, with the fake hair, and the fake skin, and the multiple layers of gold in the form of ostentatious artifacts, the gaudier the better, to be worn on one’s person – of what passes, in the age of the infinite loop of streaming content, for a hero.
He was loud and attention-getting as a mere over-publicized realty shark – over the limit, measured by the tacit codes of socially accepted behavior (this was years before the concept of Real Housewives was ever imagined as a germ of an idea), in lubricious demeanor and affect, as phony as his very expensive dental work – whose closest manifestation as front-page content was the barely proximate permanent slot reserved for him on Page Six of the tabloids, like the best table at some parody of an ostentatiously “glamorous” venue. He was loud and attention-getting through the 70s, when he forced himself on a jaded media as the latest personality to pay attention to, and on through the 80s and 90s, as his notoriety – always positioned as fame by his own exertions at spin – spread all over New York, like melting oleomargarine on toasted Wonder Bread, and oozed occasionally into the notice of the national downmarket tabloids.
It was the mode of his announcement – I’ll remind you: loud and attention-getting – with generous dollops of hyperbole and outrageous character assassination on a global scale, and perpetrated with the corrosive weapons of glittering, wholly mendacious stereotypes as he ascended that famous golden escalator with a hired mob of cheering sycophants.
Mark Twain, 1909. Photo by A.F. Bradley in his studio. Public domain, via Wikimedia Commons.
Mark Twain wrote the following piece the year before the tumultuous and critical presidential election year of 1880, only 15 years after the Civil War had ended. The incumbent, Rutherford B. Hayes, had run in 1876 with the promise that he would not seek re-election – a promise he kept… astonishingly, to our modern sensibilities. As a result, the election was highly contested.
Recall, for perspective, that the country had already weathered the initial vagaries of Reconstruction, the various eruptions of corruption that marred the chances for a more peaceful process of reconciliation between the north and south, or for the assimilation of African-Americans, now fully established as citizens with rights (albeit rights whose nature and extent continued to be contested). It had weathered the chaotic and tumultuous administration of Andrew Johnson, the martyred Lincoln’s successor and a great exponent of exploiting his office for purposes of politically biased exercise of power. It had weathered the previously unrivaled level of corruption revealed in the administration of President Grant, sullying the reputations of all but the General himself.
The election was precedent-setting for several reasons. Unlike today, there was, in practical terms, virtually total engagement of the electorate. More people voted, as a percentage of the whole population, in the 1880 election than had ever occurred previously in the United States. The vote could hardly have been more evenly split. The winner, James Garfield (who ran with Chester A. Arthur as Vice President, later to succeed him to the highest office), garnered a majority of the popular vote over his rival, Winfield Scott Hancock, the Democratic Party candidate. The vote was split by a difference, in the final tally, of less than 2,000 votes nationally. But in electoral terms, although each candidate won an equal number of states (19 each), Garfield’s electoral votes were entirely from the more densely populated, urbanized and industrialized north, including Oregon in the enclave of Pacific and Mountain states that existed in a kind of civic isolation from the rest of the country, separated by what were then still the territories (and therefore non-voting) of New Mexico, Arizona, Utah, Idaho, and Wyoming. This band of not-yet-enfranchised territory included the contiguous Dakota territory, not yet divided, and that of Montana. Importantly, the Democrat Hancock’s victory in the entirety of what had been the formerly secessionist southern states, plus Texas, Delaware, Maryland, and New Jersey, set the precedent, which persisted for decades, of a solidly Democratic south. Until the first third of the ensuing century, the liberal banner was carried by the Republican Party – the classic notion of the “party of Lincoln” as the nucleus of progressive ideas, a notion now obviously defunct.
Curiously, and consistent with the bizarre unpredictability of the American electorate, the one state Hancock did not manage to carry, and whose allotment of electoral votes would nearly have reversed the outcome (as opposed to ensuring the landslide that was Garfield’s), was Pennsylvania… still a contested state and, today, a potential game changer if President Trump does not manage to retain his advantage there in 2020. For perspective, if Hancock had won Pennsylvania, he would have lost the Presidency by a single electoral vote.
In any event, whatever the actual political reality and the culture that inspired Twain to write this piece as he did, he does seem to have captured, as he did so often, what it turns out is an enduring, perhaps, in a sense, a genetic, characteristic of the peculiar and continuously unpredictable condition of what the electorate will find not just tolerable, but acceptable about its would-be representatives.
The “moral crimes” of Twain’s imaginary contestant for the office, qualified to run sufficiently by his own lights (the only ones that count, as apparently has long been the case in our country, if not from the beginning) despite his peccadilloes, may seem mild by comparison to what passes for business as usual in Washington or what is considered a candidate’s “private business” and of no bearing in fitness for office. But those were gentler times, and we and the politicians, have had just over 140 years since then to invent far more ingenious ways of interpolating tolerance for depravity into our perception of normal behavior, and the same amount of time to have our sense of outrage ground down, possibly to only a trace presence in our consciences.
“An Open Letter to My Countrymen”
I have pretty much made up my mind to run for President. What the country wants is a candidate who cannot be injured by investigation of his past history so that the enemies of the party will be unable to rake up anything against him that nobody ever heard of before. If you know the worst about a candidate to begin with, every attempt to spring things on him will be checkmated. Now I am going to enter the field with an open record. I am going to own up in advance to all the wickedness I have done, and if any Congressional committee is disposed to prowl around my biography in the hope of discovering any dark and deadly deed that I have secreted, why—let it prowl.
In the first place, I admit that I treed a rheumatic grandfather of mine in the winter of 1850. He was old and inexpert in climbing trees, but with the heartless brutality that is characteristic of me I ran him out of the front door in his nightshirt at the point of a shotgun and caused him to bowl up a maple tree, where he remained all night, while I emptied shot into his legs. I did this because he snored. I will do it again if I ever have another grandfather. I am as inhuman now as I was in 1850.
I candidly acknowledge that I ran away at the battle of Gettysburg. My friends have tried to smooth over this fact by asserting that I did so for the purpose of imitating Washington, who went into the woods at Valley Forge for the purpose of saying his prayers. It was a miserable subterfuge. I struck out in a straight line for the Tropic of Cancer because I was scared. I wanted my country saved, but I preferred to have someone else save it. I entertain that preference yet. If the bubble reputation can be obtained only at the cannon’s mouth, I am willing to go there for it, provided the cannon is empty. If it is loaded, my immortal and inflexible purpose is to get over the fence and go home.
My invariable practice in war has been to bring out of every fight two-thirds more men than when I went in. This seems to me to be Napoleonic in its grandeur.
My financial views are of the most decided character, but they are not likely, perhaps, to increase my popularity with the advocates of inflation. I do not insist upon the special supremacy of rag money or hard money. The great fundamental principle of my life is to take any kind I can get.
The rumor that I buried a dead aunt under my grapevine was correct. The vine needed fertilizing, my aunt had to be buried, and I dedicated her to this high purpose. Does that unfit me for the Presidency?
The Constitution of our country does not say so. No other citizen was ever considered unworthy of this office because he enriched his grapevines with his dead relatives. Why would I be selected as the first victim of an absurd prejudice?
I admit, also, that I am not a friend of the poor man. I regard the poor man, in his present condition, as so much wasted raw material. Cut up and properly canned, he might be made useful to fatten the natives of the Cannibal Islands and to improve our export trade with that region. I shall recommend legislation upon the subject in my first message. My campaign cry will be: “Desiccate the poor workingman; stuff him into sausage.”
These are about the worst parts of my record. On them I come before the country. If my country don’t want me, I will go back again. But I recommend myself as a safe man—a man who starts from the basis of total depravity and proposes to be fiendish to the last.
“Let’s Look at the Record”
Harper’s Magazine, July 1954
Reprinted from the Kansas City Journal, June 15, 1879
With the long-distance assistance of my bread-making friend turned temporary sensei (which included about 20g of his precious “goo” – sourdough starter – sent by mail), I’ve embarked on a bread-making phase. This is after a hiatus of, I’d guess, about 40 years since the last time I tried baking my own, long before the days of the surge in sourdough seriousness (they’ve been serious about it in other cultures – pun neither intentional nor un- – for centuries at least, I mean in the commercial sphere). Back then, as a “youth,” I used commercial yeast, and, as I recall, I mainly made loaves in a rectangular pan. I also tried Julia Child’s recipe for authentic baguettes; but that was a tremendous pain, and probably accounts for the long fallow period of my personal bread baking.
The results of the first attempt, last night, were, to my mind equivocal. I’m sure I made dozens of mistakes, some half-knowingly, almost willfully stupidly. What can I say? It doesn’t seem worth getting seriously uptight about. But this too shall pass, and probably sooner than the existential threat that beleaguers us all (and wouldn’t it be pretty to think it will take longer?).
It doesn’t sit as high and pretty and boule-like as I’d like, at least part of the reason being the foolhardy/half-ignorant decision to use whole wheat flour, Red Fife, which I ordered and had delivered from a small mill in the south at some expense. The bag clearly said bread flour, but that was addressing certain fundamental characteristics of almost sacramental significance to serious breadmakers. God bless them.
Otherwise it was fairly painless, some aspects even fun. And the results, even if not entirely photogenic, frankly to this ancient palate, taste mighty good.