The Real Problem with Lethal Autonomous Weapons Systems (LAWS)

With the extremely rapid technological advances of the twenty-first century, many aspects of human life have been transformed irrevocably. One of the most significant changes involves norms regarding the commission of intentional, premeditated homicide by governments. The practice is today termed “targeted killing,” but it differs only in the implement of death from what in centuries past was called “assassination” and deemed illegal. Black ops involving shady assassins who stalk and eliminate perceived enemies under a cloak of secrecy are no doubt still carried out by governments. But the use of unmanned combat aerial vehicles (UCAVs), or lethal drones, to stalk and eliminate terrorist suspects in lands far away is openly acknowledged and has been largely accepted by politicians and the populace alike as one of the military’s standard operating procedures.

The use of lethal drones to kill rather than capture suspects began in Israel but was taken up by the George W. Bush administration in the war on terror waged in response to the attacks of September 11, 2001. President Barack Obama then expanded the practice, electing essentially to eliminate the problem of long-term detention of suspects in facilities such as the prison at Guantánamo Bay by defining them as guilty until proven innocent and then dispatching them using missiles launched from drones. The suspects killed were classified posthumously as Enemy Killed in Action (EKIA) unless specific information demonstrating their innocence came to light. But since many of the killings took place in remote parts of the world, such as the Federally Administered Tribal Areas (FATA) of Pakistan, where there were few if any troops or intelligence analysts on the ground to do the due diligence needed to establish the innocence or even the identity of the persons killed, this almost never happened.

With the ascendance and spread of lethal drones, government officials have effectively permitted the current state of technology to dictate morality, rather than subjecting proposed tools to scrutiny before using them. This is most plausibly a result of the fact that the experts to whom politicians defer on these matters are invariably either military officers or persons with ties to military industry. Indeed, many military officers end up serving on the boards of weapons manufacturing and military logistics firms. The revolving door between government service and industry is evident in cases such as those of Dick Cheney, James Mattis and Lloyd Austin, all of whom served as secretary of defense and also sat on the boards of private military companies with sizable government contracts. From the perspective of military experts, whose focus is upon winning wars through maximizing lethality, the development of remotely piloted aircraft (RPA) has naturally been regarded as a boon, offering the possibility of combating the enemy without risking soldiers’ lives.

Yet in the development and spread of remote-control killing technology, important ethical considerations have been overlooked. First, during regular combat warfare, when troops are placed in dangerous situations, where “kill or be killed” becomes a prudential maxim for survival, many acts of killing can be construed as literal acts of self-defense. Whether or not the troops should be there in the first place, as in Iraq or Vietnam, is another matter altogether, but if a soldier is already in a perilous theater, with enemy combatants lurking around every corner, then the pretext of self-defense becomes reasonable. The same cannot be said for acts of killing perpetrated by soldiers sitting in trailers in Nevada, who are not being directly threatened by their targets.

U.S. combat soldiers on the ground in both Vietnam and Iraq killed many people who might have been insurgents but proved not to be. The veterans of those conflicts suffered enormously as a result, and many were permanently wrecked by the experience. Soldiers who use drones to target the enemy are far from the bloody fray and physically safe from the dangers of the “battlefield” on which they fire. Nonetheless, drone and laser sensor operators such as Brandon Bryant have abandoned the profession after becoming disillusioned with the disparity between what they had signed up to do (defend their country) and what they ended up doing: killing poor tribesmen living out in the middle of nowhere who were not threatening anyone with death at the time when their lives were abruptly ended.

Because drone operators follow and observe their victims for extended periods of time, and witness their anguish in the aftermath of strikes as they bleed out, they have been prone to suffer bouts of regret and to develop post-traumatic stress disorder (PTSD) despite never having been directly endangered themselves. Such reflective servicepersons furthermore recognize that collateral damage, said to be unavoidable in the “fog of war,” is truly excusable only in a life-or-death, do-or-die dilemma. Up to now, what the drone and laser operators had to fall back on was the fact that they were not in a position to assess the value of the intelligence used to select targets. Their job was to locate and kill the person(s) said to warrant elimination by officers higher up in the chain of command. Accordingly, when mistakes were made, the blame ultimately rested with the analysts who had built the case for targeting on the basis of evidence gathered by drones, obtained through paid informants, and mined from cellphones. In other words, even if the drone operators themselves regretted having killed persons whom they did not believe deserved to die, based on their own observation of the targets, some among them were still able to assuage their conscience by invoking the tried-and-true “invincible ignorance” line, according to which soldiers are not to blame when negative consequences arise from their having executed what to all appearances were legal orders.

But surely intelligence analysts, too, may suffer regret when obviously (or even possibly) innocent people are destroyed on the basis of the analysts’ marshaling and interpretation of the available data. Why not, then, take the fallible human being out of the loop altogether, thus minimizing the possibility of error and the human vulnerability to emotions which sometimes culminates in PTSD? If it was better for soldiers in trailers in Nevada to kill thousands of terrorist suspects throughout the Global War on Terror, rather than having them fly dangerous combat missions, would it not be even better to relieve all parties involved of the burden of having killed?

Despite the moral dubiousness of killing “enemy soldiers” who are not directly threatening anyone with harm, and a fortiori in countries where there are no allied combat soldiers on the ground said to require force protection from above, remote-control killing technology continues to be refined and extended with the aim of making drones both more efficient and more lethal. Consequently, a world in which robots “decide” whom to kill, as in dystopic films of the twentieth century such as Terminator, RoboCop and their sequels, is no longer the mere fantasy of writers of speculative fiction. Lethal Autonomous Weapons Systems, with the proper-sounding “LAWS” as their acronym, are currently being pursued as the best way both to keep soldiers off the battlefield and to minimize the errors invariably committed by all-too-human operators in drone warfare. From a purely tactical perspective, an obvious benefit of LAWS is that with this new technology, which takes human beings “out of the loop,” when mistakes are made, there will be no operator who must bear the burden of knowing that he killed people who did not deserve, much less need, to die. Indeed, arguably the most significant benefit to the military in rolling out LAWS will be the elimination of PTSD among drone operators who deeply regret their participation in the serial, mass killing of persons who posed no direct threat to their killers when they were incinerated by missiles launched from drones.

With LAWS, the responsibility for mistakes made can be almost completely diffused, for computers will not only gather and analyze the data, but also select the targets on the basis of that data, and then launch the missiles themselves. The magnitude of the mistakes made will vary from case to case, but so long as human beings are involved in the construction and programming of the machines used to kill, then the potential for error will obviously remain. There may still be a bit of room left for soul searching among those who programmed the computers, but they will always be able to absolve themselves by pointing to the inherent limitations of data collection. Without perfect information, mistakes will continue to be made, but the lengthier the causal chain, the fewer individuals there will be who feel the need to shoulder any blame.

From a tactical perspective, all of this may sound very logical and clearly better than having soldiers risk their lives, and analysts and operators suffer psychological distress upon learning that they contributed to the carnage when innocent persons are erroneously destroyed. The first premise in the inexorable march toward Lethal Autonomous Weapons Systems, however, needs to be subjected to scrutiny: the premise that the killing will happen, with or without human operators and analysts. What has propelled the mad rush to develop and implement LAWS is the false assumption that the killing ever needed to happen in the first place. The governing idea has been that because the persons being targeted have been determined to be potentially dangerous, they might undertake to threaten people at some future time, if they are permitted to live. In other words, the victims are being preemptively eliminated, following the reasoning used to promote the 2003 invasion of Iraq, when the warmakers claimed that Saddam Hussein posed a threat to the world because of his alleged possession of weapons of mass destruction (WMD). That pretext was of course later found to have been false, along with others, including the claim (obtained through torture) that the Iraqi dictator was somehow in cahoots with al Qaeda. Yet the war went on all the same, with some pundits and war supporters filling the justificatory void with the tried-and-true need to spread democracy.

In the maelstrom of the wars on Afghanistan and Iraq, assassination was simply rebranded as targeted killing, when in fact both practices involve the intentional, premeditated elimination of persons deemed potentially dangerous. This criterion is so vague as to permit the targeting of virtually any able-bodied person who happens to be located in a place where terrorists are suspected to be. The only differences between assassination and targeted killing are the nature of the weapon being used and the fact that soldiers wear uniforms, while undercover assassins and hitmen do not. But are these differences morally relevant?

Unfortunately, over the course of the more than twenty-year Global War on Terror, there has been no attempt to reckon with the facts. But if the war on Iraq was a violation of international law, then every person killed in the conflict was the victim of a crime. Because of the shock of the events of September 11, 2001, however, most of the people who pay for the military’s killing campaigns have gone about their business, allowing the government to use their tax dollars to kill people who had nothing to do with the terrorist attacks, and in many cases were protecting their own land from illegal invaders. Twenty years on, the military continues to kill people when and where it pleases under the pretext of the need to fend off the next terrorist attack. That millions of persons have been killed, maimed, widowed, orphaned, reduced to poverty and/or rendered refugees as a result of the ever-expanding missions of the U.S. military in the Middle East and North Africa—most of which were caused by overspill of previous missions, beginning in Afghanistan and Iraq—has been largely ignored.

The “killing machine” has been on autopilot for some time now, in the sense that lists of targets continue to be drawn up and dispatched, with the killers themselves writing the history of what transpired. The wars on Afghanistan and Iraq gave rise to new terrorist groups such as ISIS, which then spread to countries such as Pakistan, Yemen, Libya, Syria, Mali, and beyond. Subsequent interventions in those lands then led to the spread of factions throughout Africa, where drone bases have been erected in several countries to deal with the problem of radical Islamist terrorism. With LAWS, the perpetual-motion targeting of unnamed persons can be expected to be revved up to run even faster, around the clock, for robotic killers suffer neither compunction nor fatigue, and the success of their missions will continue to be measured by the number of “dead terrorists,” who are in fact suspects. In other words, the ethical problem with LAWS will remain precisely the same as the ethical problem with the drone program through which human operators have pressed the buttons to launch the deadly missiles.

The debate over LAWS should not be about how to make robots act as human beings might. Rather, we must pause and back up to ask why anyone would ever have thought that this rebranding of assassination as the military practice of “targeted killing” should be permitted in the first place. The fallacy in thinking that lethal drones and LAWS “protect the troops” derives from the assumption that the people being killed would have been killed even had this technology never been developed. The truth, however, is that the many drone bases now peppering the earth have served as a pretext for launching missile attacks which would otherwise never have occurred. With such tools at their disposal, military and political administrators are apt to use them without thinking through the moral implications of what they are doing, specifically ignoring the hard-won advances in criminal justice made over millennia, above all, the presumption of innocence upheld in free societies the world over.

Drones were originally deployed for surveillance purposes, but it did not take long before they were equipped with missiles to provide a dual-function machine capable of both collecting data and taking out enemy soldiers based on that data. Most of the individuals eliminated have not been identified by name, but in some cases specific persons have been hunted down and killed, as in President Barack Obama’s targeting of U.S. citizens Anwar al-Awlaki and Samir Khan in Yemen in September 2011, and Prime Minister David Cameron’s killing of British nationals Reyaad Khan and Ruhul Amin in Syria in August 2015. More recently, on January 3, 2020, President Donald Trump targeted top Iranian commander Qasem Soleimani, who was located in Baghdad at the time. Trump openly avowed that the act of killing was intentional and premeditated. According to the president, the major general was responsible for past and future attacks against the United States. All of these eliminations of specific, named individuals would have been considered illegal acts of assassination in centuries past but are today accepted by many as “acts of war” for the simple reason that they are carried out by military drones rather than spies.

The ethical problems with lethal drones have been raised many times by activists, who have protested the killing of persons in countries such as Pakistan, with which the United States is not even at war, and also by successive U.N. Special Rapporteurs on Extrajudicial, Summary or Arbitrary Executions (Philip Alston, Christof Heyns, et al.). The Rapporteurs have repeatedly cautioned that the extension of the right to kill anyone anywhere at the caprice of the killers, which has been assumed by the U.S. government in its wide-ranging drone-killing program, can only sabotage the prospects for democracy in lands where leaders opt to eliminate their political rivals, facilely denouncing them as “terrorists” while pointing to the precedent set by the United States, the United Kingdom, and Israel. Needless to say, the literal self-defense pretext does not hold when leaders choose to use remote-control technology to hunt down and assassinate specific persons rather than charging them with crimes and allowing them to be judged by a jury of their peers. But, just as in the case of unnamed targets, when the victims of drone strikes are identified by name, they are invariably labeled terrorists, with no provision of evidence for that claim.

With LAWS comes the specter of fully normalized political assassination with no territorial boundaries whatsoever. The question, then, is not “how do we devise the best algorithms with which to program robotic killers?” Instead, we must ask why homicide should be used in cases where the decision to kill is clearly not a last resort, as it never is in drone killing outside areas of active hostilities, because no human being will perish if the missile is not launched. In expanding the drone program, the Obama administration carried out many “signature strikes,” where the precise identity of the targets was not known but their behavior was said to be typical of known terrorists. In addition, cellphone SIM card data was used to identify persons who had been in contact with other persons already believed to be terrorists or found to have connections to known terrorist groups. To execute persons on the basis of such circumstantial evidence of possible complicity in future terrorist acts is a stunning denial of the human rights of the suspects, and it flies in the face of the democratic procedures forged over millennia precisely in order to protect individual persons from being killed at the caprice of those in positions of power. This drone killing procedure in fact exemplifies the very sort of tyranny which finally led Western people to abolish monarchic rule and establish republican constitutions protective of all citizens’ rights. As the mass collection of citizens’ data continues, such moral concerns are more pressing than ever before, for political leaders may decide to use their trove of intelligence to eliminate not only citizen suspects located abroad, but also those in the homeland.

What needs to be done is manifestly not to make machines more efficient and lethal killers. Instead, we need to revisit the first premises which were brushed aside in all of the excitement over the latest and greatest homicide technologies deployed in the twenty-first century, when the U.S. government was given free rein to pursue the perpetrators of the crimes of September 11, 2001. That license to kill with impunity was never revoked, and to this day the drone killing machine continues to be used to destroy people who had nothing whatsoever to do with what happened on that day. With the diffusion of responsibility inherent to LAWS, a truly dystopic future awaits, as the criteria for killing become ever more vague and accountability for wrongful deaths disappears altogether.

In Defense of Statues and Other Texts—All of Them

There has been a lot of discussion, and some action, on the question of whether statues portraying or representing men currently regarded as scoundrels by self-styled “good people” should be permitted to stand. On its face, such a view would seem to imply that many of the public squares and buildings of the great cities of the world must be razed, which strikes me as a reductio ad absurdum. Pick any leader you like (Churchill, Truman, De Gaulle), in any country, at any time; look closely enough at his record and you will find dubious decisions made with deplorable consequences. The leaders who saved the world from the Nazis may be considered heroes today, but that does not imply that they were somehow flawless, as is nowhere more obvious than in what happened in the Soviet Union after World War II, when millions of Russians became the victims of a regime which had worked with the governments of the United Kingdom and the United States to halt Hitler’s mad quest to conquer the world.

Looking at more recent leaders, there are a good number of libraries, institutions, and buildings dedicated to men such as George H. W. Bush, George W. Bush, and Tony Blair, who together wrecked Iraq after concocting bogus pretexts for the invasion of a sovereign nation. For his part, Barack Obama attacked Libya before leaving it in shambles and dramatically increased the use of lethal drones to kill suspects abroad, including U.S. citizens of color who were executed without indictment or trial. Obama also dropped bombs throughout his two-term presidency (an average of seventy-two per day in 2016), targeting seven different countries across the Middle East and Africa. The military policies of each of these men have caused untold human misery, yet buildings and foundations continue to be named after them.

Those who wish to raze statues and rename buildings are for some reason not talking about their contemporaries, and the idea of prosecuting men such as Bush, Blair and Obama at the International Criminal Court (ICC) for crimes against humanity does not seem to cross their minds. Indeed, we find celebrities such as Ellen DeGeneres and former first lady Michelle Obama entirely willing to overlook the war crimes of their buddy George W. Bush. President Obama himself opted not to prosecute those responsible for the Bush administration’s widely decried torture of human beings at Abu Ghraib, Bagram, and Guantánamo Bay prisons, among other places. Obama claimed, “That’s not who we are,” but effectively left torture as an option on the table for other administrations, including his own. He also “solved” the problem of the extended detention and mistreatment of terror suspects never charged with crimes by defining them as guilty until proven innocent and incinerating them with missiles launched from drones.

Remarkably, despite the horrors perpetrated under their watch, the esteemed opinions of George W. Bush, Tony Blair, and Barack Obama continue to be sought out. As far as I can tell, many people are entirely ignorant of the foreign policy record of Barack Obama, whose reputation seems to have received a big boost from the brash and boisterous demeanor of his successor. Mention Libya to a fan of Obama, and you are likely to receive a puzzled look in response. It was not, of course, Obama’s intention to catalyze a resurgence of black African slave markets in Libya through his ousting of Moammar Gaddafi in 2011, but that was nonetheless one of the consequences. When it comes to relatively mild-mannered men such as George W. Bush and Barack Obama, the prevailing prioritization of intentions over consequences translates smoothly into a willingness to forgive the perpetrators of catastrophic campaigns of mass homicide along the lines of the tried-and-true just war line: They meant to do good. So powerfully does the assumption of good intentions among compatriots hold sway over people that even Henry Kissinger, despite his role in perpetrating and perpetuating the Vietnam debacle, which resulted in millions of deaths, has managed somehow to continue to be revered, at least in some circles.

The same charitable interpretation is not, however, extended to the men whose effigies have been damaged or destroyed all over the United States in something of a mad frenzy to decry them as evil, while highlighting the protesters’ goodness by contrast—if only to themselves. Dozens of statues and monuments have been vandalized—spanning the time period from Christopher Columbus to Ronald Reagan—but the “cancel culture” crowd has focused especially on what have been interpreted to be the racist overtones of effigies of Confederate soldiers and officers from the Civil War, as a result of which slavery was finally abolished. As educated people know, the Civil War did not commence as a simple one-issue battle over whether slavery should be permitted, any more than the United States entered into the mêlée of World War II “in order to save the Jews.” (What went on at the concentration camps was discovered upon, not before, the liberation.) At the end of a conflict, when history is written by the victors, moral motivations are invariably emphasized over what were originally political reasons for taking up arms. In the case of the Civil War, economic objectives among secessionists and federalists, including President Abraham Lincoln, were what gave rise to the war. Nonetheless, the abolition of slavery is naturally viewed as a felicitous consequence of the loss of the war by the Confederate army.

I am not interested in debating the virtues and vices of the many men throughout history who held slaves, as did some of the founding fathers of the United States, but would like to suggest, rather, that calls for the destruction of statues and the metaphorical burning of texts, better known as censorship, are misguided. This is, first, because such works have always and everywhere been the result of intelligent human beings’ acts of creation. It is true that nearly no one knows anymore who created the vast majority of public squares and statues. In the case of structures erected in ancient and medieval times, it is plausible that they were produced under duress by persons enslaved, because that’s how things were done back then. But the fact that the Roman Colosseum was built with the blood, sweat and tears of slaves made to realize the vision of non-slaves (Emperors Vespasian and Titus) does not imply that the structure should be erased from the face of the earth. To do so would accomplish nothing beyond depriving the world of memory traces of centuries past.

What we know about the more recent structures being defaced is that their production involved the industry and creativity of artists who were not slaves. There is a reasonable sense in which the statues can be viewed as works of art rather than effigies to bad men who no longer exist or insane calls to make slavery legal again. The idea that destroying such works will somehow diminish racism rests on the entirely false view according to which an artifact has a single, definitive, immutable interpretation. This is a profound misunderstanding of the nature of art. A statue of a Confederate soldier can be as much a tribute to all soldiers who fight and lose as it is to any individual racist’s beliefs. Since at most one side of a war can be right (though both can be wrong), it follows logically that at least half of all soldiers are fighting for an unjust cause; the concept of “invincible ignorance” has throughout history been invoked to protect the soldiers on the losing side who followed what to all appearances were legal orders in promoting a mistaken leader’s cause. Again, a fine-grained study of history reveals that many men who fought on the side of the Confederacy were not doing so out of hatred for African Americans and were not in fact slave owners. Moreover, there were racists among the Union army’s ranks, and some among them did own slaves. It is true that, had the Confederate army prevailed, then slavery would likely have lasted longer in North America than it did. It is also true, however, that many other countries managed to abolish slavery without such a bloody and protracted war.

When we allow one person or small group’s interpretation alone, the least charitable of all possible interpretations, to dictate the meaning of a work, we are allowing them to delimit reality precisely in the manner of a tyrant. Works of art, like written texts, are by their very nature subject to multiple interpretations, and their meaning to an individual comes finally to be a function of that person’s background beliefs. Non-didactic artists and writers may not even know what “message” they were attempting to convey when they created their works, and they often discover meanings only after the fact, while considering the object from the perspective of a reader/interpreter. The fact that slavery was widespread throughout the ancient world does not imply that all ancient texts should be erased and artifacts destroyed. The fact that Aristotle may have believed that women were only partial persons does not imply that we must interpret his every word as expressive of that idea. Nor should the fact that a few pseudo-intellectual Nazis took the work of philosopher Friedrich Nietzsche as supportive of their ridiculous Weltanschauung lead us to burn all of Nietzsche’s books.

In a truly liberal society, such as that championed by nineteenth-century English philosopher John Stuart Mill, according to whom the “marketplace of ideas” is a cornerstone of democracy, the possibility of learning and correcting the errors in our ways mandates that we remain receptive to other people’s points of view. Even if we find the views of others despicable, this does not imply that they should be effaced. Practically speaking, it would be highly imprudent to prevent everyone who disagrees from speaking, because then we would never know who our true enemies are. But the primary reason for opposing censorship is that no censor has a monopoly on the truth. We are all wallowing in beliefs forged through a random assortment of arbitrary interactions, processing information as we encounter it, accepting some claims while rejecting others, based, ultimately, on how they cohere with our current worldview.

In Ray Bradbury’s classic novel of speculative fiction, Fahrenheit 451, the government is run by leaders who believe that they are right and anyone who disagrees is wrong, and that books which promote other, “dangerous” worldviews must be destroyed in order to prevent them from poisoning people’s minds. The most famous works of dystopic fiction, including George Orwell’s 1984 and Aldous Huxley’s Brave New World, also caution against the perils of permitting small committees of fallible people to delimit the contours of reality as they please. Reading such works, in which censorship is a key component of governance, it may seem obvious that the worlds depicted are anti-humanist. These are worlds where intelligent persons are treated as criminals for having ideas which differ from what has been deemed the proper way of viewing things. Despite the popularity of such works of fiction in recent times, an ever-increasing number of people have come to believe that we should topple the statues of “bad men” and remove classic works of literature from school curricula. The next step down a slippery slope should have been predicted when all of this began: some people should not be permitted to speak in the public square, for their words may be dangerous.

The recent removal of President Donald Trump from both Twitter and Facebook, on the grounds that he allegedly used the platforms to foment violence, specifically, the raucous protest at the U.S. Capitol on January 6, 2021, has been applauded by self-proclaimed “liberals” all over the planet, despite the fact that Trump explicitly dissociated himself from those who perpetrated the violence and publicly denounced it as wrong. Note that no one held Democratic Party leaders or candidates responsible for the many violent protests and widespread looting in cities all over the United States throughout 2020. Yet Michelle Obama, who in befriending George W. Bush let war-crime bygones be bygones, wrote a two-page letter exhorting the Big Tech social media giants to permanently bar “infantile” Trump from sharing his words with the world, claiming that he was responsible for what happened on January 6. Her sentiments were echoed and amplified throughout the mainstream media, with many people parroting the refrain that it is the prerogative of private companies to conduct their businesses as they please, and offering a pithy response to those who disagree: “If you don’t like it, leave.”

It is true that the Big Tech companies are not bound by the First Amendment to the United States Constitution, which protects the free speech of citizens. Twitter and Facebook have been censoring, “curating” content, and banning users for years now. In recent times, and most aggressively during the 2020 election cycle, they have adopted the measure of attaching “child safety” warnings to posts, alerting users that content is dubious according to the companies’ fact checkers. What happened in this case, however, was startling, because one of the alternatives to Twitter, Parler, to which people banned from Twitter or annoyed by its censorship or “child safety” warnings sometimes migrated, was removed nearly immediately from the Google and Apple app stores and dropped by Amazon’s web-hosting service, effectively suppressing the speech of everyone who did not like Twitter and left (as they had been told to do). The symbiosis between the government and Big Tech became undeniable when incoming President Biden tapped Facebook executives to be a part of his cabinet, and one hopes that lawsuits charging monopolization will put an end to the squelching of alternative views and ensure that history, though continuously rewritten by the victors, will not be entirely erased.

Donald Trump’s tweets were certainly a bizarre phenomenon in U.S. political history and surely one of the reasons for his polarizing presence the world over, with people revering and reviling him in close to equal numbers—at least judging by the outcome of the 2020 presidential election. For Big Tech to effectively uphold Hillary Clinton’s “deplorable” trope in favoring one half of the country over the other can only exacerbate the deep divisions already on display. Michelle Obama is entitled to her opinion, but so are the more than 74 million people who voted for Trump in the 2020 election. Donald Trump was the president of the United States for four years, and his often emotive tweets are historical texts, for better or for worse, which should be accessible to all.

Unfortunately, the so-called liberal people who demand and applaud the silencing of Trump do not appear to understand what they are advocating. They do not appreciate that humankind has progressed over millennia as a result of a lengthy experiment in testing out various ways of looking at the world. To claim, for example, that the works of Mark Twain should be censored because they contain the word ‘nigger’ is to forget that societies have changed in part as a result of people’s having read such books. It is in fact arguable that to remove a statue of Robert E. Lee would do no more than to hasten the process of forgetting how wrong human beings were, for most of history, in upholding slavery, not only in the United States, but all over the world. Likewise, from an antiwar perspective, every monument to World War I (and they exist in many different countries) stands as a testament to the sheer insanity of sending millions of young men to their deaths for essentially nothing. Again, one look at the Vietnam Veterans Memorial in Washington, D.C., where the names of the many men who died in a misguided war are inscribed, serves as a reminder not to make the same mistake all over again.

To oppose the razing of historical structures on the grounds that they were commissioned by or represent “bad” people is not to deny that communities have the right to decide how to decorate their public spaces. Sometimes sculptures in parks and squares are removed or replaced, but to pretend that unilateral acts of vandalism will reduce racial tensions is delusive in the extreme. It is both puerile and false to assume that a single interpretation exhausts the value of any work of art or text. The truth is not a function of whatever arbitrary group of people happen to have acceded to the corridors of power or whatever angry protesters have taken it upon themselves to wreck what they happen not to like, based on their own, necessarily limited, interpretations. Rational adults are aware that all people are fallible and that no one can pass strict “purity” tests.

Should the Roman Colosseum, which was not only built by slaves but also used as an arena for gladiator battles between slaves forced to fight to the death for the entertainment of the upper classes, be pulverized, and the land on which it stands be turned into a parking lot? That is the direction in which this childish movement is going. To cancel culture is to cancel the extant evidence of the process human beings have gone through to get where they are today. The Nazis attempted such a cancel culture project, denouncing modern forms of art as “degenerate” and permitting only aesthetic views in conformity with their delusive project of furthering the Aryan race. Some may take offense at my comparison of these two cases, but they differ only in details, not in approach. Both define cultural history as having effectively ended with the current view of those who would erase the past and dictate all that may exist in the future. Both are obtuse, shortsighted and small-minded.

Despite their foibles and deficiencies and myopia and biases, some human beings nonetheless make the effort to contribute their intellectual energy to produce new works and texts for our consideration. We need not and will not like all of them, but to say that we should destroy them is tantamount to the tyrant’s decree “Off with their heads!” in response to the annoying dissenters who may emerge among his ranks. The censors of tomorrow may not agree with your interpretations and may decide that you need to be canceled at their caprice. The grandest irony of all is that if such a view had prevailed in the pre-abolition United States, with the suppression of the texts and speech of all those who disagreed with the laws of the land at that time, then slavery would never have been abolished.

Welcome to Zombie Pharm

Imagine a world where you were required to cover your mouth and nose whenever you stepped out of your home or when anyone came to your door. No one would know when you smiled or frowned, and the difficulty of communicating the ideas you attempted to share would eventually deter you from saying much of anything at all, frustrated as you would be by the annoyance of always having to repeat yourself. There would be no point in asking anyone questions requiring more than a “yes” or “no” answer, because more complicated replies would be muffled by their masks and not worth the time and effort needed to decipher them.

Imagine a world where all inhabitants of a city, state or country were told where they could go and what they could do, not only in public, but also in their homes and in privately owned businesses. Healthy persons would be quarantined to prevent other people from becoming ill. Good citizens would be enlisted and surveillance and tracking apps used to identify anyone who refused to abide by emergency lockdown and curfew orders or the required hygiene measures. Small business owners would be fined for attempting to run their businesses or neglecting to enforce emergency laws. Employees would be arrested for attempting to go to work.

Imagine a world where private tech companies collaborated with government bureaucrats to censor your written speech. You would not be permitted to share texts which conflicted with the official story of whatever the authorities claimed that they had done and were doing and wanted you to believe. You could still write texts, on your own computer, but there would be nearly no one around to read what you had to say. The censors could not suppress texts faster than they could be written and shared, however, so some would slip through. This would necessitate visits from the state police to the homes of those who had attempted to incite violations of any emergency laws which happened to have been enacted by administrators to protect their constituents. Whether or not the measures actually helped anyone would be entirely beside the point, because everyone knows (from all of the “just wars” throughout history) that the only things that matter are the lawmakers’ publicly professed intentions to do good. The authors of what were deemed dangerous texts would be arrested and taken away, if necessary, by force.

Imagine a world where journalists were required to promote the official government line in order to keep their jobs. No text or report which reflected poorly on the military-industrial-congressional-media-academic-pharmaceutical-logistics-banking complex would be allowed. A few of those who “indulgently” refused to comply might then impudently begin their own publications, such as The Intercept, issuing their interpretation of what was going on in the few sequestered places (made difficult to find by Google) where independent journalism was still possible.

Imagine a world where the independent media, too, had been infiltrated by persons keen to hold the line, to defend what they had been persuaded to believe (by hook or by crook, carrots or sticks) must be upheld as the truth. Anyone who attempted to share inconvenient “disinformation” or “fake news”, as it would be denounced, would then have their work edited to conform with “the official story”. The “traitors” (as they would be characterized) who disagreed would have two choices: either to stop writing or to flee to another place with even fewer readers than before, such as Substack.

Imagine a world where publishers who revealed crimes committed by governments would be subject to criminalization: arrest, incarceration, isolation, extradition and more. Those who exposed murderous crimes would themselves be treated as though they were violent criminals, even when they had never in their lives wielded any implement of dissent beyond a pen.

Imagine a world where oppressive lockdown and curfew policies were said to be necessitated by case surges of “infections” in persons many of whom, while testing positive, manifested no symptoms at all. Suppose that the tests being used were revealed to be notoriously inaccurate, by some estimates 90% inaccurate. Yet the testing would continue on, ever faster and more furiously, and the case surges would serve as the basis for preventing healthy people from living their lives. When vaccines emerged, everyone who had tested positive before but survived would still need to be inoculated, because, the “Listen to the Science” crowd would insist, it might be possible to become reinfected. People who had already survived the dreaded disease would only know that they were safe and not a menace to public health if they took the new vaccines, whatever they were, whether or not they had been demonstrated to prevent infection and transmission, and no matter what the unknown side effects might be. Because, obviously: Science.
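
The arithmetic behind such testing-driven case surges is worth spelling out. Here is a minimal sketch in Python, in which every number is a purely illustrative assumption (not an estimate of any real test’s performance), showing how screening a population with low true prevalence can yield “positives” that are mostly false alarms:

```python
# Minimal base-rate sketch. All numbers below are illustrative
# assumptions, not measurements of any actual diagnostic test.

def positive_breakdown(population, prevalence, sensitivity, false_positive_rate):
    """Split a mass-screening result into true and false positives."""
    infected = population * prevalence
    healthy = population - infected
    true_positives = infected * sensitivity
    false_positives = healthy * false_positive_rate
    total = true_positives + false_positives
    return total, false_positives / total

# Assumptions: 1,000,000 tests, 0.5% true prevalence,
# 95% sensitivity, 5% false-positive rate.
total, share_false = positive_breakdown(1_000_000, 0.005, 0.95, 0.05)
print(f"reported cases: {total:,.0f}")             # 54,500
print(f"false-positive share: {share_false:.0%}")  # 91%
```

Under these assumed numbers, roughly nine in ten “cases” would be false positives, which is one sense in which a test can be said to be 90% inaccurate in practice even while looking accurate on paper.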

Imagine a world where people with life-threatening diseases were required to postpone their treatment because another disease, 95% of whose victims were octogenarians or older, had been designated by select “expert” epidemiologists as more dangerous and life-threatening than cancer, heart disease, stroke, and the other top killers of human beings. Imagine a world where distraught and desperate people reduced to poverty and rendered homeless through not being permitted to work began turning to deadly drugs such as heroin, sometimes fentanyl-laced, with the result that, in some cities (such as San Francisco), more persons died of overdoses than of the disease serving as the pretext for the laws forbidding those people from working.

Imagine a world where citizens were required to undergo medical treatments not known to prevent disease but believed to alleviate the symptoms of a disease from which the vast majority of humanity suffers only minor symptoms. This would be undertaken in the name of public health, but the effect would be to harm some of those who were not vulnerable to the disease and had essentially been tricked or coerced (since uninformed “consent” is not really consent) into volunteering as subjects in an enormous experimental trial with the aim of determining the outcome of introducing into human bodies certain foreign substances deemed potentially profitable by the companies which produced them. Most people would line up enthusiastically for such vaccines on the basis of widely disseminated claims of 90% and 95% efficacy lauded by well-respected experts, with details about the “known unknowns” and “unknown unknowns” available only in the fine print of a few more nuanced articles which nearly no one read.

Imagine a world where people who had already survived the dreaded disease and also had been vaccinated were nonetheless required to abide by all of the ongoing hygiene measures, from wearing a mask, to staying home, to taking more vaccines on a schedule determined by their government. There would be no need to explain what any of this was for because all good citizens would already be accustomed to reciting the mumbled refrains (behind their masks): “Extraordinary times call for extraordinary measures!” and “We’re all in this together!” Everyone would have to comply, everyone would need to be quiet, everyone would be required by law to roll up their sleeves for the clearly compelling reason that a global pandemic was tearing through the world like a tsunami, wiping out everyone in its path. Except that most of the victims were dying at the same rate and age as the actuarial tables would have predicted even if the culprit virus had never arrived on the scene. And the death toll over the course of the year would be about the same as for any other year, but with a slightly different distribution of causes of death.

Imagine a world where you were required to present your health record on demand, and you would not be permitted to enter stores, restaurants, or schools, to work, or to travel without first proving that you had agreed to participate in an experimental vaccine trial for a disease from which you were at minimal risk of harm. The local health authorities would determine when you needed to present yourself again, for a new treatment, as the virus in question could morph unpredictably over short intervals of time into something else, thus necessitating that you and everyone else on the planet prepare your bodies once again, just in case this time around it might be more dangerous to you and those around you.

Imagine a world where a healthy person’s refusal to undergo medical treatment for a possible future disease to which he was not vulnerable, according to all available statistical indicators, was taken as proof of his suffering from another disease, Oppositional Defiant Disorder (ODD), as clearly indicated in the latest edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM). The person thus diagnosed would be required by law to submit to whatever medication would make him more amenable to the other forms of medical treatment to which he was opposed, because obviously there would be something very wrong with him, constituting as he would a grave danger to public health.

Imagine a world in which children were taught from an early age that it was unsafe to touch other human beings or to be touched by them. They would be required to wear masks and full-face plastic shields and to wash their hands frequently and to attend school by video conference because, they would be sternly instructed, otherwise they might kill somebody else’s parents or grandparents, even though they themselves were not ill. If the children found any of this a source of anxiety, they would be prescribed psychiatric medications to transform their view of the world, so that they would accept rather than reject what they were told were “the new normal” contours of reality.

Imagine that everyone around you embraced all of the above and undertook public shaming campaigns against anyone who disagreed. Their faces would turn red and they would shriek in righteous indignation, “Listen to The Science!” whenever anyone attempted to point out the manifest absurdity of what was going on. They would denounce as degenerate, anti-science, anti-vax ignoramuses anyone who pointed out that pharmaceutical firms are profit-driven, publicly traded companies whose success depends on their ability to develop, produce, and market new wares.

Beware the ‘Nurse Ratched’ State

Advocates of minimal government have often warned against “The Nanny State,” which rears its ugly head whenever bureaucrats try to tell people what they should do and how they should live. There is a sense in which all governments do that, through the very enactment of laws, but Nanny-leaders mete out prescriptions which vastly exceed what can be fairly portrayed as an attempt to protect people from one another. An extreme example of this sort of overreach occurred in the United States during the Prohibition Era, with catastrophic consequences. Not only did outlawing the enjoyment of alcohol not prevent people from drinking, it actually catalyzed a massive expansion of organized crime all over the United States, as career criminals stepped in to provide people with the means needed to imbibe. And because no one wants to go to prison, such criminals would sooner kill than be caught, which is why murder was on the rise throughout Prohibition, with blood flowing in some cities nearly as freely as whiskey and wine.

Such unintended consequences have arisen wherever recreational drugs have been outlawed, and experiments such as the one in Portugal, where drug-related deaths diminished significantly after decriminalization, may have helped to propel some in the United States to accept the legalization of marijuana. The state of Oregon recently went even further, by legalizing possession of small amounts of hard drugs as well. Just as economics played a major role in putting an end to the thirteen-year Prohibition fiasco, the voters of some states may have been persuaded to permit recreational drug use after having seen the massive tax revenues being collected through pot shop sales in states such as Colorado. Whatever the reasons may have been, the slow dismantling of the legal framework undergirding the “War on Drugs” is certainly a welcome development to anyone who rejects the Nanny State.

The trend toward tolerating alternative lifestyles more generally, however, conflicts starkly with what else has been going on in 2020, coincidentally one century after the Volstead Act took effect. Policymakers attempting to save people from COVID-19 have pulled out all the stops—going above and beyond, in their view—to protect their constituents by issuing new and ever-changing edicts about how people ought to behave. This might be more tolerable if there were any genuine benevolence on display. Instead, what we are witnessing is an increasingly despicable effort to blame citizens for the failure of policies implemented in response to the arrival of the virus on the scene. When restrictions intended to stop the virus are imposed but cases and deaths then increase rather than diminish, this is taken as proof, by those crafting the new rules, that citizens did not in fact do as they were told and are therefore responsible for the current state of the health crisis.

I have been in Austria, Wales, England and the United States over the course of 2020, and in each of these countries I was surprised to find the very same finger-wagging reproach of citizens by government administrators who wish to blame what is manifestly nobody’s fault on somebody else. All over social media, angry mobs continue to lash out at those who refuse to stay home or “mask up,” and many government leaders now address their constituents as though they were toddlers or, perhaps more aptly, the residents of Nurse Ratched’s ward.

This is a strange conception of government, according to which politicians do not work for the people who pay their salaries but instead are their guardians, who alone can decide what the populace may and may not do. The phenomenon is not unique to the dictators-in-waiting who run states such as California and Michigan. Citizens all over the world are continually being threatened by government officials that if the case numbers do not go down, then lockdowns will be ordered or tightened, and more businesses will be closed, and further restrictions imposed, as though anything anyone does at this point has an effect upon a virus which is nearly everywhere and beyond anyone’s means to control. This punitive paradigm may have been possible to uphold with a straight face until late October. Many on the cacophonous COVID-19 caravan in the U.S. and in the U.K. have ceaselessly carped about their own incompetent government’s response, contrasting it to the approaches of the admirable leaders of countries in the European Union and Oceania, who obviously knew what they were doing!

But then along came the resurgence of cases in Europe, particularly in countries which had been held up for months as shining examples of how a government ought to manage the crisis. Germany had tough lockdowns, mask requirements, and probably the best contact-tracing program around. It restricted the entry of people from any country with an unacceptably high “infection rate” (scare quotes are necessary given the widely acknowledged problems with the PCR tests), and anyone at the border who did not present proof of not being COVID-19 positive was either quarantined or turned away (some were also fined). So how does one explain the new wave of “infections” all across Europe? It must be the case that the naughty plebeian Europeans were lying about their contacts, meeting in large gatherings, and brazenly violating social distancing and mask ordinances. None of the case surges throughout the Northern Hemisphere has anything whatsoever to do with the fact that more people invariably fall ill with the onset of winter.

In the U.K., Prime Minister Boris Johnson issued in November a nationwide month-long lockdown order in response to a resurgence of cases which villagers tended to blame on the haughty Londoners—who obviously had been flouting the rules by partying and congregating in pubs and then spreading COVID-19 dust everywhere they went—from England to Wales to Ireland to Scotland, and back again! That was, however, not my impression. What I found upon my arrival in London at the end of October (before the new lockdown) were empty streets, shuttered stores, and restaurants and pubs with very few patrons. Realty signs were all around, and the place looked frankly like a ghost town. My train from Norfolk to London was nearly empty, as were all of the trains I took in the U.K. from July to November, when I finally decided to leave in exasperation at the abrupt and arbitrary cancellation and closing of any- and everything I might want to do and see.

Throughout this crisis, not only the governors of Democratic states in the U.S. but also the prime minister of Australia and the health minister of the U.K. have exemplified the Nurse Ratched mode of governance, repeatedly threatening their constituents with ever-sterner measures should the epidemiological situation not improve, under the assumption that case surges decisively demonstrate not that the policy initiatives were worthless but that people were not following the rules. Sadly, many citizens, terrorized by the mainstream media’s nonstop fear-mongering about COVID-19, have accepted this absurd blame game, which has broadened what was already, long before March 2020, the chasm dividing a populace torn in two. Unfortunately, the situation is likely to get much worse as those who blithely agree to do as they are told become increasingly intolerant of those who refuse to do the same. Yes, the small paper cups on trays will be coming your way soon. What will you do? People are already taking sides, and the ironies continue to multiply.

Leftists have often wielded the slogan “My Body My Choice” in protesting any attempts by the government to limit a woman’s right to a free and safe abortion. It is highly ironic, then, that some among them should now be agitating vociferously for the universal vaccination of people worldwide against COVID-19. The “Listen to the Science” crowd immediately shuts down anyone who dares to suggest that the decision about whether to allow foreign substances to be injected into their own body should remain the prerogative of individuals themselves. They denounce anyone who resists the call to vaccination as “antivax,” even when they are not vulnerable to the disease in question and have no problem whatsoever with time-tested vaccines. Those who express any hesitation whatsoever to roll up their sleeves are ridiculed as “antiscience,” even when they are in fact scientists by profession. When none of those inflammatory insults work, there is always the tried-and-true “selfishness” charge: you are a selfish, heartless human being if you are not willing to vaccinate yourself to protect other people from death.

Let us look soberly at the scientific facts, setting to one side all possible conspiracy mongering about 5G, microchips, the World Economic Forum’s “Great Reset,” chemtrails or anything else. First, COVID-19 is highly contagious but nowhere near as deadly as the pandemics of the past, and it specifically targets elderly persons with other health problems. An overall 99.5+% survival rate is not the sort of danger which would ordinarily lead a healthy young person to undertake a risky regimen to protect him- or herself. Why “risky”? At the most fundamental level, because safe and effective vaccines have always required years to produce and test, invariably involving, as they do, unknown side effects. The reason for this can be summed up in a simple, undeniable phrase: human variability.

For any trait, sensitivity, capacity, etc., found in human beings, its distribution can be plotted over a bell curve with a tiny percentage of people occupying the extreme ends of the curve. Those people are the “outliers,” who will be much more (or less) sensitive to a particular environmental factor than is the average person. Perhaps the simplest way of thinking about this human variability and its relevance to the vaccine issue is in terms of food allergies. No one knows that they suffer from a peanut allergy, for example, until their body encounters peanuts. Similarly, a person with celiac disease will discover this fact only upon consuming gluten. When vaccines are manufactured, they contain scores of components with which a given person’s body may never have come in contact before. Most people will not be harmed by any of the components, as the vaccines have been rigorously tested on other animals even before human trials begin. Once extensive, long-term testing in large groups of human subjects has been completed, then the company producing the vaccine can assert with confidence that the risk to patients is quite low. The risk is never zero, however, just as the risk incurred by doing anything whatsoever is never zero. There will always be some people who are more sensitive than others, and they may end up being harmed by one or another of the many components of any vaccine. There is nothing mysterious or conspiratorial about any of this, and in fact it is precisely why vaccine manufacturers insist that, before distributing their product widely, they must be granted indemnity in the event of unforeseen and unpredictable side effects in the tiny percentage of those inoculated who suffer them.
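
To put a rough number on that “tiny percentage,” consider a purely illustrative calculation (the normality assumption and the three-sigma cutoff are mine, not data from any actual vaccine). If sensitivity to a given vaccine component were normally distributed, the fraction of people lying more than three standard deviations from the mean would be

$$P(\,|X - \mu| > 3\sigma\,) = 2\,\Phi(-3) \approx 0.0027,$$

where Φ is the standard normal cumulative distribution function: roughly one person in 370. A side effect confined to so narrow a tail could easily go undetected in a trial of a few thousand subjects, yet in a campaign inoculating 100 million people it would touch on the order of 270,000 individuals. Rarity in a trial is not rarity in a population.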

All of this to say: there is always risk involved in taking a vaccine. People decide for themselves, for example, whether or not they should take the seasonal flu vaccine, the reported efficacy of which has ranged from 19% to 48% over the past five years. This implies, according to epidemiologists themselves (not “antivaxers” or conspiracy theorists), that more than half of the people vaccinated have not been helped by the flu shot in the least. Were any of them harmed? It is difficult to say, because people become ill and die all the time, and there are usually far too many variables working simultaneously to be able to single out the cause of post-vaccine harm, particularly when the subjects are already elderly and frail. Those who sing the praises of the annual flu vaccine, including the public relations teams behind the aggressive marketing campaigns launched by governments to encourage their citizens to undergo vaccination, generally seem to believe that the efficacy rate is much higher than it is. From a consideration of the marketing material alone, one would be forgiven for concluding that the flu shot is rationally obligatory and 100% effective and safe. Once one has examined the statistics, however, there is some cause for restraint.

Just as no one should be able to force you to drink green tea because they believe that it is good for your health, and no cancer victim can be compelled to undergo chemotherapy against his own will, individuals themselves must decide whether rolling up their sleeve for the annual flu shot is a good idea or not. Those who are young and hardy will most likely survive the flu in any case, and there is a real chance that the vaccine which they take—there are multiple versions every year—will not help to combat one or another of the virus strains which they happen to encounter anyway. It is literally a gamble. There are people who maintain that they never became sicker than after having taken a flu shot, but vaccine advocates quickly swoop in to silence them by insisting that they must have already been exposed to the flu before inoculation. In fact, the only reason for believing such an explanation is manifestly that one wishes to support universal vaccination. It may or may not be true. One thing is undeniable: pharmaceutical firms are profit-driven companies, whose revenues will wax or wane with general public sentiment about the wisdom of their many-splendored cures.

The current situation is quite a bit murkier than the case of the seasonal flu shot, because most of the COVID-19 vaccines being developed employ a novel RNA technology never before licensed for use in human beings. In the vaccines which have stood the test of time (measles, polio, etc.), a tiny amount of pathogen protein is introduced into a patient’s body so that the body will preemptively ready itself for an immune response in the event that the virus is later encountered. Usually the virus matter introduced is dead, but sometimes it is live, and this is by design—it depends on the pathogen and is determined through extensive experimentation. A live vaccine induces a minor bout of the disease, which is much less likely to lead to death than is an unprotected body’s encounter with the wild virus. Anecdotally, I can report that after having received an obligatory Yellow Fever vaccine (which is live) before traveling to Ghana, I was quite ill for about five days. The cause and effect were clear: I was suffering a minor bout of Yellow Fever, thanks to which my body developed the antibodies needed to protect me from the disease during my trip to Africa.

Suppose, now, that the new COVID-19 vaccines worked just as the time-tested vaccines do. In that case, before agreeing to be inoculated, a reasonable person would require some sort of assurance that the vaccine itself will be less likely to harm the patient than is the wild strain of the virus. Because the survival rate among people exposed to COVID-19 is greater than 99%, it would be prudent for a person to take the vaccine only if their prospects would be improved through vaccination. At this level of disease risk, without any such guarantee, one may or may not wish to take an experimental vaccine. People in the vulnerable categories, advanced seniors and those who are exposed regularly to the disease in healthcare contexts, may well feel that it is worth the risk, and they will likely be first in line for the vaccines once they are made available.

It is of utmost importance to bear in mind, however, that the vaccines currently regarded as most promising for controlling the outbreak of COVID-19 do not involve the time-tested approach. Rather than introducing proteins from the offending organism (or a simulacrum), the front-runner vaccines introduce foreign pieces of viral RNA (ribonucleic acid) which will instruct the person’s own body to produce the immune system-galvanizing viral proteins itself. The presence of those pseudo-foreign proteins (coded for by foreign RNA but produced within the human body) will then initiate the needed immune response. In other words, there is an extra step involved. The foreign RNA is introduced, then the person’s body produces the proteins coded for by the snippets of RNA, after which the needed antibodies will be generated by the body in response. This ingenious scheme (if it works!) involves the human body tricking itself into triggering an immune response by producing what is empirically indistinguishable from traces of the offending virus itself. What could go wrong?

Perhaps nothing will go wrong, but the fact (of science!) remains: such vaccines have never been used in human populations before. In attempting to discuss this matter with various people (civil discourse is not always possible with the “Listen to the Science” crowd, ironically), I have been amazed that there should exist persons fully prepared to agree to totalitarian control over their very own bodies while knowing absolutely nothing about the history of vaccine development. They simply do not care that the novel vaccines are novel, nor that those who volunteer to take part in the largest experimental trial of vaccines in human history are essentially offering their bodies up as Petri dishes to pharmaceutical firms. Some vaccine enthusiasts appear not even to know what RNA is and attempt to discredit anyone who disagrees with their gurus in white lab coats (most of whom have financial ties to Big Pharma), despite the fact that plenty of published literature exists on the topic of vaccine harm. Advocates for forced universal vaccination appear to be unfazed by possible conflicts of interest and are not at all bothered by the sudden appearance of Bill Gates (whose company Microsoft violated anti-trust laws) in their social media timelines exhorting everyone everywhere to get on board with the global vaccination regime [sic].

Beyond all of the factors relevant to new vaccines more generally, one can quite reasonably inquire, in this case, whether anyone should trust a company (AstraZeneca) which “accidentally” (through a “manufacturing error”) gave thousands of its vaccine trial participants only half of their first dose, reported a 90% efficacy figure, but subsequently discovered that the true efficacy rate in those fully dosed was only about 62%. In other words, in the AstraZeneca trial in question, the less vaccine the subjects received, the better they fared. None of this is to suggest that anyone should expect laboratory technicians to be perfect, for they are human. But that is part of the gamble one takes in agreeing to participate in such a study, as can be seen throughout the history of vaccine development, which has left many bodies in its wake (mostly animals of other species, but also some human beings).

The reason why the healthy Western subjects of pharmaceutical drug trials have always been generously remunerated—in the third world they are not—is that they are risking their own well-being and even life by agreeing to ingest substances with unknown side effects, which cannot be predicted a priori. Indemnity clauses are always included in the contracts for those who agree to participate in experimental drug trials precisely in order to prevent any victims (or their survivors) from seeking compensation should something go awry. It is of course possible, and one certainly hopes, that the injection of foreign RNA into human bodies will not cause any lasting harm, but the unvarnished truth is that we simply do not know what the long-range and unforeseen consequences will be, because this has never been done before.

In all of the excitement over the splendid reported efficacy rates (90%, 94% and 95+%) of the front-runners in the great COVID-19 vaccine race, I have seen no mention by anyone of the survival outcomes of placebo subject classes. Why might that be? Whenever new drugs and remedies are scientifically tested, this is done with a control group of subjects who are given not the treatment being studied, but a placebo substance, which is considered to be inert vis-à-vis the disease to be defeated. This is the only way to demonstrate that the remedy is more helpful than doing nothing at all. In the case of COVID-19, there are a few key factors to bear in mind. First, based on the death charts of the Centers for Disease Control (CDC), the World Health Organization (WHO), and many other institutions as well, it is evident that any placebo remedy which I myself decide to take—water, vegetables, vitamin C, quinine, even air—already has a 99.5% chance of keeping me alive, even if I am exposed to and become infected with COVID-19. I may, therefore, stick with the placebo for the entirely rational reason that its efficacy rate in keeping me alive is likely just as high as, if not higher than, that of any possible vaccine.
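
The arithmetic behind this point is worth spelling out, since reported “efficacy” is a relative measure derived from comparing the two trial arms, not an absolute survival probability. Using round, purely illustrative numbers (not any company’s actual trial data): if 95 of 20,000 placebo subjects fall ill versus 5 of 20,000 vaccinated subjects, then

$$\text{efficacy} = 1 - \frac{5/20{,}000}{95/20{,}000} \approx 95\%,$$

even though 99.5% of the placebo group never fell ill at all, and the absolute risk reduction is a mere $(95 - 5)/20{,}000 = 0.45$ percentage points. A 95% efficacy figure and a 99.5+% do-nothing survival rate are thus entirely compatible, which is why the placebo group’s outcomes deserve explicit mention.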

Big Pharma’s tactic of neglecting to report on the outcomes of placebo studies for its vast array of antidepressants and anxiety remedies was for years ignored. Eventually, a few courageous psychiatrists and psychologists revealed that, for many of the best-selling psych meds prescribed to millions of people all over the world, placebo subjects fared just as well and sometimes even better than those taking the drugs, particularly in long-range studies. In other words, many people prescribed psychotropics for acute cases of depression, anxiety and grief produced by life traumas such as the loss of a loved one would have improved over time, even if they had taken no drug at all. Mention of such results was routinely omitted from reports touting the efficacy of psychotropics for the perfectly transparent reason that taking no medication does not produce any profit for drug manufacturers.

Similarly, the companies touting the virtues of their new vaccines designed to save humanity from COVID-19 make no mention of placebo class survival outcomes. Nonetheless, many people have been encouraged by the reported results, relieved that at last they will be able to get back to living their lives as they please. In reality, the current misery of healthy individuals being victimized not by COVID-19 but by political policies crafted in response to the virus has no logical connection to the invention or success of any vaccine. Rolling up one’s sleeve cannot be made a condition of ending policies which do not protect but rather harm most of humanity. Instead, the policies should be ended because they never had and never will have the advertised effects.

Remarkably, when anyone dares to express skepticism about the decrees of the new COVID-19 czars, this is taken to illustrate that they need to be protected from themselves and also from harming others. Somehow we have found ourselves in a world governed by Nurse Ratched-esque individuals who repeatedly scold us for the failure of their previous policies to put an end to COVID-19 and appear ready and willing to punish us further for not agreeing to do as they say and, now, to roll up our sleeves. They call it “treatment,” and they have already purchased, using taxpayer funds (what else?), “free” vaccines for all. From the perverse perspective of these government officials, it is our fault that the virus is running rampant, and, therefore, we must line up for our paper cup on the tray. If anyone objects to being made into the subject of an experimental vaccine trial, for any of the many non-conspiratorial reasons outlined above, they are to be denounced as lunatic fringe extremists and de-platformed across social media.

This frightening transformation of citizens into subjects is now so widespread that even some business leaders are promoting the same line, apparently believing themselves to comply with what they have been told over and over again are the dictates of science. The CEO of Qantas recently announced that the airline will be requiring proof of COVID-19 vaccination for anyone attempting to board its flights. Needless to say, I will not be traveling to Australia again anytime soon, because my body is my own, and I do not agree to offer it up as a Petri dish in a large-scale clinical trial by any profit-driven company, and certainly not Big Pharma, whose amorality (at best) and manifest greed have already been firmly established through its many large-scale campaigns to drug everyone for anything—from infants to nonagenarians—with psychotropics. (Did you know that “Prozac” for dogs and cats is now a thing?)

It is precisely because of the unavoidable dangers involved that individuals, who alone will bear any negative consequences arising from their choices, must retain control over what is done to their own bodies. Yes, there are COVID-19 outliers as well: younger persons who suffer worse health outcomes than the vast majority of their peers, and it is possible that any given person will be an outlier in that sense. But there are already mountains of demographic statistics available on the dangers of COVID-19, while none whatsoever exist yet for the new vaccines. Free people must therefore decide for themselves whether the risks of taking an experimental antidote to a disease are outweighed by its alleged benefits. When authoritarian leaders and their associates in the corporate world paint themselves as benevolent, insisting that they are only trying to save the world from the dreaded disease, they are forgetting the most important quality of their constituents and customers: they are free to determine their own destinies and to assume risks which they themselves regard as rational and to reject those which they do not.

The United States Supreme Court recently upheld citizens’ First Amendment rights of religion and assembly, even during a global pandemic, and one hopes that as lawsuits continue to wend their way up the judicial chain, the grip of authoritarian policymakers will be further diminished. Human beings should never be held hostage to the demands of those promoting universal vaccination, and least of all when their own danger of succumbing to the disease in question is small. If my own chances of dying from COVID-19 were 50%, rather than less than 0.5%, then it might well be rational for me to gamble, just as many cancer victims, out of desperation, have agreed to submit to experimental treatments. But I am neither sick nor particularly vulnerable to the novel coronavirus, so I’ll take my chances with my own immune system and my preferred placebo remedy of liberty. I may no longer be welcome in Australia, but there’s always Brazil. Or perhaps I’ll go to Mars.

Existentialism, Libertarianism, and the NAP

I self-identify only as myself but have long been sympathetic with both libertarianism and existentialism. Having dealt throughout 2020 with an array of restrictions on my liberty imposed by local authorities everywhere I have been (Europe, the U.K., and now the U.S.), the primary effects of which have been not to save lives but to control how people behave, I have been thinking about existentialism, which naturally raises questions about the proper scope and role of government, bringing me back, also, to libertarianism. Both outlooks prioritize human liberty, dignity and personal responsibility above all else. I have seen next to nothing written about existentialism in recent years, perhaps because its most famous adherent in the twentieth century, Jean-Paul Sartre, was politically aligned with socialist and even communist movements. To suggest that existentialism and libertarianism are somehow related might seem prima facie odd because the latter is typically regarded as politically conservative, a right-wing, not a left-wing, view of the proper role of government. The mere mention of the word libertarian may incite ire among progressives of the “social justice warrior” stripe, and some leftists appear to derive untold delight from sardonically ridiculing libertarians as “pot-smoking Republicans”.

Another common stereotype is that libertarians must be white male landowners (why else would they care about protecting private property?!), which is of course just as simpleminded as Joe Biden’s claim that “You ain’t black!” if you have to think about whether to support him. In fact, nothing could be more racist than to assume that “authentic” black people have no real choice but to support the Democratic party. Biden’s claim was all the more disturbing given that he himself helped to author the 1994 crime bill which put thousands of people behind bars for nonviolent offenses, including many African Americans. Biden also rallied vigorously for the disastrous 2003 invasion of Iraq, which is relevant not only because a disproportionately high percentage of racial minorities serve in the military, but also because the lives of millions of persons of color were destroyed or degraded as a result of arguably the worst foreign policy blunder in U.S. history. In 2011, the Obama-Biden administration went on to wage an offensive war against Libya, which resulted in a resurgence of African slave markets. In that same year, they used lethal drones to execute brown-skinned U.S. citizens without indictment, much less trial. But who really cares about Biden’s policies? At least he is not Orange Man Bad!

Speaking of labels, Jean-Paul Sartre famously praised Che Guevara as “l’homme le plus complet de notre époque [the most complete human being of our age],” which, again, might lead some readers to scoff at my claim that existentialism and libertarianism have anything whatsoever in common. It would be a mistake, however, to confuse Sartre’s political views with the higher-order philosophical thesis of existentialism, which was most appealingly articulated by the nineteenth-century thinkers Friedrich Nietzsche, Søren Kierkegaard and Fyodor Dostoevsky, who are not coincidentally some of my favorite authors. Albert Camus, another twentieth-century intellectual, wrote a number of works which arguably reflect an existentialist outlook—including his most famous novels, L’étranger [The Stranger] and La peste [The Plague]—but Camus himself resisted that label. He certainly was not the first independent thinker in history to have refused to accept such labels, for a variety of reasons. Some among them simply do not like club-like organizations, which do on occasion transmogrify into religious cults of sorts, even when their memberships comprise what to all appearances are intellectuals.

Jean-Paul Sartre followed the lead of his nineteenth-century predecessors in famously propounding that “l’existence précède l’essence [existence precedes essence],” which is an explicit rejection of the essentialism of ancient Greek thinkers such as Plato and Aristotle. We become what we do, but that is never fully determined by the circumstances of our birth. That said, it was not entirely insane for twentieth-century existentialists to champion left-wing political causes, so long as they were convinced (as they seem to have been) that the conditions for human liberty, dignity and personal responsibility were not available to the vast majority of persons. Sartre rejected not only Aristotle’s essentialism but also his belief (apparently common in ancient Greece) that women and non-Greeks (barbarians!) were not full-fledged persons. As pretty much everyone acknowledges today, individuals denied the opportunity to educate themselves may appear to be illiterate, but that has nothing whatsoever to do with their inherent intellectual capacities. Along those lines, left-wing existentialists may insist that before anyone can make free choices, they need to have not only the potential but also the power, at least in principle, to do so. People who are scrounging around for their next meal or a roof over their head for the night may not have the energy or time to do much else.

As a result of the political activities and fame of Sartre and Camus, the existentialist waters were muddied for decades to follow, with some of those claiming Sartre as a personal hero more or less on a par with the twenty-somethings who wear Che Guevara t-shirts but never bother to read any books about him. Those who adore the iconic stenciled image of “Che,” and the implied “coolness” of anyone who displays it, might be stunned to learn, among other things, that Che Guevara personally oversaw the execution of more than 500 human beings, most of whom had been going along to get along with the Batista regime. Then again, given what might be termed “the authoritarian turn” taken in recent years by leftists keen to impose their values on everyone else, perhaps they would not be bothered in the least by Che’s homicidal creds.

The division between left-leaning and right-leaning existentialists turns most obviously on their interpretation of potential. Few would deny that it can be difficult for a person born into poverty to break out of his conditions, but it is nonetheless possible, as we know from the many people throughout history who have done just that. It is precisely the inherent dignity of human beings which drives some of them to achieve great things, and, although some will roll their eyes or snicker at this, one may with equal reason point out that many a person with a good deal of potential ended up squandering it in part as a result of the privileged conditions into which he was born. Ultimately, in a free society, the answer to the question of what persons should do with their lives comes back to the persons themselves, regardless of whether they were disadvantaged or spoiled, encouraged or oppressed.

The philosophical thesis of existentialism has no normative content—even morality is an undecided issue. Libertarianism, in contrast, champions what is sometimes characterized as the non-aggression principle (NAP) as its most fundamental tenet: initiating or threatening forceful interference with individuals and their property is wrong. In existentialism, everything is permitted. In libertarianism, everything is permitted except violation of the NAP. Libertarianism, therefore, exemplifies moral absolutism, which existentialism does not. An existentialist may adopt non-aggression as a personal principle, and he may or may not exhort others to do the same. He may or may not find fault with those who neither agree with him nor follow his lead. The existentialist may skeptically regard the NAP as an article of faith, for it must be chosen by an individual himself for himself and for his own reasons. But to claim that normative principles such as the NAP are articles of faith is not to deny their importance in how some people choose to shape their own lives.

What should we do? is not a question which can be settled by appeal to the deliverances of science, because science trades only in facts, while normative prescriptions for action are based in values, which cannot be read off of empirical reality. The paradox of morality is that you cannot argue someone into acting morally, if he does not already believe that he should, because what one ought to do can never be deduced from the way things happen to be. Instrumental rationality is a matter of fashioning means to ends, but setting those ends is up to individuals themselves—an idea championed not only by skeptics such as eighteenth-century Scottish philosopher David Hume, but also the existentialists.

The open-ended, contentless quality of existentialism is perhaps why much of what has been written by existentialists is literally literature—assuming the standard division between philosophy and literature. (I myself reject that division, but many philosophers do not.) However one distinguishes one type of writing from another, it is up to each person to decide how to interpret everything. If you choose to follow anyone else’s rules (those of your parents, teachers, the state, a religion or other group, a philosophical “school”), that is something which you choose to do—or not. “Ne pas choisir, c’est encore choisir [not to choose is still to choose],” as Sartre famously put it. Common criminals and protagonists such as Raskolnikov (in Dostoevsky’s Crime and Punishment) or Meursault (in Camus’ L’étranger) may be viewed by many as miscreants, but their comportment arises out of their individual decisions to adopt their own principles for living. They are free agents, and no one else is responsible for what they do. Yes, forces of nature and nurture act upon everyone, but we alone choose what to do and bear the primary credit or blame for the consequences which ensue.

Western democracy is generally regarded as the best available system for free persons, for it permits them to carve out their own destinies, based on their own beliefs. Everyone faces obstacles and struggles along the way, but with sufficient initiative, drive and ingenuity, some people manage to make something of themselves. The laws of modern societies prohibiting violence against other people effectively affirm the libertarian’s NAP (which is not, however, to deny that the state is itself the primary violator of the NAP, above all through war). An individual may lead his life as he wishes, provided that he does not prevent others from doing the same. If your concept of “The Good Life” requires the destruction of other human beings and/or their property, then your liberty will be restricted by the government, if you are caught. Some people do not embrace the NAP, choosing instead to rape and murder, pillage and plunder, and some among them end up in prison next to the nonviolent pot-smokers and others locked up as a result of the 1994 crime bill and related NAP-hostile legislation.

Now that recreational marijuana has been legalized in many of the United States, and medical marijuana in even more, there are plenty of pot smokers roaming free, even while others continue to languish behind bars. We also know that, although some murderers are locked up, others remain at large: one out of every three homicide cases in the United States is never solved. That may seem to be an alarming statistic to some, but it is the price that must be paid for the much worse alternative of judging everyone guilty until proven innocent. The presumption of innocence protects many more innocent than guilty people. No one should be locked up (much less executed) for their mere potential to commit crimes, and anyone who thinks otherwise is a tyrant, tout court. Some of the best works of dystopic fiction underscore the horror of a world in which everyone is constantly under suspicion and subject to arbitrary detention for whatever reason any authority may deem sufficient, solely at his caprice.

In 2020, people are being denied the freedom needed to determine their own destinies and to conduct themselves with the dignity which distinguishes them from the members of other species. In this way, COVID-World offers libertarians a glimpse into the twentieth-century existentialists’ concerns about the material prerequisites which must first be satisfied in order for persons to be able to choose what to do with their lives. Before COVID-19, people in Western liberal societies were largely held responsible for their own deficiencies and failure to fashion a good life for themselves. Now, however, people are being denied the opportunity to do what they would choose to do, left to their own devices. Effectively, those being prevented from earning a livelihood and forced to stay home are the equivalent of innocent persons erroneously convicted and sentenced to prison terms. Incarcerated persons are severely hampered in their ability to start and run businesses, and to act in other ways which might prevent them from resorting to crime in the future. They are also strictly limited in their choices of how best to flourish and thrive while inhabiting a cage.

Just as innocent persons should not be incarcerated, healthy people should not be quarantined. From the perspective of both existentialism and libertarianism, this arbitrary detention of innocent persons can be viewed as an affront to humanity. People are being told how they must live by their government, which claims to be acting for the public good but in reality is destroying countless lives. It is not the case that persons are forbidden by the government only from harming other people and their property, as an NAP-based society would prescribe. Citizens are in fact being ordered, effectively, to harm themselves, under the pretext that acting in ordinary ways may lead to the deaths of other people. How so many compliant citizens have come enthusiastically to embrace this Orwellian Covidystopia as “the new normal” is beyond me. Perhaps it is simply the logical consequence of stringent behavioral conditioning initially implemented by appeal to what we now know to have been the false claim that millions of compatriots would otherwise die. Many months later, having already accepted the endless and mercurial decrees of the Covid czars, people still terrified of the virus are willing to do whatever they are told to do without posing any objections whatsoever. Habits formed over nine months die hard, so when gurus in white lab coats such as Anthony Fauci tell them to jump, they answer “How high?”

Governments allegedly of, by, and for the people have imposed many restrictions on liberty in countries all over the planet, the primary effects of which have been to harm millions of people in the name of the small percentage of those who are vulnerable to COVID-19. It may be tempting to ascribe underhanded or ulterior motives to those who wave their science flags in defense of the new Nurse Ratched state, but there is no real need to do so, for the phenomenon can be more simply explained as fully analogous to the enthusiastic drum-beating for wars by those who have nothing to gain from them and, indeed, much to lose. The problem at this point in time is that people reside on one or the other side of the COVID-19 divide, but the policymakers are for the most part aligned, claiming the authority to dictate behaviors for all of society by appeal to the opinions of a few select scientific experts, no matter how many times they have been wrong in the past. Recall that Anthony Fauci sincerely proclaimed in a 60 Minutes interview that masks were not necessary, and in fact caused more problems than they prevented, because people wearing them tend to touch their faces more often than they might otherwise do. (And of course it is quite evident by now to any observant person that most people wear the same mask over and over again—pulling it out of and putting it back into the same pocket or purse, making the exercise purely a matter of show.) We were also told “fifteen days to flatten the curve,” but then the goalposts were changed again and again, until now, nine months later, Pennsylvanians have been ordered to wear masks whenever they leave their home and also within their residence, if anyone should happen to visit. Travel continues to be restricted and has been condemned by government authorities the world over, at both the national and state levels, despite the IATA’s (International Air Transport Association’s) calculation that the chances of contracting COVID-19 on a plane this year were one in twenty-seven million. Although some disputed that claim, the U.S. government abandoned its own health screening of persons on incoming flights because the positive cases were so low that the program was deemed cost ineffective.

Citizens stepped onto a slippery slope when, back in March 2020, they agreed to stay home, and, if necessary, not to work. They agreed to wear masks wherever and whenever this was deemed necessary by the authorities. But one restriction and rule leads to another, with progressively more absurd implications, as is nowhere better illustrated than in the State of Pennsylvania’s requirement that people wear facemasks within their own homes. Who will be enforcing such laws? (Perhaps Amazon’s Alexa can be brought on board, given that she already resides in millions of homes.) This invasion of policymakers into the private lives of their constituents, and the fact that people have not risen up in response, is a dangerous turn in the already surreal series of events constitutive of the COVIDystopic year 2020, and it must be resisted while it is still possible to do so. Beyond prohibiting domestic violence (which is one instance of enforcing the NAP), the state has no business whatsoever in any private residence. It is not the government’s business to tell human beings how they ought to live or who they should be. People need to take personal responsibility for their own health and well-being. No one denies anyone the right to smoke, to drink alcohol, or to eat fatty foods, and no one is preventing anyone afraid of the virus from donning hazmat suits. As for the rest of us, we should be permitted to shoulder the inevitable risks that come with making of our own lives what we freely choose.

The Intellectual Fraud of ‘Listen to the Science’

With the arrival of COVID-19 on the scene, many people have been seduced into believing that they must “listen to the science” and do whatever the self-proclaimed experts tell them to do. That this is charlatanry pure and simple follows from the fact that science says absolutely nothing about what we should or should not do. Those are questions of value, answers to which are provided by intelligent, conscious, and sentient human beings who thereby advance a perspective and promote their own values. Waving a “Follow The Science” flag distinguishes one not as a person of superior intellect and moral constitution but as someone who is easily duped and slings slogans as a way of covering up a lack of understanding—specifically, of how empirical science actually works. To refuse to wave a “Science” flag in support of political policies put forth by persons with specific value-laden agendas does not mean that one is a Luddite or an ignoramus but that one in fact grasps the fundamentally skeptical nature of the scientific enterprise.

All of the ongoing clamor about “the science” reminds me of what I observed while a graduate student in philosophy at Princeton University, where many of my peers seemed to believe that by specializing in areas such as philosophy of science or logic, they distinguished themselves as intellectually superior to those who wallowed in ethics or other forms of value theory. Having earned my undergraduate degree in biochemistry, conducted a good bit of research in organic chemistry, and taught chemistry at two different universities before pursuing graduate studies in philosophy, I was never vulnerable to the prevailing climate of scientism—the elevation of science as a form of religion—for I already knew what science could and could not do.

Science can tell you about the facts. Not all of them at once, and not immediately, but over time, as data is amassed and theories are proposed and rejected or confirmed. Those facts are always tentative, mere hypotheses covering very specific and limited ranges of reality. A theory of physical chemistry, for example, tells one nothing about botany, for the two types of theory cover completely different strata and phenomena. What are believed to be scientific facts are always subject to disconfirmation as more data is accumulated over time and better theories emerge. Apparently recalcitrant data must be somehow explained away by the best confirmed current theory, and when that proves impossible to do, then the theory must, rationally speaking, be abandoned.

Scientists throughout history have clung religiously to their favorite theories (especially those devised by themselves), but eventually, as new generations of scientists emerge, older theories become amenable to revision and even wholesale rejection by researchers not religiously devoted to them. It is not easy to do such a thing because one risks offending the true believers, some of whom may wield extraordinary institutional power and will vehemently resist suggestions to the effect that they are wrong. No one wants to believe that they have devoted their entire professional career to the elaboration of a theory which was false all along.

Philosopher Thomas Kuhn wrote a gripping book, The Structure of Scientific Revolutions (1962), about the social and psychological dynamics involved in theory construction and testing, the nuts and bolts of the scientific enterprise, which, like it or not, is conducted by human beings, with all of their foibles. It seems safe to say that the COVID-19 cheerleaders for The Science™ have never read the work of Thomas Kuhn. To refuse to subject data to scrutiny, to decline to reevaluate initial hypotheses, naïvely accepting instead the prescriptions of select gurus on faith, even in the face of overwhelming evidence that they were wrong, is to succumb to the charlatanry of scientism, not to champion science.

Not everyone accepts Thomas Kuhn’s rather derogatory depiction of how scientists operate; some prefer to uphold the image of scientists as supremely rational and objective analysts. But even if Kuhn’s picture were an exaggeration—some would say a caricature—even supposing that scientific hypothesis testing were some sort of supremely rational and objective endeavor, what could even the best confirmed and most widely accepted theories of science tell us about what we ought to do? The answer is: absolutely nothing. For a scientific theory’s having survived intact over a reasonable period of time does not alone dictate anything whatsoever about human action. To suggest otherwise is to commit what is known in philosophy as “the is-ought fallacy,” usually credited to David Hume, an eighteenth-century Scottish philosopher with a skeptical bent. Facts are one thing; normative prescriptions for action are quite another. People blinded by science (who I have noticed tend to be those with no higher education in science), those who, like Milgram’s unwitting experimental subjects, accept the decrees of men in white lab coats and decline to examine the values and interests being promoted by them, have simply been duped. A most stunning aspect of this intellectual submission (which has analogues in foreign policy as well) is that subjects are persuaded to believe that conflict of interest is somehow impossible among scientists—despite being possible in every other realm. Why are scientists supposed to be untainted by worldly temptation? Because they are scientists! As though human beings did not choose to become scientists.

To see the distinction between the deliverances of science and the promotion of values, consider one example of a fact widely considered to be true, based on many decades of data collection. Science tells us that smoking will greatly increase the chances of one’s dying prematurely. One’s decision whether to smoke or not, however, depends on one’s values. If you find the pleasure of smoking great enough, then you may simply not care today that at the terminus of your life some number of years will likely have been shaved off as a result of your insistence on smoking. (No guarantee, of course. There are examples of chain smokers who somehow beat the odds to become nonagenarians or even centenarians.) All things considered, you are much more likely to die of a lung-related illness if you smoke than if you do not. In fact, all activities in which human beings engage involve risks along with benefits. Each individual must make his own choices for his own life about which benefits do and do not outweigh the risks incurred in doing those things—driving, drinking, rock climbing, flying, scuba diving, traveling to countries where violent crime is prevalent—the list goes on and on.

What has happened in 2020 is that a few COVID-19 policymakers have decided for all of humanity that the risk of dying from COVID-19 outweighs all other considerations about what we ought to do. This is a value judgment, pure and simple, yet it has been fobbed off as some sort of “expert” wisdom. Those who crafted the initial responses to the virus, beginning with the very labeling of COVID-19 as a pandemic, have rallied the “listen to the science” troops for many months, with the result that their stance has become very difficult to challenge. Few of them seem capable of assessing the new data and revising their theory as the scientific method would require. Despite adamantly claiming that they “listen to the science,” they fail altogether to recognize that science is not a static, eternal totem, but a method used to marshal a dynamic, metamorphosing body of hypotheses. The irony, of course, is that the most vociferous denouncers of anyone who questions the gospel are conducting themselves in the manner of religious fanatics incapable of admitting that mistakes may have been made.

Thus we find that, despite the absence of any evidence whatsoever for the efficacy of lockdowns, and despite a recent pronouncement by the World Health Organization (WHO) that lockdowns have side effects which vastly outweigh any alleged benefits, the lockdowns of Western states, along with border restrictions and quarantine requirements, continue on, with local authorities tweaking their policies only slightly whenever they decide that the latest “case” tally is too high. No matter that different kinds of tests are administered differently and to different groups in different places. No matter that the very accuracy of the tests has been impugned. No matter that there is no other example of a respiratory disease (to my knowledge) for which one may repeatedly test positive as “infected” while manifesting no symptoms. No matter that cases in younger persons are rarely fatal, yet serial, obligatory testing of college students continues on. The COVID-19 gurus have decided that a case is a case. None of the death data matters because these people, who never understood the scientific method in the first place, much less the fact-value dichotomy, continue to claim that The Science™ is on their side and that those who disagree are selfish and illiterate ignoramuses. In the United States, the people of California, Michigan, Massachusetts, and other states have had to endure severe restrictions of their liberty and much economic hardship for eight months, with no end in sight. Across the pond, both Wales and Ireland, along with various counties in England, recently re-imposed strict lockdowns as a form of “circuit breaker” after surges of cases in some places where no or nearly no new COVID-19 deaths had been reported.

Proclaiming that we must “listen to the science” has become the worst type of virtue signaling on the part of people, many of whom have nothing to lose from the lockdowns (their own financial security being immune to whatever policies are imposed). Shutting down the hospitality and tourism sectors of entire cities, counties and countries causes untold harm to anyone working in the gig economy, and yet the victims are themselves portrayed as immoral for refusing to sing along with the cheery refrain, “We’re all in this together!” Few among the populace have been able effectively to press these points, because the media and tech industries have overwhelmingly joined forces with the COVID-19 policymakers, promoting The Science™ company line while silencing those who demur. Needless to say, there is nothing more unscientific than censorship, for the scientific enterprise requires a continual reassessment of the facts. When new hypotheses are forbidden because they conflict with what one believed to be true, then science has come to a screeching halt.

Even more devastating than the effects in Europe, Britain, and the United States are the same policies enacted in third world countries by leaders who emulate Western politicians religiously committed to their initial responses. The same lockdown and quarantine “strategies” have been implemented in places where they could never, even in principle, diminish the incidence of COVID-19 death, even if it were true—which is not supported by data—that lockdowns worked in the West. In countries where large populations live in extremely close proximity to one another in open-air shanty towns—places such as Brazil, South Africa, Kenya, India, and many other countries as well—there is no chance that staying in one’s hut is going to prevent transmission of the dreaded disease. Meanwhile, police have killed persons for violating emergency laws which in no way serve the people’s interests. But to understand how absurd it is to impose curfews and quarantine requirements on the residents of shared outdoor space, one would have to be familiar with basic concepts of molecular entropy, which we know from the many closed beaches and outdoor mask requirements in the West are altogether beyond the capacity of the COVID-19 gurus to comprehend.

Perhaps the grandest irony of all is that, by focusing exclusively on the hope of minimizing the deaths of the small percentage of the population vulnerable to the dreaded disease, the medical professionals who have been advising the COVID-19 policymakers have violated the most sacred oath of physicians: Do No Harm. Lockdown policies have harmed every person whose risk of death by other causes has been increased by preventing them from doing whatever they would have done, left to their own devices: working, visiting the doctor, and engaging in normal social activities which make life worthwhile, including interacting with family and friends.

With regard to scarce resources and policies which affect entire populations, science is silent about who should and should not be saved. Should limited health resources be dedicated to the mass testing of young people not at serious risk from COVID-19? Should healthy children at nearly no risk of death be used in experimental vaccine trials? These are value judgments about which science has nothing to say. Anyone who suggests otherwise is a shyster or confused, and anyone who believes that men in white lab coats should be the ones to answer such questions has been fooled.

Destroying the Village to Save It: Government Overreach in Fearful Times

After months of lockdowns, border closures, and inconsistent injunctions issued by local authorities to protect some of their constituents by severely limiting everyone’s freedom not only to move, but also to act, and even to speak, the time has arrived for a robust discussion of the proper scope and role of government. The range of “emergency laws” being imposed by authorities all over the world in order to stem the tide of COVID-19, or to prevent so-called second waves of the illness in countries where it has already taken a steep toll, is amazing to behold. I imagine that more and more of these laws will be overturned in Western liberal democracies as lawsuits force judges soberly to confront the mountain of statistics being amassed. One hopes that they will find ways to objectively assess the real danger of the disease (relative to other causes of death) rather than continue to permit government administrators to base their abrupt and arbitrary policy changes on scary-sounding “case surges,” which have thankfully not been followed by surges in deaths. It is unclear to me why anyone would ever have worried about a second wave of deaths to begin with, given what we now know, and what the hard-hit Italians at the outset certainly did not, about the discriminate targeting of the disease. But, alas, fear acts as a powerful vise on the minds of even intelligent beings.

It is surprising that so much emphasis has been placed on cases, rather than deaths, because nearly everyone now does seem to know that many “infected” persons show only minor or no symptoms at all. The CDC (Centers for Disease Control and Prevention) itself included a statement to that effect in all of its early reports on the new coronavirus, back when no one really understood what was going on, and it seemed a matter of simple prudence to do whatever “the experts” decreed. Hardly anyone seemed to wonder at the time why, if COVID-19 was a genuine pandemic, the CDC would be stating, almost in passing, that “most people” would not be adversely affected by it in the least. So is it a pandemic? Or is it not a pandemic? Here is the definition of pandemic in the Merriam-Webster dictionary:

pandemic = an outbreak of a disease that occurs over a wide geographic area (such as multiple countries or continents) and typically affects a significant proportion of the population

In war theaters, those running the show have always hedged their bets, and the same thing is happening today in the theater of COVID-19. Better to err on the side of caution! appears to be the thinking, as at least some politicians must have believed when in October 2002 they granted President George W. Bush the authority to wage war on Iraq whenever he pleased, with no further need to consult the legislature. Once war has already been waged, the citizenry tends to line up behind leaders in a show of solidarity, even when it becomes indisputable that they have no idea what they are doing, as in the cases of Vietnam, Iraq, Afghanistan, Libya and Syria, to list only a few of the many catastrophes in recent U.S. military history. Sadly, the same seems to be true in the “war” on COVID-19, as some have characterized it, and not without reason.

When authoritarian measures are implemented in the name of national defense, we are right to examine whether those measures actually promote rather than undermine our own interests, as in the case of the twenty-year War on Terror. The summary execution of U.S. citizens without indictment (much less trial) was carried out under the authority of President Barack Obama before the not-so-critical eyes of the populace, most of whom did not even blink. Mass surveillance of all U.S. citizens—and, indeed, anyone, anywhere in the world—was justified in the minds of government leaders because of the danger supposedly posed to “our way of life” by violent terrorist groups such as Al Qaeda and ISIS. “They hate us for our freedom” became an oft-parroted trope, despite the ample evidence that, in truth, they hate us for our bombs, which leaders continue to this day to lob, killing innocent people while disrupting and degrading societies in lands far away.

The ever-proliferating “emergency laws” penned in response to the COVID-19 virus reflect a similar sense of urgency among bureaucrats. When persons are swabbed and then effectively punished (quarantined) for having “failed” the COVID-19 test, some among them are understandably baffled. One anecdotal case among thousands is that of my uncle, who needed to have surgery for the removal of a painful kidney stone but was forced to wait two weeks as a result of his positive COVID-19 test, despite exhibiting no symptoms whatsoever of the dreaded disease and personally suffering only from his kidney problem. There are much worse cases, of course, which involve potentially fatal illnesses: cancer, stroke, heart attacks, and the like. Nevertheless, asymptomatic patients continue to be denied access to treatment until they have first survived a quarantine intended to protect other people from death.

Making matters worse, there appear to have been many instances of false positive tests for COVID-19. Indeed, by some estimates, a large proportion of those who test positive but do not exhibit symptoms are not even contagious. A bit of inactive COVID-19 debris (or “dust,” as it might be termed) may lead diagnosticians to red-flag patients who are not dangerous in the least. The testing of people varies from place to place, with local authorities determining not only who should be tested but also what the threshold test sensitivity should be. These judgments are made on the basis of whatever strikes them—in consultation with their local “experts”—as relevant at the time. All of this makes it very difficult to know what any of the case surge reports actually mean. Many of the abrupt increases in new cases are obviously accounted for by the implementation of robust testing programs, particularly in places where no or very little testing was being done before. Yet government administrators continue to craft new quarantine, lockdown, mask and social distancing requirements based on The Science™, because they do not know what else to do. Border restrictions on people hailing from countries with unacceptably high infection rates (in England, the magic number is 20 or more cases per 100,000 inhabitants) continue to be used to prevent entire populations from entering other countries. In this way, all people of such nations are being effectively punished as though everyone living there were infected.
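
To see why false positives loom so large when prevalence is low, consider an illustrative application of Bayes’ theorem (the numbers are hypothetical, chosen only to exhibit the structure of the problem, not drawn from any particular testing program). Suppose that 0.5% of the people being swabbed are actually infected, and that the test has 90% sensitivity and 99% specificity. The probability that a person who tests positive is actually infected is then

$$P(\text{infected} \mid \text{positive}) = \frac{(0.90)(0.005)}{(0.90)(0.005) + (0.01)(0.995)} \approx 0.31.$$

Under those assumptions, nearly seven of every ten positives would be false, which is precisely why the chosen testing thresholds and the populations selected for testing matter so much to what any “case surge” actually means.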

The government of Spain, no doubt viewing itself as taking extra precautions to protect its population, has gone one step further, refusing entry even to Americans residing in so-called corridor countries (deemed safe) who have not been in the United States since the crisis began. So what is the health pretext in that case supposed to be, exactly? It is also worth noting that the tit-for-tat restrictions being implemented by countries (where one slams down a quarantine requirement, and then the other follows suit, preserving reciprocity) would seem to be based purely on politics, not public health. It makes no sense whatsoever for a country with a higher rate of infection to bar entry of people from a country with a lower rate of infection, who, by coming, would lower the host country’s rate of infection, would they not? No, I am afraid that the numbers do not bear this out, for any changes in infection rate would be on the order of rounding errors. If in a given country 21 people out of 100,000 are COVID-19 positive, even assuming that they are contagious (which many may not be), then what is the probability that any one person on a 200-passenger plane originating from that country will be a carrier? I leave this calculation as an exercise for the reader.
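
For what it is worth, here is one back-of-the-envelope version of that exercise, under the simplifying (and generous) assumption that the passengers are a random sample of the country’s population. With a prevalence of 21 per 100,000, each passenger is positive with probability 0.00021, so

$$P(\text{at least one carrier aboard}) = 1 - (1 - 0.00021)^{200} \approx 1 - e^{-0.042} \approx 0.04,$$

roughly a 4% chance per flight, or an expected $200 \times 0.00021 = 0.042$ carriers, on the order of one per two dozen flights, and that before discounting the positives who, as noted above, may not even be contagious.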

Having recently watched a few pandemic movies (Contagion, Outbreak, 93 Days…), I have come to suspect that the primary problem with the new COVID-19 czars is that they are basing their policies on such apocalyptic portrayals, under the assumption that a pandemic is a pandemic. It has become abundantly clear that many of these people are altogether devoid of basic statistical analysis and critical thinking skills. As a result, they are indeed hedging their bets by waving their “Science” flags, under the assumption that anything bad enough to be labeled a “pandemic” by the World Health Organization (WHO) could kill us all. Thus we have, on opposite sides of the planet, the prime minister of Australia and the governor of Michigan proclaiming that emergency measures will be necessary until such time as an effective vaccine is readily available. The hubris of such a pronouncement is awe-inspiring. These leaders seem to believe that by wishing hard enough and pouring enough resources into labs all over the world—whatever it takes!—we can and will eventually defeat The Evil Enemy with a manmade vaccine. Alas, reality does not always conform to our wishes, and hoping for a safe and effective vaccine is one thing, while developing and testing one is quite another. There have, in fact, been attempts in the past few decades to develop vaccines against other coronaviruses, including SARS, with no success.

The movies in which pandemics are The Evil Enemy present truly existential threats to humanity, unlike COVID-19, which specifically targets the aged and the infirm (usually both at the same time). Proponents of lockdowns and severe restrictions of movement and activity are reacting to COVID-19 as though everyone has a 99% chance of dying if they become infected, when in fact that is much closer to their chance of surviving. So if COVID-19 is nothing like Ebola (which can kill as many as nine out of ten of the people it infects), then why are policymakers acting as though it is?

Consider Victoria, Australia, where the government has imposed one of the strictest lockdowns on the planet in response to an outbreak of cases in Melbourne. The people in that city are living under martial law, with police storming the homes of “criminals” who “incite” illegal behavior by encouraging others to attend public gatherings in order to protest the lockdown, mask mandate, curfew, and social distancing requirements preventing them from living their lives with any semblance of normality. How did the Australian government know that people were “inciting” such “criminal” behavior? Because the “perpetrators” posted their views on Facebook. In the United States, Northeastern University suspended eleven students for violating social distancing dictates by partying at a nearby hotel (Note: they were not on campus). The recalcitrant students will not be refunded their tuition and fees and are barred from returning for the year.

Some may protest that I am making trivial objections. ‘World travel is a luxury and a leisure activity. Better to stay home and play it safe than to die! College students do not need to party! They should stay in their rooms and hit the books, helping others to survive!’ But many businesses have also been fined or shut down for violating an ever-mutating array of regulations and requirements. Small business owners and contract employees have suffered enormously through the lockdowns in places where they are ineligible for government assistance, and thousands of small businesses will never recover. Perhaps it will seem impolite to point this out, but it is nonetheless true that the individuals laying down the new laws have salaries which will never be disrupted, no matter what they do. They will not be losing their jobs and will not be rendered homeless, no matter how long the lockdowns remain in place, and no matter how often the rules for businesses are changed.

As an indirect result of political measures implemented to combat COVID-19, suicides are on the rise (including among people who are retired), and cancer deaths will soon be, too, thanks to severely restricted access to medical care, especially during the first months of the crisis. Some of the measures taken by governments to combat the dreaded disease have directly ended rather than protected their citizens’ lives. Consider the recent raid by Peruvian police of a Lima nightclub operating in violation of curfew and social distancing edicts. In the rush to leave the place rather than be arrested, thirteen people were trampled to death. Assuming that the people at nightclubs tend to be on the younger side, their chance of dying from COVID-19 is much less than 1%. There were about 120 people at the nightclub, more than ten percent of whom are now dead.

Six months into the crisis, many of the multi-million dollar facilities constructed to accommodate the expected flood of critically ill patients have been shuttered (some having never been used). Nonetheless, many citizens seem to be thoroughly convinced that the extreme measures which continue to be implemented worldwide—even in places where much of the populace depends on tourism to survive—suffice to demonstrate that the danger is real. Just as in the case of war, the harsher the means employed, the more fervently the people paying for them come to believe in the cause. What is the alternative? To accept that one was completely and utterly duped? There is a lot of conspiracy mongering going on, no doubt an effort to make sense of the massive, concerted, global apparatus erected to combat a disease less dangerous to most people than is the seasonal flu. Conspiracy theories have swept in to fill the epistemological void because, some are convinced, there must be some reason, some agenda, some plan (“Plandemic”) devised by a cabal of evil and mercenary geniuses (think Dick Cheney, the consummate war entrepreneur) who stand to profit and gain control of the ignorant masses at the same time. Otherwise none of this makes any sense.

Certainly there are agents involved (Bill Gates, Anthony Fauci, the CEOs of pharma firms, et al.) who have self-interested financial motives to create, produce, and distribute 9 billion doses of a vaccine. Suppose, further, that COVID-19 mutates, making it impossible for a person’s natural immune system to provide protection for more than a few months at a time. What if, like the common flu, COVID-19 presents new variants each year, and governments decide (as some have hinted) to require everyone everywhere to line up not only for flu shots but also for the latest and greatest COVID-19 vaccine? As improbable as that may sound, the State of Massachusetts decreed in August 2020 (amidst a flurry of new “emergency laws”) that all schoolchildren and university students (both undergraduate and graduate) are now required to have seasonal flu vaccinations, despite the fact that the CDC itself reports an efficacy rate of 19% for the 2019 vaccine (the five-year range is from 19% to 48%). Imagine, then, that this requirement were expanded to include a jab for flu and a jab for COVID-19 for everyone. That would obviously be the biggest Big Pharma coup of them all—far surpassing the medicalization of ordinary troubles to which human beings have always been susceptible and which, since the U.S. launch of Prozac in early 1988, have been increasingly addressed through the popping of psychotropic pills. The lockdowns alone are likely to cause a huge surge in patients seeking a bit of help from their doctors to allay anxiety and stress in this ever-more uncertain world, where it has become nearly impossible to make any long-term plans involving anything beyond the perimeters of one’s own home. Or tent.

The reason why conspiracy theories are flourishing is not only that people have too much time on their hands and nowhere to go. The truth is that the experts do not agree. Some maintain that shielding children from all germs will make it difficult for them to develop hardy immune systems; others deny that this is the case. Do lockdowns help, or do they not? (See: Sweden and South Dakota.) Is herd immunity possible, or is it not? If it is not, then why would anyone hold out hope for a safe and effective vaccine to be developed, tested, produced and distributed before the virus, of its own accord, turns into something else or runs out of steam? Does anyone truly believe that the virus is going to exhaust its source of elderly and infirm targets and then mutate, in an unprecedented display of viral intelligence, so as to be able to target toddlers? In a climate of fear stoked over many months, The Evil Enemy comes to seem much bigger and more powerful than it is.

Once again, the case is not unlike recent foreign policy initiatives rationalized on the grounds that we must take the battle to the enemy before they have the chance to come to U.S. shores. I suppose that one positive consequence of COVID-19 is that nearly nobody fear-mongers about terrorism anymore, as there is a new, bigger, badder bogeyman in town. Which is not, however, to say that the Middle East is not being bombed on a regular basis, just that even fewer journalists talk about it than before. In fact, there is not a lot of non-COVID-19 talk going on at all among media pundits. Across social media, people have already picked sides and spend their time denouncing as stupid anyone who happens to disagree. Again, this may have much to do with the fact that, having once invested in something, having been true believers, it becomes very difficult to admit error in the face of even overwhelming evidence to that effect. Politicians will continue to uphold their policies even as they destroy the lives of countless human beings. It happened in Vietnam, and it is happening today. To save the village, must it be destroyed?

There is no question that vulnerable people incapable of protecting themselves should be protected from COVID-19, because vulnerable people incapable of protecting themselves should always be protected by decent societies. But there are rational limits to the forms which that protection can take. Are terminally ill patients being helped by being denied the right to spend the last days and hours of their lives with their loved ones? Are independent seniors forced to live like recluses being helped by policies which prevent them from having any visitors? I think not.

What is being overlooked by policymakers is that there is much more at stake than simple existence. The “village” currently under siege is the social sphere. We are asked to wear masks, stay away from each other (no hugs or kisses!), avoid interacting with people beyond our “bubble,” and not go anywhere unnecessarily. From the perspective of lockdown proponents, all of these measures are minor inconveniences in the face of a much worse consequence, should we fail to comply: death. So we see children in schools wearing masks and sitting at Plexiglas-shielded desks to avoid the horror of anyone’s tiny drop of spittle hitting anyone else in the eye. In fact, hardly any of those children would die, even if all of them were exposed.

‘We will do anything necessary to prevent even one death!’ proclaim some of the COVID-19 czars, apparently oblivious to the fact that human beings die all the time. They want to protect the grandparents of the children, when in fact the grandparents are perfectly capable of deciding what are and are not acceptable risks to themselves. For some, interacting with grandchildren is a primary source of joy. Being retired, they look forward to nothing more than spending time with their extended family. That is a choice which they can and should be able freely to make. And, lest we forget, children are vulnerable, too, not to the dreaded disease, but to the climate of fear in which they are currently being raised. Schoolchildren forced to wear masks do not see their peers’ smiles and frowns, and hear only their muffled words and laughs. Some of them may avoid socializing at all because it has become not only strange but also prohibitively difficult to do. They have less reason than ever before for putting their iPhones away.

Before COVID-19, people who washed their hands a hundred times a day and avoided contact with others for fear of contracting diseases were diagnosed as germophobes suffering from obsessive compulsive disorder (OCD). Human beings who scrupulously avoided social gatherings were said to suffer from social anxiety disorders. Now, however, social distancing requirements in venues as banal as grocery stores are causing people to behave as though their fellow shoppers were suffering from the Black Plague. In some places, store clerks upbraid customers for violating one-way arrow requirements when they run back to pick up a forgotten carton of milk before returning to their place on one of the circular floor stickers at the checkout line. (Yes, that happened to me. Yes, I was wearing a mask.)

Many people have accepted all of the new restrictions on behavior as “the new normal,” and two weeks of this sort of thing may not cause lasting harm to anyone. Six months, however, is a significant portion of a child’s life, and we have experts today forecasting that emergency measures will persist well into 2021 or beyond. But should the existence of a virus, which may or may not ever go away, be used as the pretext for dictating how conscious, intelligent, free creatures should live?

COVID-19 and Collateral Damage: Killing vs. Letting Die

The question of killing versus letting die has long been a source of puzzlement to me, particularly as it arises in rallies for so-called “humanitarian” wars abroad. Wealthy nations regularly “allow” people to die all over the world—of disease and starvation, as a result of natural disasters, etc.—so how, I have often wondered, does the professed desire to improve the lot of nonnationals serve to rationalize the dropping of massively destructive bombs upon their homelands? Assuming the most charitable of all possible scenarios (as unrealistic as that may be), even if leaders have the best of intentions, some of the innocent people living in places being bombed will die as a direct result of the military intervention, not the danger allegedly necessitating the use of deadly force abroad. Such deaths are written off as “collateral damage” and the policymakers thereby exonerated in the minds of nearly all of the people who paid for the bombing campaigns.

The case of COVID-19 has begun to raise the question of killing versus letting die, for some of the persons allegedly being protected by the government are being or will be killed not by the disease but by policies enacted to combat the disease. Notwithstanding the stentorian outcry of government interventionists thoroughly convinced of the necessity of lockdowns, closed borders, universal vaccination and face masks, the situation is not at all black-and-white. Indeed, it is quite complex. Everything turns on the contentious concept of “preventable deaths.”

The number one killer in the world, according to the World Health Organization (WHO), is heart disease. No one chooses to die of a heart attack, but is heart disease a case of sometimes preventable death? Obesity is a major contributing factor to heart disease, so one might not unreasonably suppose that if people were permitted to eat only lean proteins, vegetables and fruits, along with whole grains, and if they were prevented from consuming fatty foods and highly processed sugar-laden snacks with no nutritive content (beyond calories), then the incidence of death by heart disease would diminish. This could be accomplished most straightforwardly by outlawing the offending foods and imposing government-enforced portion control. No state has to date prohibited or limited the production and consumption of fried foods, ice cream, and doughnuts. Which is not, however, to say that no one has ever tried something along those lines. When former New York City Mayor Michael Bloomberg attempted to outlaw large-volume sodas (defined by his administration as exceeding 16 ounces), the law was struck down as unconstitutional. In free societies, it is up to individuals, not executives, to decide what and how much to eat and whether the risk of dying of complications arising from obesity—not only heart disease but also diabetes and other problems—is worth the freedom to choose what to consume.

One meme circulating around the internet shows a severely obese patient in a wheelchair wearing a mask and lashing out at a young, thin person for not wearing a mask. The meme is intended as a not-so-gentle reproach of those who created the negative health conditions which make them personally vulnerable to COVID-19. The insinuation is that severe lockdown and quarantine measures are problematic in a free society in part because some of the people vulnerable to COVID-19 have health-risk factors to which they themselves contributed. Obviously no one should hold octogenarians and nonagenarians “morally responsible” for the age-induced fragility which makes them more likely to succumb to respiratory infections than are younger, hardier persons. But surely there are also some people who suffer from obesity, diabetes, and other risk factors for which they, too, are not fully responsible, given their backgrounds and, in some cases, genetic predispositions. Bad choices are bad choices, but when they are made in part because of how one was raised, say, by parents who made similarly bad choices (and going back perhaps generations…), then there is some cause for restraint in judgment.

On the flip side, some people should not wear masks: not only young children but also persons with asthma and other respiratory conditions. While in the Dublin airport, where I had to fly in order to get to Wales (absurdly enough—because the Austrian government had decreed the United Kingdom a red zone), I noticed placards around the bathrooms pronouncing that “Not all disabilities are visible.” The signs were most likely intended to prevent anyone from upbraiding persons using handicapped bathroom facilities who by all appearances are perfectly normal. But the message applies now, too, to people who do not wear masks because of their pre-existing health conditions, to which only they and their doctors are privy. Angry mask-wearers who shout shoppers out of stores for not covering their faces simply assume that they are not doing so because they are selfish or stupid (or evil or ignorant), when in at least some cases those people should not be wearing masks because doing so would be more dangerous to them than is COVID-19.

No matter what people do, death by disease cannot be fully eradicated, but other categories of death would seem to be preventable. Take the obvious example of traffic accidents, which have been discussed quite a bit on social media in recent months. If there were no cars, then there would be no car accidents and, therefore, no fatal car accidents. Were driving made illegal, road traffic injuries, which account for more than a million deaths each year worldwide, would come to a screeching halt. Despite knowing the risks involved, people choose to continue to drive vehicles. Despite the evident perils of motorcycles, which afford no protection in collisions with cars and trucks, people continue to choose to ride them. In some places, seat belts are required by law and motorcyclists must wear helmets (on pain of punishment for refusal to comply). Yet there are people who ignore those laws, unconcerned as they are about the increased risk of death which they will thereby face, and knowing that they will likely be fined if they are caught. Those are the rogues, of course, but even some of the people who do wear seat belts and helmets will be killed in traffic accidents, not to mention the many pedestrians who endanger themselves every time they cross a street. These activities are inherently dangerous to greater or lesser extents, depending upon the place and population density, but rather than outlaw all personal vehicles everywhere, governments permit individuals to assume the risk involved in activities which may tragically end in their deaths.

New Zealand has been heralded by some as a “success story” in the global battle against COVID-19, for the country imposed a complete lockdown of residents and slammed its borders shut with the result that hardly anyone in the country has succumbed to the disease—as of August 19, 2020, the grand total of deaths ascribed to COVID-19 is twenty-two. This makes New Zealand an interesting case to consider in thinking about the analogy to fatal traffic accidents. I say this because, in recent years, debate has raged over fatal auto accidents in New Zealand caused by foreign drivers. Like COVID-19, such cases tend to command a great deal of media air time, contributing to the perception of grave danger to the people of New Zealand. In 2016, there were twenty-six fatal accidents in which foreign drivers appear to have been at fault, and by 2019 the total number of traffic fatalities approached 400. At least some of those deaths were caused by foreign drivers, even if the perceived danger is higher than the reality.

The government of New Zealand might have prevented at least some of the fatal accidents by placing a moratorium on nonnational drivers: barring them from renting cars, exacting severe penalties upon visitors who borrow cars from their friends, and punishing residents who furnish cars to visitors. But this has never happened. Before COVID-19 (a period which we may in the future refer to as “B.C.”), despite knowing that foreign drivers from places where traffic flows down the right-hand side of the street do occasionally drift over the line on the sometimes steep and winding roads of New Zealand, thereby directly causing head-on collisions culminating in preventable deaths, the government of that nation permitted foreigners to rent vehicles and drive, even while knowing that some Kiwis (New Zealand nationals) would die as a result.

Now, with the sudden appearance of COVID-19, most foreigners are no longer allowed to drive in New Zealand for the simple reason that they are no longer permitted to travel to New Zealand. There were no doubt visitors around when the borders closed, and some may have decided to hunker down and wait for the virus to go away, but the moratorium on new tourists means a sudden and significant reduction of foreigners renting cars and killing Kiwis in New Zealand. Win-win! Well, except for the thousands of poor souls who are now out of work because twenty-two people in New Zealand died of a virus. The thinking among the powers that be, of course, is that if not for the severe lockdowns and restriction of liberties, many more people would have died there by now. Unfortunately, despite the refusal of Sweden to lock down, we do not have as a test case any place where elderly care facilities were competently protected while the rest of the populace was allowed to roam free. Note, however, that Sweden’s per capita COVID-19 death rate is still lower than that of some countries which did impose months of severe lockdowns.

After the recent discovery of a small outbreak of new cases (not deaths, mind you, but cases), the New Zealand government extended its lockdown of Auckland again and went one step further down a slippery slope, adopting a national policy of forcibly placing persons who test positive for COVID-19, along with their families, in quarantine camps. National elections have been postponed for a month as well. Authoritarian habits die hard, and one might surmise that once bureaucrats begin crunching the numbers of actual deaths caused in New Zealand by foreigners, they will eventually conclude that if foreigners are ever permitted to return for vacations, they should not be allowed to drive. In reality, that would and could happen there—and, frankly, everywhere—if and only if all of the new COVID-19 czars had some sort of consistent principles and worldview, which clearly they do not.

For example, while in Austria for more than half of 2020, I was surprised to find that smokers were permitted to puff away in public places, even though it was impossible to do so while complying with the Mund-Nasen-Schutz (face mask) requirement imposed in response to the arrival of COVID-19. Apparently, then, it is fine with the Austrian government for people today to induce in themselves lung cancer in the years to come, while endangering other residents with both second-hand smoke and COVID-19 simultaneously, but healthy nonsmokers not at significant risk of death from the virus are required by law to don face masks. If the risk aversion demonstrated by government bureaucrats in the face of COVID-19 were applied consistently, then cigarettes and personal automobiles would need to be altogether banned, in order to save people from themselves.

At first glance, smoking might seem to be a more straightforward case than obesity, for no one needs to smoke to survive, while all people must eat. Many human beings succumb to death by lung disease each year, usually as a result of having smoked. The dangers of smoking have been well-documented, and this information is now clearly printed on every pack of cigarettes, along with accompanying photos frightening enough to be screen shots from a horror film. And yet, some people continue to choose to smoke, and many continue to die each year of lung cancer and emphysema induced or exacerbated by smoking. Who is ultimately responsible when citizens die of such preventable deaths? Is it the manufacturers and distributors of cigarettes? Is it the government? Is it the voters who elect the government? Is it those who stand idly by watching others act in ways which endanger their own and in some cases other people’s health? Or are not individuals themselves ultimately responsible for what they do and thereby become?

The truth is that we never really know how and why people became the way they did, nor why they do what they do. This is equally true for those who choose to smoke, to overeat, to ride motorcycles without helmets, and to drive while intoxicated or on steep mountain roads overhanging cliffs even when the traffic rules are the opposite of those to which they are accustomed. Given the many complex factors involved in our choices, each one of which contributes to who we finally become, the default position is generally regarded as one of personal responsibility, at least in Western liberal societies, where people are free to drink themselves to destruction or to gamble their lives away in other ways, whether literally or figuratively.

Contradictions abound in the Animal Farm-esque world of COVID-19 because different government officials the world over, and within large countries such as the United States, have very different views on what is and is not reasonable to ask of citizens. Quarantine, border restrictions and testing requirements change on a daily basis, and it is difficult to resist the suspicion that much of what has gone on since the height of the crisis, in the spring of 2020, is purely the result of opportunistic politicians’ attempts to do something, do anything, so that they can take credit when the virus finally disappears.

In the current terror-tinged global pandemic milieu, where self-proclaimed “experts” are a dime a dozen, I continue to puzzle over why people are not simply being permitted to act on their own beliefs. Is that not the very basis of conscience? If anyone is truly terrified of being in the presence of unmasked persons, I would heartily exhort them to stay at home and do all of their shopping online. Just as in the case of drunk drivers and motorcyclists with no helmets, there will always be people who do not do as they are told—or as you believe that they should. If you decide to interact with those people (for example, by driving), that is a choice which you make. To those who would protest that, in the case of COVID-19, many people are ignorant of the relevant scientific literature, or “The Science™,” I would counter that the very same argument would lead to the conclusion that representative democracy should be abolished. Certainly the manifest ignorance of both voters and elected officials in interpreting statistical data has become undeniable in recent months, with apparently intelligent people reading “death rates” of critically ill persons already in hospital intensive care units as applicable to the population at large.

Plato observed more than two thousand years ago that democracy is the second-worst form of government—after tyranny, which is the system under which an executive is free to issue arbitrary edicts at his own caprice. The last bulwark against tyranny today remains a republican constitution—and the insistence of some people on upholding that constitution. The clear and present danger is that of citizens permitting themselves to be transformed into subjects, which can, however, be achieved by inducing a widespread fear of death—whether warranted by the facts or not.

In thinking about killing versus letting die, the case of COVID-19 is no less complicated than the cases of driving, eating, and smoking, all of them activities with built-in dangers that are easy to abuse. Despite the strange, sudden and surprising near-unanimity of federal governments worldwide in deciding to implement a range of draconian policies intended to save the lives of those vulnerable to the disease by restricting the liberty of everyone else, and prohibiting normal activities in which healthy people would otherwise engage, the unsavory truth is that governments are in fact increasing the risk of death for many people who are in nearly no danger of dying from the virus itself.

Among the more drastic policy measures implemented in response to the appearance of COVID-19 is that of restricting access to medical services for anyone who does not exhibit acute symptoms of the dreaded disease. In this way, the new virus has been given a much higher priority than notorious killers such as cancer, heart disease, stroke and suicide. Hospitals all over the world have put “elective” surgeries on hold, postponed cancer treatment and refused admission to anyone not clearly suffering from COVID-19. This is tantamount to claiming that death by COVID-19 is somehow worse than death by cancer, heart disease, stroke, or suicide. But why should anyone believe that to be the case? The answer appears to be that because COVID-19 has been labeled a global pandemic, it is supposed to be worse than every other cause of death taken together. The numbers tell quite a different story.

In Italy, the average age of persons said to have succumbed to COVID-19 has been about eighty. Many people in that age cohort die of the flu every year. Fewer people are dying of the flu in 2020, because some of them are dying, instead, of COVID-19. So the question is not whether death is fully preventable in all of those cases, for it is not. Sometimes one’s number is just up. The question becomes, instead: is the rate of death by other causes being significantly increased for other age cohorts as a result of efforts to prevent COVID-19 deaths in persons over seventy years of age? It will take some time to sort out the data, which is an ever-shifting sandcastle of poorly reported and misleadingly presented statistics. The United Kingdom, for example, recently reduced its official COVID-19 death toll from 46,000 to 41,000, when it was discovered that people who died of other causes but had tested positive for the virus nearly a month earlier had been included in the tally. In New York, critically ill patients sent from nursing homes to hospitals (where many died) were not counted as elderly care facility deaths. In some hospitals, workers were instructed to write “COVID-19” on death reports, even when the patient had never been tested and may well have died of something else.

Amidst all of this murkiness, one thing is clear: from the moment when COVID-19 was christened a “pandemic,” people have been conflating the effects of COVID-19 (illness and death from the disease) with the effects of government policies implemented in response to the disease. COVID-19 did not itself cause the collapse of the tourism and entertainment industries. Healthy people in those sectors stopped working not because they were ill or moribund, but because their governments made it illegal for them to work. The mass unemployment around the globe of persons prohibited from working during the lockdowns will have ramifying health effects, both physical and psychological, in some cases culminating in suicide. Many of the unemployed will not be returning to their jobs, which have been eliminated as businesses either permanently shuttered due to insolvency or jumped on the fast track to downsizing via automation.

Millions of people in the United States alone are at risk of homelessness as a result of having suddenly lost, through no fault of their own, their source of income. Homelessness will increase the risk of all forms of illness (including COVID-19), to which some of those persons would not otherwise have been vulnerable. Formerly healthy persons may succumb to alcoholism, excessive drug use, and other forms of bodily harm and disease as a direct result of no longer having adequate shelter. None of these effects will have been caused by COVID-19 but by government policies implemented in response to COVID-19. Will the government administrators who created the conditions resulting in excess deaths be held responsible for the sudden spike in suicides, the cancer deaths caused by late detection, and the deaths from strokes and heart attacks which might have been treated? That seems unlikely, for politicians are busy appending immunity clauses to COVID-19 legislation underway.

When it comes to wars fought abroad, the populace tends to accept whatever their leaders say, so long as they profess to be acting with good intentions. We should expect, then, that the concept of “collateral damage,” invoked so often to excuse the inexcusable annihilation of innocent people by self-proclaimed good-doers who kill rather than protect them, will be dusted off in the case of COVID-19. Death is death, at the end of the day, and the dead have no interest in the intentions of their killers. But “collateral damage” is a trope devised to absolve those who kill, under the assumption that good intentions wipe the moral slate clean. In this way, the policies being implemented to combat the new virus raise a much more general question about the power of governments to destroy the lives of people whom they claim to be protecting. Now, however, in contrast to bombing campaigns abroad, it’s personal.

Why and how are governments being permitted to enact policies which endanger so many of their constituents in the name of the few? It’s no longer just a barrel of “bad apple” cops who kill some of the very people who summon them for help or are walking unarmed down the street or fall asleep in parked cars. Millions of citizens in countries all over the world are experiencing an unprecedented level of insecurity caused by the very governments whose raison d’être it is to protect them. Tragically, the people who could and should be protected have not been (see the case of Governor Cuomo in the state of New York), while those who never needed protection have had their lives upended, and some will die as a result. Citizens have no difficulty forgetting about the carnage committed in their name abroad, but what happens when the government wreaks massive havoc in the homeland? We are in the process of finding out.

COVID-19 Controversies and Communitarianism

The ongoing controversies swirling about COVID-19 continue to confound me. Not the fact that questions have been posed and “conspiracies” rejected but, rather, that many parties on both sides of every COVID-19 divide—regarding lockdowns, masks, vaccines, whether children should go to school and healthy people should go to work, etc.—appear to be thoroughly convinced that the truth is on their side and that those who disagree with them are “nut cases.” Of course, the same is true about most any dispute on social media today, but when it comes to COVID-19, the adherents to various “self-evident tenets” have achieved a new and more vicious degree of smug sanctimoniousness.

On the one hand, we have people who seem truly to be convinced that those who don masks are Jesus-like characters who engage in “radical acts of kindness,” as one person on my Facebook timeline characterized them, including, apparently, herself. On the other hand, we have people who guffaw at the sight of face-masked persons sunbathing on a vast expanse of sandy beach or while driving all alone in their cars, windows rolled up. Surely there are facts, grounded in science, to consider, but proponents of masks are so convinced that The Science™ is on their side that they facilely (and fallaciously) slide between interpretations according to which those who refuse to wear masks are evil, selfish, stupid and/or ignorant. Common sense would certainly seem to dictate that illnesses can be transmitted through saliva—is that not in fact why restaurants sterilize glassware and eating utensils? But the COVID-19 mask controversy was considerably exacerbated by the government’s own mixed messages on the topic. Even pandemic guru Anthony Fauci appeared in an early 2020 YouTube clip in which he stated that masks were unnecessary and mainly for show, serving to make people feel better psychologically. Later, after the video had already gone viral, Fauci’s claim was clarified as an attempt to mitigate a PPE shortage among health professionals.

I am less interested in questions such as whether masks diminish the incidence of disease (obviously surgeons wear “surgical masks” to prevent sepsis in the persons into whom they slice), or whether molecules do in fact disperse and diffuse rapidly in open volumes of air (see: Chemistry 101), than in why people are so vehement in their disagreement over whether and where masks should be required by law. From the beginning, the characterization of COVID-19 as a “pandemic” seems to have conjured in many people’s minds images of wheelbarrows rolling through the neighborhood to collect corpses. (I suspect that to this day some people continue to check their bodies for oozing boils.) Nothing of the sort has of course occurred, and the risk of death to anyone under fifty years of age is lower than the risk of death associated with all sorts of activities in which we regularly engage. No wonder young people are not worried. They are not being reckless at all when they go out with friends. Are they being selfish, as the mask brigade maintains?

At one point I attempted to reason with some people on Facebook who were denouncing as “evil” (in a refrain reminiscent of ancient Greek tragedy) those who do not wear masks. Among other things, I observed that, in fact, contrary to the apparent beliefs of the pro-mask chorus, not everyone who does not wear a mask lives in the United States and worships Donald Trump, who famously “opted” not to wear a mask for months. This was met with a flurry of denunciatory responses, until I revealed that I myself had in fact been wearing a mask, at which point I became “evil, stupid, ignorant, and/or selfish” for entertaining the possibility that other people might hold slightly different beliefs. RIP civil discourse in the twenty-first century world of social media. Alas, as virtual and physical reality converge, fueled by an amorphous blob of pseudo-information, fake news, propaganda memes, omissive charts, incommensurable data and, above all, emotive outbursts, some have translated the verbal violence into physical acts. Mask shaming in the States now cuts both ways: people attack those who call out the unmasked, while mask wearers join forces to shout out of stores anyone who dares to enter without what they regard as an appropriate prophylactic covering.

I was in Austria for more than half of 2020, at the height of the Coronapocalypse, where the incidence of the virus has been quite low and the death toll still hovers just under 700. I know, I know: 700 dead people who need not have died, if only… (If only what? If all men were not mortal, perhaps?) Why was the situation so much less dire in Austria than in Italy, Spain, or France? My best guess is that the powers that be effectively locked down their elderly care facilities and did not, as New York Governor Andrew Cuomo did, send persons infected with COVID-19 into nursing homes to convalesce, thereby directly causing thousands of excess deaths. No one intended to kill those people, of course, but given the precedents in Italy and Spain, where healthcare workers proved to be the primary transmitters of the disease, having not been tested unless they exhibited symptoms, it seems not unreasonable to characterize Cuomo’s action as negligent, at best.

Cuomo is not alone in having imposed government measures which will end by increasing the rate of death of some of the persons supposedly being protected. When for months hospitals refused to admit or treat any patients who did not exhibit acute COVID-19 symptoms, they were turning away thousands of persons with heart problems, minor strokes, and developing cancer whose lives will end earlier than they might have, had they received medical treatment in a timely way. In other words, not all of the excess deaths recorded will be due to COVID-19 itself; some will have been caused by government policies implemented in response to the disease. Small wonder that the latest U.S. stimulus bill will contain broad immunity clauses preventing lawsuits regarding COVID-19.

In Austria, the situation seemed to be largely under control by June, at which point the mask requirement in indoor places was lifted, allowing me to travel happily about the country as a tourist without having to deal with the usual summer mobs, as places of business were open, while the borders remained closed. Masks continued to be required on public transport, but it was plain to see by mid-June that many people in Vienna were not at all concerned about COVID-19, for they often stepped onto trains and trams with no mask anywhere near their face. They might take five minutes fumbling around finding their mask in their bag, then fumble around some more while getting their mask on. In some cases, they would then proceed to remove the mask, in order to eat a piece of pizza or some other snack. They talked and laughed and sometimes coughed with their friends as they entered the closed space (and while munching), with no apparent recognition that the whole purpose of the mask requirement was to prevent their saliva from infecting fellow passengers with the dreaded disease. I must say that I find it somewhat amusing that there were three simple ways legally to evade the mask requirement in Austria while avoiding the risk of a 50 euro fine: always be eating; always be drinking; or, oddly enough, always be smoking. So a non-smoker could always get around the mask requirement by spending time in a smoking area. I’ll leave that one for you to parse.

I also noticed that in the markets, museums, and shopping centers, almost no one actually observed the government’s ongoing recommendation to adhere to social distancing or “Abstand,” despite the brightly colored circles glued on the floor nearly everywhere to indicate how far people were supposed to be staying away from one another. (Does anyone have any idea where and how all of those circular floor stickers were produced and applied, apparently all over the world, during the lockdowns? Just curious.) I noticed the lack of adherence to social distancing guidelines especially on escalators, which are probably the easiest place to gauge whether anyone is making any attempt whatsoever to keep their distance, given that it is so straightforward to do in that case. I tend to mount an escalator two or three steps behind the person in front of me anyway, because I find it rude to breathe down someone’s neck, but in the midst of the “global pandemic” said to necessitate the closure of all European borders, both internal and external, people were there, right behind me on the escalator, unmasked and breathing down my neck. The idea that such persons might be evil, stupid, ignorant and/or selfish never crossed my mind. They simply did not believe that they were in any real danger, nor that they were endangering anyone else.

Even more strident than the “I am Jesus” mask wearers are those agitating for universal vaccination. This is another source of ongoing perplexity to me, as many of those who sing the praises of vaccines as the only solution to the crisis also vociferously maintain, sometimes in the very same breath (filtered through a mask), that herd immunity is not possible with COVID-19, because of its mutating quality. This is conclusively demonstrated, they say, by cases in South Korea where recovered patients became ill again with COVID-19 later on down the line. So let me get this straight: herd immunity is not possible, but the bars in Massachusetts will remain closed until such time as an effective vaccine exists? (Is this some sort of sly backdoor route to reinstating Prohibition, I have to wonder?) In pointing out that vaccines are in effect a fast-track to herd immunity, and so, if the latter is not possible, then the former is a pipe dream, I appear to have upset some people on Twitter, one of whom abruptly announced that he would no longer be continuing our discussion because he disagreed with my view on vaccines. What? Who knew that I had “a view” about vaccines? Is it really all or nothing? May I not express a modicum of skepticism about the prospects for a COVID-19 vaccine while simultaneously affirming that I am indeed glad that I got the yellow fever vaccine before going to Ghana (even though I was quite ill for about five days), because then once in Africa I knew I was safe from that disease? No, apparently a person who raises questions about the feasibility of an experimental vaccine for dealing with a virus for which some claim herd immunity cannot be achieved must be categorically denounced as an anti-vax “nut case.” My aim was not to denounce universally the very idea of vaccines, but to make a much more modest, purely logical, claim: not (p & not-p). Either herd immunity is possible, in which case the surge in cases across the United States suggests that we are well on our way to achieving it, or it is impossible, in which case the prospects for an effective vaccine seem quite dim, no matter how many dozens (hundreds?) of companies may be aggressively recruiting volunteers for experimental trials of what they hope to be the miracle eradicator of the dreaded disease.
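
To spell out the logical structure, let $V$ stand for “an effective vaccine is achievable” and $H$ for “herd immunity is achievable” (labels of my own, introduced here for clarity). Since a vaccine simply confers acquired immunity at population scale, we have

\[
V \rightarrow H, \qquad \text{and by contraposition} \qquad \neg H \rightarrow \neg V.
\]

Whoever asserts both $\neg H$ and $V$ is thereby committed to $H \wedge \neg H$, which is precisely the contradiction that $\neg(p \wedge \neg p)$ forbids.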

In several contexts, I have heard seniors lashing out against “selfish” young people for congregating together in public places—at concerts, on beaches, in clubs and parks, and … at work!—which naturally raises yet another quandary in my skeptical mind. Who is being selfish here, really? My impression is that elderly persons, who are quite right to stay home in order to protect themselves, appear to misunderstand the nature of the world which they have created and are leaving behind for young people. What could be more selfish than to destroy the livelihood of millennials who have been eking out their existence in what has become a piecemeal gig economy—with no house or pension anywhere in sight, and short-term contracts to earn just enough money to survive while whittling slowly away at their quasi-eternal student debt? If all of the people attempting to go back to work had neither rent payments nor student debt, then it might be reasonable to ask them to take even more time off. But when financial insecurity reaches the point where even having a roof over one’s head becomes tentative, when the tent industry becomes a hot stock, then it would seem time to draw the line.

To reiterate: those who are at a substantial risk of death from COVID-19 should, by all means, stay at home (which many of them do in fact own). They can freely decide for themselves whether visiting with young family members is worth the risk of being infected by the disease, given its specific targeting of advanced seniors. But how does preventing young people from living their lives offer any extra protection to those who are already in reclusion, terrified as they are (and in some cases rightly so) to step outside? Answer: it does not. If you are disinfecting everything which comes your way and refusing anyone entry into your home, then why should you care whether other people go back to school and return to work?

Now it does sound as though I am taking sides. But what I have concluded after a great deal of reflection is that the extreme measures taken by governments the world over to protect a tiny portion of the population fly in the face of the more general ethos of modern-day Western society. For better or for worse, we have found ourselves in a world where people are held responsible for their failures and given credit for their success. We do not live in a communitarian society, where economic equality is imposed and maintained by the state or by mutual agreement of the group. In our liberal capitalist society, when the government itself prevents people from succeeding, by making their only possible source of gainful employment illegal, then those people are doomed to fail, not due to their own moral flaws but because they have been prohibited from doing what they would otherwise have done.

The untenable scenario in which young, healthy people have found themselves is what I take to be the best explanation for the magnitude and range of indiscriminately violent protests across the United States. People are not looting Chanel boutiques in search of bread or criminal justice. Rather, communities all across the United States are literally exploding under pressure. They have nothing to lose and so are striking out in outrage, not so much because of the murder of George Floyd (why did these riots not happen, to this extent, in response to the many African Americans killed by police officers before George Floyd?), but in an expression of frustration and anger and, above all, fear about their uncertain future. Millions of persons (hundreds of thousands in California alone) are at serious risk of being evicted from their homes. While some states have implemented measures which will allow rent and mortgage payments to be postponed, they will have to be paid eventually, which means that those who were only barely getting by will not be able to catch up.

Whose interests matter most, in the end? When the advanced seniors with empty vacation properties decide to share their resources (in “acts of radical kindness”) with the people being impoverished, and in some cases rendered homeless, as a result of government measures designed to protect those most vulnerable to COVID-19 at the expense of everyone else, then they will be practicing the communitarianism which they preach. I don’t see that happening in my lifetime.
