Featured Articles

U.S. Raises Stakes in South China Sea Naval Exercises

The U.S. Navy conducted massive drills in the South China Sea on Saturday, with two aircraft carriers taking part. According to The Wall Street Journal, hundreds of jets, helicopters, and surveillance planes took off from the USS Nimitz and the USS Ronald Reagan in Washington’s largest military exercises in the region in recent years.

“The Nimitz Carrier Strike Force celebrated Independence Day with unmatched sea power while deployed to the South China Sea conducting dual carrier operations and exercises in support of a free and open Indo-Pacific,” the U.S. Navy’s Seventh Fleet said in a statement.

The exercise is a show of force aimed at Beijing, which held its own drills over the weekend near the Paracel Islands, a disputed archipelago that China, Vietnam, and Taiwan all lay claim to. China’s build-up of military and research facilities on the Paracel Islands and the Spratly Islands, another contested archipelago, has drawn the ire of Washington.

Since 2015, the U.S. has run what it calls Freedom of Navigation Operations (FONOPs) in the South China Sea, increasing tensions in the region. The FONOPs usually involve sailing a warship near the contested archipelagos and always draw sharp condemnation from Beijing.

“The fundamental cause of instability in the South China Sea is the large-scale military activities and flexing of muscles by some non-regional country that lies tens of thousands of miles away,” Chinese Foreign Ministry spokesperson Zhao Lijian said at a press conference on Friday.

The Bashi Channel, a waterway just south of Taiwan, has turned into another flashpoint for the U.S. and China. Friday marked the 13th day in a row that U.S. military aircraft flew over the Bashi Channel. The South China Morning Post reported that the U.S. sent six large reconnaissance aircraft and two refueling tankers on Friday’s mission. The planes were reportedly searching for Chinese submarines in the area.

Dave DeCamp is the assistant news editor of Antiwar.com. Follow him on Twitter @decampdave. This article was originally featured at Antiwar.com and is republished with permission.

How the U.S. Government Debased My Coin Collection

Old coins vaccinated me against trusting politicians long before I grew my first scruffy beard. I began collecting coins when I was eight years old in 1965, the year President Lyndon Johnson began eliminating all the silver in new dimes, quarters, and half dollars. LBJ swore that there would be no profit in “hoarding” earlier coins “for the value of their silver content.” Wrong, dude: silver coins are now worth roughly fifteen times their face value.

History had always enthralled me, and handling old coins was like shaking hands with the pioneers who built this country. I wondered if the double-dented 1853 quarter I bought at a coin show was ever involved in Huckleberry Finn–type adventures when “two bits” could buy a zesty time. I had a battered copper two-cent piece from 1864, the same year that Union general Phil Sheridan burned down the Shenandoah Valley where I was raised. Some of the coins I collected might now be banned as hate symbols, such as Indian Head pennies and Buffalo nickels (with an Indian portrait engraved on the front).

In the era of this nation’s birth, currency was often recognized as a character issue—specifically, the contemptible character of politicians. Shortly before the 1787 Constitutional Convention, George Washington warned that unsecured paper money would “ruin commerce, oppress the honest, and open the door to every species of fraud and injustice.”

But as time passed, Americans forgot the peril of letting politicians ravage their currency. In 1933, the US had the largest gold reserves of any nation in the world. But fear of devaluation spurred a panic, which President Franklin Roosevelt invoked to justify seizing people’s gold to give himself “freedom of action” to lower the dollar’s value. FDR denounced anyone who refused to turn in their gold as a “hoarder” who faced ten years in prison and a $10,000 fine.

FDR’s prohibition effectively banished from circulation the most glorious coin design in American history—the twenty-dollar Saint-Gaudens Double Eagle gold piece. I was captivated by early American coin designs, especially those featuring idealized female images emblazoned with the word liberty. I was unaware that George Washington refused to allow his own image on the nation’s coins because it would be too “monarchical.” Until 1909, there was an unwritten law that no portrait appear on any American coin in circulation. That changed with the hundredth anniversary of the birth of Abraham Lincoln, whom the Republican Party found profitable to canonize on pennies.

By the mid-twentieth century, American coinage had degenerated into paeans to dead politicians. Portraits of Franklin Roosevelt, John F. Kennedy, and Dwight Eisenhower were slapped onto coins almost as soon as their pulses stopped. This reflected a sea change in values as Americans were encouraged to expect more from their leaders than from their own freedom.

Coin dealing helped me recognize early on that a government promise is not worth a plug nickel. From 1878 onwards, the US Treasury issued silver certificates, a form of paper currency. My 1935 silver certificate stated: “This certifies that there is on deposit in the Treasury of the United States of America One Dollar in Silver Payable to the Bearer on Demand.” But in the 1960s, that became inconvenient, so the government simply nullified the promise.

On August 15, 1971, President Richard Nixon announced that the US would cease paying gold to redeem the dollars held by foreign central banks. The dollar thus became a fiat currency—something which possessed value solely because politicians said so. Nixon assured Americans that his default would “help us snap out of the self-doubt, the self-disparagement that saps our energy and erodes our confidence in ourselves.” Regrettably, this particular treachery was not included among the articles of impeachment that the House Judiciary Committee approved a few years later.

After Nixon’s declaration of economic martial law, I lost my enthusiasm for squirreling away one memento from each mint and each year in the Whitman blue coin folders that permeated many 1960s childhoods. I shifted from collecting to investing, hoping that old coins would be a good defense against Nixon’s “New Economics.” Prices for pristine coin specimens were far higher and more volatile than the value of some of the barely legible slabs of metal I previously amassed. A single blemish could slash the value of a rare coin by 80 percent (same problem I had with some manuscripts I’ve submitted over the years).

Coin values were pump-primed by the Federal Reserve’s deluge of paper dollars, which created an artificial boom to boost Nixon’s reelection campaign, supplemented by wage and price controls that wreaked havoc. Inflation almost quadrupled between 1972 and 1974, and I soaked up the cynicism and outrage prevailing in coin investment and hard money newsletters. I poured most of the money from the jobs I did during high school into rare coins. Because rare coins were appreciating almost across the board, it was difficult not to be lucky in a rising market. The biggest peril was the endless scam artists seeking to fleece people with false promises of lofty gains or fraudulent grading of rare coins—a pox that continues to this day.

After graduating high school in 1974, I began working a construction job. When I got laid off, I saw it as a sign from God (or at least from the market) to buy gold. Investment newsletters and political debacles convinced me the dollar was heading for a crash. I sold most of my rare coins and plunked all my available cash into gold and also took out a consumer finance loan at 18 percent to purchase even more. That interest rate was the gauge of my blind confidence. Nixon’s resignation in August 1974 did wonders to redeem my gamble.

My coin and gold speculations helped pay for my brief stints in college, with some greenbacks left over to cover living expenses during my first literary strikeouts. I eventually shifted into journalism and migrated to the Washington area.

Two weeks after I moved into a shabby group house in the District of Columbia in 1983, I pawned the last gem of my coin collection—the 1885 five-dollar gold piece that my Irish American grandmother had given me fifteen years earlier. She was a dear sweet lady who would have appreciated that her gift helped cover the rent for a few more weeks until I finally consistently hit solid paydirt later that year. (Thanks, Reader’s Digest!)

Wheeling and dealing with coins inoculated me against Beltway-style agoraphobia—a pathological dread of any unregulated market. The market set the price for 1950 Jefferson nickels coined in Denver based on the relatively small mintage chased by growing legions of young collectors. Nixon boosted the price of milk after the dairy lobby pledged $2 million in illegal contributions. It was nuts to permit politicians to control prices when there was no way to control politicians. Having watched coin values whipsaw over the prior decade, I recognized that value was subjective. The test of a fair price is the voluntary consent of each party to the bargain, “the free will which constitutes fair exchanges,” as Senator John Taylor wrote in 1822. Seven years ago, President Barack Obama, talking about how the government was losing money minting the lowest denomination coin, declared, “The penny, I think, ends up being a good metaphor for some of the larger problems we got.” Actually, the collapse of our currency’s value is a curse, not a metaphor. The dollar has lost 85 percent of its purchasing power since Nixon closed the gold window.

For a century, American coinage and currency policies have veered between “government as a damn rascal” and “government as a village idiot.” I remain mystified how anyone continues trusting their rulers after the government formally repudiates its promises. But I still appreciate old coins with beautiful designs that incarnated the American creed that no man has a right to be enshrined above anyone else.

The unanimous Declaration of the thirteen united States of America

In Congress, July 4, 1776.

When in the Course of human events, it becomes necessary for one people to dissolve the political bands which have connected them with another, and to assume among the powers of the earth, the separate and equal station to which the Laws of Nature and of Nature’s God entitle them, a decent respect to the opinions of mankind requires that they should declare the causes which impel them to the separation.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.–That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, –That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it, and to institute new Government, laying its foundation on such principles and organizing its powers in such form, as to them shall seem most likely to effect their Safety and Happiness. Prudence, indeed, will dictate that Governments long established should not be changed for light and transient causes; and accordingly all experience hath shewn, that mankind are more disposed to suffer, while evils are sufferable, than to right themselves by abolishing the forms to which they are accustomed. But when a long train of abuses and usurpations, pursuing invariably the same Object evinces a design to reduce them under absolute Despotism, it is their right, it is their duty, to throw off such Government, and to provide new Guards for their future security.–Such has been the patient sufferance of these Colonies; and such is now the necessity which constrains them to alter their former Systems of Government. The history of the present King of Great Britain is a history of repeated injuries and usurpations, all having in direct object the establishment of an absolute Tyranny over these States. To prove this, let Facts be submitted to a candid world.

He has refused his Assent to Laws, the most wholesome and necessary for the public good.

He has forbidden his Governors to pass Laws of immediate and pressing importance, unless suspended in their operation till his Assent should be obtained; and when so suspended, he has utterly neglected to attend to them.

He has refused to pass other Laws for the accommodation of large districts of people, unless those people would relinquish the right of Representation in the Legislature, a right inestimable to them and formidable to tyrants only.

He has called together legislative bodies at places unusual, uncomfortable, and distant from the depository of their public Records, for the sole purpose of fatiguing them into compliance with his measures.

He has dissolved Representative Houses repeatedly, for opposing with manly firmness his invasions on the rights of the people.

He has refused for a long time, after such dissolutions, to cause others to be elected; whereby the Legislative powers, incapable of Annihilation, have returned to the People at large for their exercise; the State remaining in the mean time exposed to all the dangers of invasion from without, and convulsions within.

He has endeavoured to prevent the population of these States; for that purpose obstructing the Laws for Naturalization of Foreigners; refusing to pass others to encourage their migrations hither, and raising the conditions of new Appropriations of Lands.

He has obstructed the Administration of Justice, by refusing his Assent to Laws for establishing Judiciary powers.

He has made Judges dependent on his Will alone, for the tenure of their offices, and the amount and payment of their salaries.

He has erected a multitude of New Offices, and sent hither swarms of Officers to harrass our people, and eat out their substance.

He has kept among us, in times of peace, Standing Armies without the Consent of our legislatures.

He has affected to render the Military independent of and superior to the Civil power.

He has combined with others to subject us to a jurisdiction foreign to our constitution, and unacknowledged by our laws; giving his Assent to their Acts of pretended Legislation:

For Quartering large bodies of armed troops among us:

For protecting them, by a mock Trial, from punishment for any Murders which they should commit on the Inhabitants of these States:

For cutting off our Trade with all parts of the world:

For imposing Taxes on us without our Consent:

For depriving us in many cases, of the benefits of Trial by Jury:

For transporting us beyond Seas to be tried for pretended offences

For abolishing the free System of English Laws in a neighbouring Province, establishing therein an Arbitrary government, and enlarging its Boundaries so as to render it at once an example and fit instrument for introducing the same absolute rule into these Colonies:

For taking away our Charters, abolishing our most valuable Laws, and altering fundamentally the Forms of our Governments:

For suspending our own Legislatures, and declaring themselves invested with power to legislate for us in all cases whatsoever.

He has abdicated Government here, by declaring us out of his Protection and waging War against us.

He has plundered our seas, ravaged our Coasts, burnt our towns, and destroyed the lives of our people.

He is at this time transporting large Armies of foreign Mercenaries to compleat the works of death, desolation and tyranny, already begun with circumstances of Cruelty & perfidy scarcely paralleled in the most barbarous ages, and totally unworthy the Head of a civilized nation.

He has constrained our fellow Citizens taken Captive on the high Seas to bear Arms against their Country, to become the executioners of their friends and Brethren, or to fall themselves by their Hands.

He has excited domestic insurrections amongst us, and has endeavoured to bring on the inhabitants of our frontiers, the merciless Indian Savages, whose known rule of warfare, is an undistinguished destruction of all ages, sexes and conditions.

In every stage of these Oppressions We have Petitioned for Redress in the most humble terms: Our repeated Petitions have been answered only by repeated injury. A Prince whose character is thus marked by every act which may define a Tyrant, is unfit to be the ruler of a free people.

Nor have We been wanting in attentions to our Brittish brethren. We have warned them from time to time of attempts by their legislature to extend an unwarrantable jurisdiction over us. We have reminded them of the circumstances of our emigration and settlement here. We have appealed to their native justice and magnanimity, and we have conjured them by the ties of our common kindred to disavow these usurpations, which, would inevitably interrupt our connections and correspondence. They too have been deaf to the voice of justice and of consanguinity. We must, therefore, acquiesce in the necessity, which denounces our Separation, and hold them, as we hold the rest of mankind, Enemies in War, in Peace Friends.

We, therefore, the Representatives of the united States of America, in General Congress, Assembled, appealing to the Supreme Judge of the world for the rectitude of our intentions, do, in the Name, and by Authority of the good People of these Colonies, solemnly publish and declare, That these United Colonies are, and of Right ought to be Free and Independent States; that they are Absolved from all Allegiance to the British Crown, and that all political connection between them and the State of Great Britain, is and ought to be totally dissolved; and that as Free and Independent States, they have full Power to levy War, conclude Peace, contract Alliances, establish Commerce, and to do all other Acts and Things which Independent States may of right do. And for the support of this Declaration, with a firm reliance on the protection of divine Providence, we mutually pledge to each other our Lives, our Fortunes and our sacred Honor.

Georgia


Button Gwinnett

Lyman Hall

George Walton


North Carolina

William Hooper

Joseph Hewes

John Penn


South Carolina

Edward Rutledge

Thomas Heyward, Jr.

Thomas Lynch, Jr.

Arthur Middleton

Massachusetts



John Hancock

Maryland


Samuel Chase

William Paca

Thomas Stone

Charles Carroll of Carrollton

Virginia



George Wythe

Richard Henry Lee

Thomas Jefferson

Benjamin Harrison

Thomas Nelson, Jr.

Francis Lightfoot Lee

Carter Braxton

Pennsylvania



Robert Morris

Benjamin Rush

Benjamin Franklin

John Morton

George Clymer

James Smith

George Taylor

James Wilson

George Ross

Delaware


Caesar Rodney

George Read

Thomas McKean


New York

William Floyd

Philip Livingston

Francis Lewis

Lewis Morris


New Jersey

Richard Stockton

John Witherspoon

Francis Hopkinson

John Hart

Abraham Clark


New Hampshire

Josiah Bartlett

William Whipple

Massachusetts



Samuel Adams

John Adams

Robert Treat Paine

Elbridge Gerry


Rhode Island

Stephen Hopkins

William Ellery

Connecticut



Roger Sherman

Samuel Huntington

William Williams

Oliver Wolcott


New Hampshire

Matthew Thornton

Woodrow Wilson: A President Worth ‘Canceling’

Princeton University has made it official: Woodrow Wilson’s name no longer will have any place on campus. The former president, or at least his memory, now is part of cancel culture, which is sweeping the nation. The Woodrow Wilson School of Public and International Affairs will replace the former president’s name with “Princeton,” and Wilson College now will be called First College.

This hardly is surprising, but it is in many ways discouraging, and not for the reasons that many people might assume. Wilson did, after all, leave a sorry legacy of Jim Crow racial segregation and actively sought to damage if not destroy race relations in the United States, so the drive to remove his name is no surprise given the wave of renaming and destruction of statues and monuments that has dominated the headlines ever since Minneapolis police killed George Floyd.

The reason for discouragement is not that the university where Wilson served as president before becoming president of the United States has “canceled” him for his racism—something that no one ever sought to hide when discussing Wilson’s legacy—but rather the stubborn insistence that despite his racial policies Wilson’s record of pushing progressive legislation as well as his role in bringing the United States into World War I should be considered as pluses for his presidency. Declares Princeton president Christopher L. Eisgruber:

Wilson remade Princeton, converting it from a sleepy college into a great research university. Many of the virtues that distinguish Princeton today—including its research excellence and its preceptorial system—were in significant part the result of Wilson’s leadership. He went on to the American presidency and received a Nobel Prize. People will differ about how to weigh Wilson’s achievements and failures. Part of our responsibility as a University is to preserve Wilson’s record in all of its considerable complexity.

Translation: Wilson’s record is complex, as he did many positive things both for Princeton and for the USA when he was in the White House. In fact, the “complex” review of Wilson is quite common among historians and journalists, many of whom seem to believe that if it were not for his fealty to Jim Crow and institutionalized racism Woodrow Wilson would have been a great president. That is the legacy that we need to reexamine, and as we do, we find that Wilson’s presidency was a complete disaster, one that reverberates to the present time and still inflicts great harm on our body politic. There is nothing complex at all when examining the cataclysmic aftermath of those eight years Wilson spent in office.

Dick Lehr of The Atlantic seems to be typical of journalists, as he condemns Wilson’s racism but portrays him positively when it comes to his imposition of a progressive legislative and social agenda:

Wilson might have bumbled, and worse, on civil rights, but he was overseeing implementation of a “New Freedom” in the nation’s economy—his campaign promise to restore competition and fair labor practices, and to enable small businesses crushed by industrial titans to thrive once again. In September 1914, for example, he had created the Federal Trade Commission to protect consumers against price-fixing and other anticompetitive business practices, and shortly after signed into law the Clayton Antitrust Act. He continued monitoring the so-called European War, resisting pressure to enter but moving to strengthen the nation’s armed forces.

It is hard to know where to begin here. First, and most important, “industrial titans” were not “crushing” small businesses. They made their fortunes through mass production of iron, steel, petroleum, railroad locomotives, and farm implements, along with making automobiles affordable for those people they allegedly were “crushing.” These industries required large-scale capital, not backyard furnaces, and this was a time when the American standard of living was rising rapidly. It is one thing to write about how “price fixing” allegedly was cheating American consumers but quite another to provide credible examples.

Most historians and journalists writing about this period take it on faith that antitrust laws and other so-called reforms brought on by progressives actually improved the lot of most people in this country. Finding proof that these “reforms” did what supporters claim is a rather more quixotic quest.

Let us look at some of the actions that Wilson and his progressive Democratic Congress accomplished during his presidency. For example, most historians and journalists see the Sixteenth Amendment, which provided the legal basis for a national income tax, as a “reform” that made the lives of most Americans better. It requires creative thinking, however, to explain how a tax that takes a significant share of individuals’ earnings leaves those paying it better off having the government spend those monies than they would be directing their own resources. Given that most federal employees receive better pay and benefits than the people who work to create the wealth those federal workers consume, one is hard-pressed to explain why the taxpayers are getting a better deal than if they had not paid those taxes at all.

Then there was the creation of the Federal Reserve System in 1914. It is the rare journalist, historian, or even economist who does not lavish praise upon the Fed, even though one can effectively argue that it is often responsible for the very conditions that breed financial crises in the first place. Most people would not praise an arsonist who throws fuel on a fire he started, but somehow Federal Reserve governors who provide “liquidity” for financial institutions that acted irresponsibly—often with government and Fed encouragement—are seen as economic saviors.

There is much more. During Wilson’s first term, Democrats pushed through law after law that bolstered the Jim Crow system of racial segregation in the federal government, which up until then had not followed the lead of the many states that were instituting an apartheid system for whites and African Americans. While the federal government was not directly involved in medical care, nonetheless progressives such as Wilson were also firmly behind the guiding principles of the Flexner Report of 1910, which according to Murray N. Rothbard created and maintained the medical cartel that even now deprives Americans of many healthcare options. (Note that very few, if any, journalists and historians have any problem with the cartelization of medical care despite their supposed love affair with competition and their uncritical endorsement of antitrust laws.) Furthermore, the Flexner Report and its aftermath doomed medical education for black Americans and women and left the country woefully short of physicians.

Yet the “crowning achievement” of Wilson’s presidency is American involvement in World War I and its role in the disastrous “peace process” that followed Germany’s surrender. Not surprisingly, journalists and historians see Wilson’s manipulation of this country into the war as being something both inevitable and necessary, a move that launched the USA as a “great power” in world affairs.

Germany posed no danger to the United States, the infamous Zimmermann Telegram notwithstanding. Its armies could not have invaded our shores, and had the Americans not turned the tide in favor of Great Britain and France, the belligerents almost certainly would have entered into a negotiated settlement, one that would not have laid the conditions for the rise of Adolf Hitler and what turned out to be an even more cataclysmic World War II and its warring aftermath.

Wilson’s contempt for black Americans extended into military service. Like other Americans, they were conscripted into the armed forces and forced into subservient roles, as the prejudices of the day held that blacks were cowards in battle despite their fighting records in previous American wars. Those who did carry a rifle mostly did so under French leadership, where they excelled on the battlefield but also were slaughtered like so many others in the hellish trenches that came to define that war.

On the home front, Wilson’s Congress pushed through laws that turned the USA into a virtual police state, such as the Espionage Act of 1917 (used to prosecute people who dissented against US involvement in the war) and the Trading with the Enemy Act of 1917 (which Franklin Roosevelt used as the “basis of authority” for his executive order to seize gold from Americans). The legacy of both laws continues to this day: the Obama administration used the Espionage Act to charge Edward Snowden, and the Trump administration later used it to indict Julian Assange.

If one defines “greatness” as dragging a country into a disastrous war, promoting legislation that hamstrung the economy, vastly increasing taxation, and leaving a racial legacy that wreaks havoc to this very day, then Woodrow Wilson was a “great president.” However, if one sees “greatness” in the Oval Office as someone, according to Robert Higgs, “who acts in accordance with his oath of office to ‘preserve, protect, and defend the Constitution of the United States,’” then Wilson is neither great nor “near great” (the ranking bestowed on him by progressive historians).

Woodrow Wilson does not have a “mixed” legacy. The America that existed before Wilson took office was a very different and less free country after his second term ended in 1921. The dictator-like military organization of the economy that was used to direct war production would form part of the basis for FDR’s attempts to further cartelize the US economy during the New Deal. Wilson pushed through laws to eviscerate the First Amendment and to imprison dissenters, and his racial policies speak for themselves. He did not “lead” the nation during crises; he drove the country into crisis, and this nation never has recovered.

William L. Anderson is a professor of economics at Frostburg State University in Frostburg, Maryland. This article was originally featured at the Ludwig von Mises Institute and is republished with permission. 

The Russian Bounties Hoax

There’s no reason for you to accept the story about the Russian military paying Afghan militants to kill American troops in Afghanistan. The New York Times, Wall Street Journal, and Washington Post all started this controversy late last week with incredibly thin stories. They did not even pretend to claim that it was true the Russians had put bounties on U.S. troops, only that they had anonymous sources who claimed there was a government report somewhere that said that. They were reporting the “fact” that there was a rumor.

They wouldn’t even say which agencies were leaking the story. All we were told was the story came from “intelligence officials” or even “people familiar” with the story.

They did not cite any evidence and did not claim to connect the rumored bounties to the deaths of any particular American soldiers or marines.

All three stories were written in language conceding they did not know if the story was true. The Times wrote this “would be an escalation,” “officials have said,” “it would be the first time,” and again, “would also be a huge escalation.” [Emphasis added.] (“Escalation” of what? Russia’s global dark-arts war against American interests, which also happens to exist only in the form of claims by anonymous government officials.)

The New York Times follow-up story was still very thin. Again, the extremely vague “intelligence officials” and now the extremely broad “special operations forces,” who are not intelligence officials, are their claimed sources. They do not cite the CIA, which refused to comment.

The sources claim that the intelligence report says that captured “militants”—again deliberately vague—were caught with some American cash and later admitted to Afghan National Security Force interrogators that they had been paid these Russian bounties.

Well, the entire country is awash in American cash, as well as every form of black market in drugs, guns, prostitution, and the rest. The U.S. itself has been paying the Taliban since 2005 or 2006 literally billions of dollars in protection money for convoys of U.S. supplies in the country. There’s even a whole book devoted to that subject called Funding the Enemy: How US Taxpayers Bankroll the Taliban by Douglas Wissing. They then spend that money buying American weapons, night-vision equipment, and even Humvees from the Afghan Army the U.S. has built there.

Of course the Afghan government has a huge interest in perpetuating such tales as these, whether they tortured these statements out of these prisoners or not. They want desperately for U.S. forces to stay to protect their power. If making up a story about Russia and the Taliban could undermine the Trump administration’s peace talks with the Taliban, then they just might do that.

Remember, just in this century, America’s intelligence agencies have lied about Iraq’s unconventional weapons and alliance with Osama bin Laden, Libya’s impending genocide, Syria’s “moderate rebel” bin Ladenite terrorists and false-flag chemical weapons attacks, and most recently the massive hoax that Donald Trump was a brainwashed, blackmailed secret agent of Putin’s Kremlin who had conspired with Russia to usurp Hillary Clinton’s rightful throne in the 2016 election. They’re liars.

After all we’ve been through, we’re supposed to give anonymous “intelligence officials” in the New York Times the benefit of the doubt on something like this? I don’t think so.

The Wall Street Journal conceded yesterday that the National Security Agency is dissenting from the conclusion about the bounties, though of course not saying why. However, just the fact that they put that in the paper seems to signal a very strong dissent from the conclusion and the media and political war that is being waged in the name of it. The Pentagon also said on Monday it has not seen “corroborating evidence” to support the claims.

Current reports are that the supposed events all happened last year. This raises major questions about why the story was leaked to the three most important newspapers in the country the way it was last week. The national security state has done everything it can to keep the U.S. involved in that war, successfully badgering both Obama and Trump into expanding it against their better judgment. If Trump had listened to his former Secretary of Defense James Mattis and National Security Advisor H.R. McMaster, we’d be on year three of an escalation with plans to begin talks with the Taliban next year. Instead, Trump talked to them for the last year and a half and has already signed a deal to have us out by the end of next May.

The national security state also has a continuing interest in preventing Trump from “getting along with Russia.” Anything they can do to advance the tired debunked old narrative that Trump puts Russian interests before America’s, they will. Of course that is the story TV is pushing again this week. (I am not a Trump supporter. But lies are lies and his position on Afghanistan is now correct.)

Before this supposed story broke last week, Sen. Angus King, the Maine independent who caucuses with the Democrats, was already complaining about Trump’s plans for a “precipitous” and “hasty” withdrawal from Afghanistan after two decades—a withdrawal planned for completion another year from now. Shocking but not surprising, as they say.

What interest might Russia have in doing this?

It’s America that switched sides in the Afghan war, not Russia. The Russians have supported the U.S. effort and the U.S.-created government in Kabul since 2001. In 2012, when the Pakistanis closed the “southern route” from Karachi through the Khyber Pass, Russia reopened the “northern route” through its territory to allow American supplies into Afghanistan for Obama’s “surge.” They have sent arms to the Afghan National Army. To get around its own sanctions, the U.S. has even had India buy helicopters from the Russians to give to the Afghan government.

There’s no question they are talking to the Taliban. But so are we.

There were claims in 2017 that Russia was arming and paying the Taliban, but then the generals admitted to Congress that they had no evidence of either. In a humiliating debacle, also in 2017, CNN claimed a big scoop about Putin’s support for the Taliban when furnished with some photos of Taliban fighters holding old Russian weapons. The military veteran journalists at Task & Purpose quickly debunked every claim in the piece.

Let’s say hypothetically that the story was true: The simplest explanation for Russia’s motive then would be that they were trying to provoke exactly the reaction they have gotten, which is renewed pressure on Trump to back out of the withdrawal deal with the Taliban since his political enemies will spin it as a “win” for Russia if we leave. But why would Russia want to provoke America to stay in Afghanistan? Could it be for the same reason that Jimmy Carter and Ronald Reagan backed the mujahideen against the USSR back in the 70s and 80s—to provoke them into committing national suicide by bogging them down in a no-win quagmire, killing hundreds of thousands of people and wasting uncountable billions of dollars?

So what would that say about our policy now?

Of course that’s all nonsense too. The reason the Russians have supported our efforts in Afghanistan for the last 19 years is because we’re protecting their friends in power and at least supposedly have been fighting to keep transnational Islamist terrorism at bay. If they are backing the Taliban at all now it would be just a small version of their own “Awakening” policy of supporting the local mujahideen against the new smaller and more radical groups claiming loyalty to ISIS there, since the Taliban have been their most effective opponents.

This is not much different than the current American policy which prioritizes the Taliban’s keeping ISIS and al Qaeda down and out for us.

Of course it’s America’s (dis)loyal Saudi and Pakistani allies who have been backing the Afghan Taliban insurgency against the U.S. occupation all these years, not the Russians.

Afghanistan will probably be mired in protracted conflict for years after U.S. forces finally leave, though hopefully all sides are tired enough of fighting now that they can negotiate acceptable power-sharing arrangements instead. If the pressure is bad enough that Trump renounces his own deal, the Taliban will almost certainly go back to war against U.S. forces there. That is not likely to happen though.

As far as America’s relationship with Russia—the single most important thing in the world for all people—this is just another setback on the road to a peaceful and acceptable coexistence.

Dangerous Game: How the Wreckage of Russiagate Ignited a New Cold War


It’s been nearly four years since the myth of Trump-Russia collusion made its debut in American politics, generating an endless stream of stories in the corporate press and hundreds of allegations of conspiracy from pundits and officials. But despite netting scores of embarrassing admissions, corrections, editor’s notes and retractions in that time, the theory refuses to die.

Over the years, the highly elaborate “Russiagate” narrative has fallen away piece-by-piece. Claims about Donald Trump’s various back channels to Moscow—Carter Page, George Papadopoulos, Michael Flynn, Paul Manafort, Alfa Bank—have each been thoroughly discredited. House Intelligence Committee transcripts released in May have revealed that nobody who asserted a Russian hack on Democratic computers, including the DNC’s own cyber security firm, is able to produce evidence that it happened. In fact, it is now clear the entire investigation into the Trump campaign was without basis.

It was alleged that Moscow manipulated the president with “kompromat” and blackmail, sold to the public in a “dossier” compiled by a former British intelligence officer, Christopher Steele. Working through a DC consulting firm, Steele was hired by Democrats to dig up dirt on Trump, gathering a litany of accusations that Steele’s own primary source would later dismiss as “hearsay” and “rumor.” Though the FBI was aware the dossier was little more than sloppy opposition research, the bureau nonetheless used it to obtain warrants to spy on the Trump campaign.

Even the claim that Russia helped Trump from afar, without direct coordination, has fallen flat on its face. The “troll farm” allegedly tapped by the Kremlin to wage a pro-Trump meme war—the Internet Research Agency—spent only $46,000 on Facebook ads, or around 0.05 percent of the $81 million budget of the Trump and Clinton campaigns. The vast majority of the IRA’s ads had nothing to do with U.S. politics, and more than half of those that did were published after the election, having no impact on voters. The Department of Justice, moreover, has dropped its charges against the IRA’s parent company, abandoning a major case resulting from Robert Mueller’s special counsel probe.

Though few of its most diehard proponents would ever admit it, after four long years, the foundation of the Trump-Russia narrative has finally given way and its edifice has crumbled. The wreckage left behind will remain for some time to come, however, kicking off a new era of mainstream McCarthyism and setting the stage for the next Cold War.

It Didn’t Start With Trump

The importance of Russiagate to U.S. foreign policy cannot be overstated, but the road to hostilities with Moscow stretches far beyond the current administration. For thirty years, the United States has exploited its de facto victory in the first Cold War, interfering in Russian elections in the 1990s, aiding oligarchs as they looted the country into poverty, and orchestrating Color Revolutions in former Soviet states. NATO, meanwhile, has been enlarged up to Russia’s border, despite American assurances the alliance wouldn’t expand “one inch” eastward after the collapse of the USSR.

Unquestionably, from the fall of the Berlin Wall until the day Trump took office, the United States maintained an aggressive policy toward Moscow. But with the USSR wiped off the map and communism defeated for good, a sufficient pretext to rally the American public into another Cold War has been missing in the post-Soviet era. In the same 30-year period, moreover, Washington has pursued one disastrous diversion after another in the Middle East, leaving little space or interest for another round of brinkmanship with the Russians, who were relegated to little more than a talking point. That, however, has changed.

The Crisis They Needed

The Washington foreign policy establishment—memorably dubbed “the Blob” by one Obama adviser—was thrown into disarray by Trump’s election win in the fall of 2016. In some ways, Trump stood out as the dove during the race, deeming “endless wars” in the Middle East a scam, calling for closer ties with Russia, and even questioning the usefulness of NATO. Sincere or not, Trump’s campaign vows shocked the Beltway think tankers, journalists, and politicos whose worldviews (and salaries) rely on the maintenance of empire. Something had to be done.

In the summer of 2016, WikiLeaks published thousands of emails belonging to then-Democratic candidate Hillary Clinton, her campaign chairman, and the Democratic National Committee. Though damaging to Clinton, the leak became fodder for a powerful new attack on the president-to-be. Trump had worked in league with Moscow to throw the election, the story went, and the embarrassing email trove was stolen in a Russian hack, then passed to WikiLeaks to propel Trump’s campaign.

By the time Trump took office, the narrative was in full swing. Pundits and politicians rushed to outdo one another in hysterically denouncing the supposed election-meddling, which was deemed the “political equivalent” of the 9/11 attacks, tantamount to Pearl Harbor, and akin to the Nazis’ 1938 Kristallnacht pogrom. In lock-step with the U.S. intelligence community—which soon issued a pair of reports endorsing the Russian hacking story—the Blob quickly joined the cause, hoping to short-circuit any tinkering with NATO or rapprochement with Moscow under Trump.

The allegations soon broadened well beyond hacking. Russia had now waged war on American democracy itself, and “sowed discord” with misinformation online, all in direct collusion with the Trump campaign. Talking heads on cable news and former intelligence officials—some of them playing both roles at once—wove a dramatic plot of conspiracy out of countless news reports, clinging to many of the “bombshell” stories long after their key claims were blown up.

A large segment of American society eagerly bought the fiction, refusing to believe that Trump, the game show host, could have defeated Clinton without assistance from a foreign power. For the first time since the fall of the USSR, rank-and-file Democrats and moderate progressives were aligned with some of the most vocal Russia hawks across the aisle, creating space for what many have called a “new Cold War.”

Stress Fractures 

Under immense pressure and nonstop allegations, the candidate who shouted “America First” and slammed NATO as “obsolete” quickly adapted himself to the foreign policy consensus on the alliance, one of the first signs the Trump-Russia story was bearing fruit.

Demonstrating the Blob in action, during debate on the Senate floor over Montenegro’s bid to join NATO in March 2017, the hawkish John McCain castigated Rand Paul for daring to oppose the measure, riding on anti-Russian sentiments stoked during the election to accuse him of “working for Vladimir Putin.” With most lawmakers agreeing the expansion of NATO was needed to “push back” against Russia, the Senate approved the request nearly unanimously and Trump signed it without batting an eye—perhaps seeing the attacks a veto would bring, even from his own party.

Allowing Montenegro—a country that illustrates everything wrong with NATO—to join the alliance may suggest Trump’s criticisms were always empty talk, but the establishment’s drive to constrain his foreign policy was undoubtedly having an effect. Just a few months later, the administration would put out its National Security Strategy, stressing the need to refocus U.S. military engagements from counter-terrorism in the Middle East to “great power competition” with Russia and China.

On another aspiring NATO member, Ukraine, the president was also hectored into reversing course under pressure from the Blob. During the 2016 race, the corporate press savaged the Trump campaign for working behind the scenes to “water down” the Republican Party platform after it opposed a pledge to arm Ukraine’s post-coup government. That stance did not last long.

Though even Obama decided against arming the new government—which his administration helped to install—Trump reversed that move in late 2017, handing Kiev hundreds of Javelin anti-tank missiles. In an irony noticed by few, some of the arms went to open neo-Nazis in the Ukrainian military, who were integrated into the country’s National Guard after leading street battles with security forces in the Obama-backed coup of 2014. Some of the very same Beltway critics slamming the president as a racist demanded he pass weapons to out-and-out white supremacists.

Ukraine’s bid to join NATO has all but stalled under President Volodymyr Zelensky, but the country has nonetheless played an outsized role in American politics both before and after Trump took office. In the wake of Ukraine’s 2014 U.S.-sponsored coup, “Russian aggression” became a favorite slogan in the American press, laying the ground for future allegations of election-meddling.

Weaponizing Ukraine

The drive for renewed hostilities with Moscow got underway well before Trump took the Oval Office, nurtured in its early stages under the Obama administration. Using Ukraine’s revolution as a springboard, Obama launched a major rhetorical and policy offensive against Russia, casting it in the role of an aggressive, expansionist power.

Protests erupted in Ukraine in late 2013, following President Viktor Yanukovych’s refusal to sign an association agreement with the European Union, preferring to keep closer ties with Russia. Demanding a deal with the EU and an end to government corruption, demonstrators—including the above-mentioned neo-Nazis—were soon in the streets clashing with security forces. Yanukovych was chased out of the country, and eventually out of power.

Through cut-out organizations like the National Endowment for Democracy, the Obama administration poured millions of dollars into the Ukrainian opposition prior to the coup, training, organizing and funding activists. Dubbed the “Euromaidan Revolution,” Yanukovych’s ouster mirrored similar US-backed color coups before and since, with Uncle Sam riding on the back of legitimate grievances while positioning the most U.S.-friendly figures to take power afterward.

The coup set off serious unrest in Ukraine’s Russian-speaking enclaves, the eastern Donbass region and the Crimean Peninsula to the south. In the Donbass, secessionist forces attempted their own revolution, prompting the new government in Kiev to launch a bloody “war on terror” that continues to this day. Though the separatists received some level of support from Moscow, Washington placed sole blame on the Russians for Ukraine’s unrest, while the press breathlessly predicted an all-out invasion that never materialized.

In Crimea—where Moscow has kept its Black Sea Fleet since the late 1700s—Russia took a more forceful stance, seizing the territory to keep control of its long-term naval base. The annexation was accomplished without bloodshed, and a referendum was held weeks later affirming that a large majority of Crimeans supported rejoining Russia, a sentiment western polling firms have since corroborated. Regardless, as in the Donbass, the move was labeled an invasion, eventually triggering a raft of sanctions from the U.S. and the EU (and more recently, from Trump himself).

The media made no effort to see Russia’s perspective on Crimea in the wake of the revolution—imagining the U.S. response if the roles were reversed, for example—and all but ignored the preferences of Crimeans. Instead, it spun a black-and-white story of “Russian aggression” in Ukraine. For the Blob, Moscow’s actions there put Vladimir Putin on par with Adolf Hitler, driving a flood of frenzied press coverage not seen again until the 2016 election.

Succumbing to Hysteria 

While Trump had already begun to cave to the onslaught of Russiagate in the early months of his presidency, a July 2018 meeting with Putin in Helsinki presented an opportunity to reverse course, offering a venue to hash out differences and plan for future cooperation. Trump’s previous sit-downs with his Russian counterpart were largely uneventful, but widely portrayed as a meeting between master and puppet. At the Helsinki Summit, however, a meager gesture toward improved relations was met with a new level of hysterics.

Trump’s refusal to interrogate Putin on his supposed election-hacking during a summit press conference was taken as irrefutable proof that the two were conspiring together. Former CIA Director John Brennan declared it an act of treason, while CNN gravely contemplated whether Putin’s gift to Trump during the meetings—a World Cup soccer ball—was really a secret spying transmitter. By this point, Robert Mueller’s special counsel probe was in full effect, lending official credibility to the collusion story and further emboldening the claims of conspiracy.

Though the summit did little to strengthen U.S.-Russia ties and Trump made no real effort to do so—beyond resisting the calls to directly confront Putin—it brought on some of the most extreme attacks yet, further ratcheting up the cost of rapprochement. The window of opportunity presented in Helsinki, while only cracked to begin with, was now firmly shut, with Trump as reluctant as ever to make good on his original policy platform.


After taking a beating in Helsinki, the administration allowed tensions with Moscow to soar to new heights, more or less embracing the Blob’s favored policies and often even outdoing the Obama government’s hawkishness toward Russia in both rhetoric and action.

In March 2018, the poisoning of a former Russian spy living in the United Kingdom was blamed on Moscow in a highly elaborate storyline that ultimately fell apart (sound familiar?), but nonetheless triggered a wave of retaliation from western governments. In the largest diplomatic purge in US history, the Trump administration expelled 60 Russian officials in a period of two days, surpassing Obama’s ejection of 35 diplomats in response to the election-meddling allegations.

Along with the purge, starting in spring 2018 and continuing to this day, Washington has unleashed round after round of new sanctions on Russia, including in response to “worldwide malign activity,” to penalize alleged election-meddling, for “destabilizing cyber activities,” retaliation for the UK spy poisoning, more cyber activity, more election-meddling—the list keeps growing.

Though Trump had called to lift rather than impose penalties on Russia before taking office, worn down by endless negative press coverage and surrounded by a coterie of hawkish advisers, he was brought around on the merits of sanctions before long, and has used them liberally ever since.

Goodbye INF, RIP OST

By October 2018, Trump had largely abandoned any idea of improving the relationship with Russia and, in addition to the barrage of sanctions, began shredding a series of major treaties and arms control agreements. He started with the Cold War-era Intermediate-Range Nuclear Forces Treaty (INF), which had eliminated an entire class of nuclear weapons—medium-range missiles—and removed Europe as a theater for nuclear war.

At this point in Trump’s tenure, super-hawk John Bolton had assumed the position of national security advisor, encouraging the president’s worst instincts and using his newfound influence to convince Trump to ditch the INF treaty. Bolton—who helped to detonate a number of arms control pacts in previous administrations—argued that Russia’s new short-range missile had violated the treaty. While there remains some dispute over the missile’s true range and whether it actually breached the agreement, Washington failed to pursue available dispute mechanisms and ignored Russian offers for talks to resolve the spat.

After officially scrapping the agreement, the U.S. quickly began testing formerly banned munitions. Unlike the Russian missiles, which were only said to overstep the treaty’s range limit by a few miles, the new American tests involved nuclear-capable, land-based cruise missiles expressly banned under the INF.

Next came the Open Skies Treaty (OST), an idea originally floated by President Eisenhower, but which wouldn’t take shape until 1992, when an agreement was struck between NATO and former Warsaw Pact nations. The agreement now has over 30 members and allows each to arrange surveillance flights over other members’ territory, an important confidence-building measure in the post-Soviet world.

Trump saw matters differently, however, and turned a minor dispute over Russia’s implementation of the pact into a reason to discard it altogether, again egged on by militant advisers. In late May 2020, the president declared his intent to withdraw from the nearly 30-year-old agreement, proposing nothing to replace it.

Quid Pro Quo

With the DOJ’s special counsel probe into Trump-Russia collusion coming up short on both smoking-gun evidence and relevant indictments, the president’s enemies began searching for new angles of attack. Following a July 2019 phone call between Trump and his newly elected Ukrainian counterpart, they soon found one.

During the call, Trump urged Zelensky to investigate a computer server he believed to be linked to Russiagate, and to look into potential corruption and nepotism on the part of former Vice President Joe Biden, who played an active role in Ukraine following the Obama-backed coup.

Less than two months later, a “whistleblower”—a CIA officer detailed to the White House, Eric Ciaramella—came forward with an “urgent concern” that the president had abused his office on the July call. According to his complaint, Trump threatened to withhold U.S. military aid, as well as a face-to-face meeting with Zelensky, should Kiev fail to deliver the goods on Biden, who by that point was a major contender in the 2020 race.

The same players who peddled Russiagate seized on Ciaramella’s account to manufacture a whole new scandal: “Ukrainegate.” Failing to squeeze an impeachment out of the Mueller probe, the Democrats did just that with the Ukraine call, insisting Trump had committed grave offenses, again conspiring with a foreign leader to meddle in a U.S. election.

At a high point during the impeachment trial, an expert called to testify by the Democrats revived George W. Bush’s “fight them over there” maxim to argue for U.S. arms transfers to Ukraine, citing the Russian menace. The effort was doomed from the start, however, with a GOP-controlled Senate never likely to convict and the evidence weak for a “quid pro quo” with Zelensky. Ukrainegate, like Russiagate before it, was a failure in its stated goal, yet both served to mark the administration with claims of foreign collusion and press for more hawkish policies toward Moscow.

The End of New START?

The Obama administration scored a rare diplomatic achievement with Russia in 2010, signing the New START Treaty, a continuation of the original Strategic Arms Reduction Treaty inked in the waning days of the Soviet Union. Like its first iteration, the agreement caps the number of strategic warheads and delivery systems deployed by each side. It featured a ten-year sunset clause, but included provisions to continue beyond its initial end date.

With the treaty set to expire in early 2021, it has become an increasingly hot topic throughout Trump’s presidency. While Trump sold himself as an expert dealmaker on the campaign trail—an artist, even—his negotiation skills have proven lacking when it comes to working out a new deal with the Russians.

The administration has demanded that China be incorporated into any extended version of the treaty, calling on Russia to compel Beijing to the negotiating table and vastly complicating any prospect for a deal. With a nuclear arsenal around one-tenth the size of that of Russia or the U.S., China has refused to join the pact. Washington’s intransigence on the issue has put the future of the treaty in limbo and largely left Russia without a negotiating partner.

A second Trump term would spell serious trouble for New START, as the president has already shown his willingness to shred the INF and Open Skies agreements. And with the Anti-Ballistic Missile Treaty (ABM) already killed under the Bush administration, New START is one of the few remaining constraints on the planet’s two largest nuclear arsenals.

Despite pursuing massive escalation with Moscow from 2018 onward, Trump-Russia conspiracy allegations never stopped pouring from newspapers and TV screens. For the Blob—heavily invested in a narrative as fruitful as it was false—Trump would forever be “Putin’s puppet,” regardless of the sanctions imposed, the landmark treaties incinerated or the deluge of warlike rhetoric.

Running for an Arms Race

As the Trump administration leads the country into the next Cold War, a renewed arms race is also in the making. The destruction of key arms control pacts by previous administrations has fed a proliferation powder keg, and the demise of New START could be the spark to set it off.

Following Bush Jr.’s termination of the ABM deal in 2002—wrecking a pact which placed limits on Russian and American missile defense systems to maintain the balance of mutually assured destruction—Russia soon resumed funding for a number of strategic weapons projects, including its hypersonic missile. In his announcement of the new technology in 2018, Putin deemed the move a response to Washington’s unilateral withdrawal from ABM, which also saw the U.S. develop new weapons.

Though he inked New START and campaigned on vows to pursue an end to the bomb, President Obama also helped to advance the arms build-up, embarking on a 30-year nuclear modernization project set to cost taxpayers $1.5 trillion. The Trump administration has embraced the initiative with open arms, even adding to it, as Moscow follows suit with upgrades to its own arsenal.

Moreover, Trump has opened a whole new battlefield with the creation of the US Space Force, escalated military deployments, ramped up war games targeting Russia and China and looked to reopen and expand Cold War-era bases.

In May, Trump’s top arms control envoy promised to spend Russia and China into oblivion in the event of any future arms race, but one was already well underway. After withdrawing from INF, the administration began churning out previously banned nuclear-capable cruise missiles, while fielding an entire new class of low-yield nuclear weapons. Known as “tactical nukes,” the smaller warheads lower the threshold for use, making nuclear conflict more likely. Meanwhile, the White House has also mulled a live bomb test—America’s first since 1992—though has apparently shelved the idea for now.

A Runaway Freight Train

As Trump approaches the end of his first term, the two major U.S. political parties have become locked in a permanent cycle of escalation, eternally compelled to prove who’s the bigger hawk. The president put up mild resistance during his first months in office, but the relentless drumbeat of Russiagate successfully crushed any chances for improved ties with Moscow.

The Democrats refuse to give up on “Russian aggression” and see virtually no pushback from hawks across the aisle, while intelligence “leaks” continue to flow into the imperial press, fueling a whole new round of election-meddling allegations.

Likewise, Trump’s campaign vows to revamp U.S.-Russian relations are long dead. His presidency counts among its accomplishments a pile of new sanctions, dozens of expelled diplomats and the demise of two major arms control treaties. For all his talk of getting along with Putin, Trump has failed to ink a single deal, has not de-escalated any of the ongoing strife over Syria, Ukraine or Libya, and has been unable to arrange a single state visit in Moscow or DC.

Nonetheless, Trump’s every action is still interpreted through the lens of Russian collusion. After announcing a troop drawdown in Germany on June 5, reducing the U.S. presence by just one-third, the president was met with the now-typical swarm of baseless charges. MSNBC regular and retired general Barry McCaffrey dubbed the move “a gift to Russia,” while GOP Rep. Liz Cheney said the meager troop movement placed the “cause of freedom…in peril.” Top Democrats in the House and Senate introduced bills to stop the withdrawal dead in its tracks, attributing the policy to Trump’s “absurd affection for Vladimir Putin, a murderous dictator.”

Starting as a dirty campaign trick to explain away the Democrats’ election loss and jam up the new president, Russiagate is now a key driving force in the U.S. political establishment that will long outlive the age of Trump. After nearly four years, the bipartisan consensus on the need for Cold War is stronger than ever, and will endure regardless of who takes the Oval Office next.

Men’s Lives Matter: Male Victims of Sexual Abuse


When a woman sexually abuses a man, the pain inflicted is dismissed by those whose political paradigm does not include men as victims. Unfortunately, these voices dominate the research, media, and legislation surrounding the issue of sexual victimization.

Their narrative: “Women are victims, men are perpetrators.” This rigid approach even ignores the glaring male-on-male sexual violence in prisons because it does not fit the mold; in fact, if prisoners were counted, men might well show a higher rate of rape victimization than women. The blindness to male victimization inflicts a second injustice on every male survivor, who is viewed either as a liar or as unimportant. Female survivors of rape in the ’50s were treated in much the same way by law enforcement and the court system.

The situation with the sexual abuse of men is slowly changing. A few years ago, the CDC added a category of sexual abuse called “made to penetrate,” which is an important but almost entirely overlooked form of sexual abuse, perhaps because it applies only to males. A report entitled “The Sexual Victimization of Men in America: New Data Challenge Old Assumptions” was published in a 2014 issue of the American Journal of Public Health. There, UCLA researchers Lara Stemple and Ilan Meyer explain, “By introducing the term, ‘made to penetrate,’ the CDC has added new detail to help understand what happens when men are sexually victimized…Therefore, to the extent that males experience nonconsensual sex differently (i.e., being made to penetrate), male victimization will remain vastly undercounted in federal data collection on violent crime.” Stemple and Meyer conclude, “we first argue that it is time to move past the male perpetrator and female victim paradigm [of rape]. Overreliance on it stigmatizes men who are victimized.”

Including the category ‘made to penetrate’ could shift the entire paradigm of sexual violence. The CDC’s National Intimate Partner and Sexual Violence Survey: 2010–2012 State Report, released in 2017, indicates (Tables 3.1 and 3.5):


Among men:

  • Made to penetrate: 1.7 million
  • Rape: Numbers too small to report a reliable estimate
  • Sexual coercion: 1.6 million
  • Unwanted sexual contact: 1.9 million

Among women:

  • Rape: 1.5 million
  • Made to penetrate: Numbers too small to report a reliable estimate
  • Sexual coercion: 2.4 million
  • Unwanted sexual contact: 2.5 million

With this shift of paradigm, female-on-male violence may start to receive attention. The problem is enormous. Bureau of Justice Statistics figures for 2018 put the U.S. imprisonment rate at 431 sentenced prisoners per 100,000 residents. Extrapolating from the figures for federal prison, well over 90% of prisoners are male. In its report “No Escape: Male Rape in U.S. Prisons,” Human Rights Watch indicates that approximately 20% of male prisoners are sexually victimized. Most sexual violence professionals, however, seem indifferent to the situation.
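The scale these figures imply can be checked with back-of-envelope arithmetic. The sketch below is my own illustration, and the U.S. population figure is an assumption not taken from the text:

```python
# Back-of-envelope check of the scale implied by the figures above.
# Assumption (not from the text): U.S. population of roughly 328 million.
us_population = 328_000_000
imprisonment_rate = 431 / 100_000   # BJS 2018: sentenced prisoners per 100,000 residents
male_share = 0.90                   # "well over 90%" of prisoners are male
victimization_rate = 0.20           # HRW: ~20% of male prisoners sexually victimized

prisoners = us_population * imprisonment_rate    # ~1.4 million sentenced prisoners
male_prisoners = prisoners * male_share          # ~1.3 million
victims = male_prisoners * victimization_rate    # roughly a quarter-million men
```

On these assumptions, the cited rates imply roughly 250,000 sexually victimized male prisoners, which suggests why counting prisons could change the overall picture.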

The rate of male sexual victimization is likely understated because men notoriously underreport their abuse. The National Domestic Violence Hotline offers several reasons why: men are socialized against appearing “weak”; their abuse becomes the butt of jokes; they believe there is no support out there for them…and with reason. The Maryland nonprofit Stop Abusive and Violent Environments summarized the situation: “On the one hand, about 25% of men who sought help from DV hotlines were connected with resources that were helpful. On the other hand, nearly 67%…reported that these DV agencies and hotline were not at all helpful. Many reported being turned away.” Their research tells “a story of male helpseekers who are often doubted, ridiculed, and given false information. This…impacts men’s physical and mental health.”

How can this happen in a society preoccupied with sexual abuse? The dominant narrative of sexual violence is identity politics. This claims that a person’s identity is not defined by a common humanity and the choices he or she makes as an individual. It is defined by the specific subcategories of humanity into which a person fits; gender and race are examples. Instead of humanity having common interests, such as freedom of religion, the subcategories have interests that clash. Men benefit at the expense of women. They do so by sexually oppressing women…or so the story goes.

The preceding statement reflects some historical truth. In the ‘50s, men as a class did receive preferential treatment in the workplace, academia, and from the general culture. Women who experienced sexual and domestic violence were blamed for their own victimization. But society has changed at a dizzying pace. We live in a healthier society that contains a fatal flaw: identity politics. If people are defined by biological subcategories that are in conflict, then society is at war—a forever war. Fairness or equality to one class means pulling down another. If men perpetrate sexual violence, then raising up women means policing and punishing males. When facts contradict the narrative, they may be given a cursory nod, then the narrators return to the script.

In the ‘60s, the unjust treatment of female rape victims sparked a sexual revolution that ripped open the culture. But no revolution defends male victims. Instead, the ‘60s revolution has been institutionalized within academia, bureaucracy, and law. It has calcified. The established sexual order now depends on ignoring male victims or dismissing them as inconvenient. It must ignore the fact that violence has no gender. Violence only has individuals who abuse and individuals who are abused. Excluding women or men from either category shows a willful blindness to reality. It expresses political self-interest, not justice.

Three Observations on the Second Wave of COVID-19

The long-feared second wave of COVID-19 in the United States appears to have arrived. National case numbers are setting new records, and two states have started to move back toward quarantine.

Since news reports on the virus continue to emphasize the wrong metrics, some important facts about the new wave often get missed. Here are three things to know about the new rise in cases.

1. The recent rise in cases cannot be explained by increases in testing.

This is an important point because the onset of a second jump in cases has been declared prematurely several times before now. These types of reports came in different flavors. Sometimes, they calculated percentage growth rates off extremely small numbers (at a county level, for instance) to report an eye-popping rate of growth. More commonly, they failed to highlight the fact that total testing was increasing faster than new cases, suggesting the virus was probably just as common as it had been days earlier, but the state was able to confirm more cases.

This time is different. Many states are experiencing both a rise in absolute cases and a rise in the percentage of tests that come back positive (the positivity rate). This is a clear sign that things in these places are getting worse with respect to the coronavirus.

As of June 27, these are all the states that were seeing both a positivity rate above 10% and a weekly rise in that rate. The original data for the table below comes from The COVID Tracking Project:

So, while not all of these states are in crisis right now, it is correct to say we are seeing a pronounced jump in cases in many places. It’s not just an artifact of the data and reporting, as it had been before.
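The two-signal check described above (absolute cases rising and the positivity rate rising) can be sketched in a few lines of Python. The numbers below are hypothetical, not actual COVID Tracking Project data:

```python
# Sketch of the two-signal check: are absolute cases AND the share of
# tests coming back positive (the positivity rate) both rising?
def positivity_rate(positives, total_tests):
    """Fraction of tests that came back positive."""
    return positives / total_tests

# Hypothetical weekly totals for one state.
week1 = {"positives": 7_000, "tests": 80_000}
week2 = {"positives": 12_000, "tests": 95_000}

rate1 = positivity_rate(week1["positives"], week1["tests"])  # 0.0875
rate2 = positivity_rate(week2["positives"], week2["tests"])  # ~0.126

# Cases rose AND positivity rose: growth in testing alone can't explain it.
genuine_rise = week2["positives"] > week1["positives"] and rate2 > rate1
```

If cases had risen while positivity fell, the increase could simply reflect wider testing; both rising together is the signal the article treats as a genuine worsening.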

2. The rise in cases is primarily occurring in places that were not hit hard before.

When you look at a chart of the national numbers of new cases, you can clearly see the second wave that’s occurring. Using data through June 27, weekly figures of new cases are rapidly approaching the previous peak set in April.

The trend below shows rolling 7-day positive cases for the US (all states plus DC and Puerto Rico) per 100,000 people:

On a national basis, the second wave description looks appropriate. But this obscures very different trends at the state level. In reality, the places where cases are rising now did not experience much of a peak earlier in the crisis.
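For concreteness, the rolling 7-day per-100,000 metric used in these state comparisons can be sketched as follows (the numbers are illustrative, not real state data):

```python
# Rolling 7-day sum of new cases per 100,000 residents: the per-capita
# metric that makes states of different sizes comparable.
from collections import deque

def rolling_7day_per_100k(daily_new_cases, population):
    """Trailing 7-day case total per 100,000 residents, one value per day."""
    window = deque(maxlen=7)  # the oldest day drops out automatically
    series = []
    for cases in daily_new_cases:
        window.append(cases)
        series.append(sum(window) / population * 100_000)
    return series

# Hypothetical: a state of 5 million people with accelerating cases.
daily = [200, 250, 300, 400, 500, 650, 800, 1000]
series = rolling_7day_per_100k(daily, 5_000_000)
```

Dividing by population is what allows, say, Arizona’s curve to be placed on the same axis as New York’s.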

Consider the trends below for four states that have been making headlines in the past few days: Texas, Florida, California, and Arizona.

In this data, we see that Texas experienced a slight uptick in April, but it was much smaller than what we’re seeing now. For Arizona, Florida, and California, the recent rise they are experiencing is their first serious increase when adjusted for population. Conversely, states like New York and New Jersey saw their large increase in March and April, but are now seeing stable or declining cases.

This pattern demonstrates one of the many problems with demanding a nationwide lockdown in a country as large as the US. In effect, all states shut down (to varying degrees) based on the experience of New York, New Jersey, and a couple other hotspots. They did this without regard to whether the outbreaks they were experiencing could possibly warrant such dramatic action.

Now that some of these same states are facing a real outbreak close to home, they’re starting to impose a new round of limited restrictions. In the face of an economy on life support, widespread social unrest, and an election year, it’s unclear how much people will tolerate or comply with another aggressive attempt at quarantine.

3. So far, the second wave appears to be less lethal than the first wave.

Another important characteristic of the second wave is that, at least so far, it appears to be less deadly than the earlier spikes.

Some commentators have made this point by looking at the trends in death counts for the new hotspots. However, this is not a good way to evaluate the lethality of the second wave at this point.

Deaths are a lagging indicator in this data. According to data summarized by Our World in Data, death typically occurs two to eight weeks after the onset of symptoms, which in turn appear several days after initial infection. Many of the cases being discovered now will eventually prove fatal, but they won’t be counted for several weeks. Looking at death rates today in the new hotspots risks providing a false sense of reassurance.

A better way to evaluate the likely lethality of this second wave in real time is to look at the trends in hospitalized COVID-19 cases.

We saw previously that the rise in new cases is now above the levels seen in April. Fortunately, for now the hospitalization data is not following the same trajectory.

In the chart below, we see the national trend in per capita positive cases combined with per capita hospitalization:

Here, we see that national hospitalization numbers, after a period of stability, have started to move upward, but they are not accelerating at the same pace as total cases.

When we look on a state-by-state basis, a similar pattern emerges. Below, we present the hospitalization trends for current and prior hotspot states. (Florida is omitted due to a lack of reliable hospitalization data.):

Of the new epicenters, Arizona again shows up as an outlier on hospitalization. But even so, it’s still a ways out from the extreme per capita hospitalization levels seen earlier in the northeast. Meanwhile, the other states are on a slight upswing, but still low in terms of overall numbers.

There are several different reasons that might help account for the lower hospitalization rates in this cycle.

One explanation is that the average age of those infected with COVID-19 is lower than it was in the first wave of cases. CNN recently reported on the rise in infections among young people as a cause for alarm…

“It’s a little bit of a disturbing trend, and what frightens me is not only that they are younger, the potential of them infecting other people, particularly parents and grandparents,” Dr. Robert Jansen, chief medical officer at Grady Health System, told WSB.

…but in fact, it’s much better than the alternative. If more people are going to be infected, it’s obviously preferable that the people with the lowest chance of serious illness are the ones that get it.

It’s worth remembering that one of the reasons that New York and New Jersey fared worse than other states is that they had a policy which inadvertently increased the probability that older, more vulnerable people would be infected. In an attempt to preserve hospital capacity, these governments required hospitals to discharge COVID-19 patients back to nursing homes before it was confirmed that they no longer had the virus. The result was that the virus was effectively being reintroduced in nursing homes, spreading widely among a high-risk population.

Another reason for the lower relative hospitalization rates in the new epicenters is that testing is far more widespread. When New York was dealing with its peak in April, testing capacity was still being ramped up. This meant that tests had to be reserved for healthcare workers and people with severe symptoms. In turn, we know that the total confirmed cases in April for New York and other states significantly understated the true number of cases. We just don’t know how large the understatement was.

Since the virus is hitting states like Arizona later, the testing limitations are not as severe. In all probability, this means that the confirmed case count in Arizona today is closer to the true number of infections.

This is not to suggest hospitals will have the capacity they need. When new cases are concentrated in specific parts of a state (as is occurring in Houston, Texas), hospitals will again be strained beyond their normal limits.

The point is that, at least for now, the trend is not nearly as dire as what the northeastern states saw previously. That nuance is easily lost among a sea of news headlines about record new cases.

Kosovo Indictment Proves Bill Clinton’s Serbian War Atrocities

President Bill Clinton’s favorite freedom fighter just got indicted for mass murder, torture, kidnapping, and other crimes against humanity. In 1999, the Clinton administration launched a 78-day bombing campaign that killed up to 1500 civilians in Serbia and Kosovo in what the American media proudly portrayed as a crusade against ethnic bias. That war, like most of the pretenses of U.S. foreign policy, was always a sham.

Kosovo President Hashim Thaci was charged with ten counts of war crimes and crimes against humanity by an international tribunal in The Hague in the Netherlands. It charged Thaci and nine other men with “war crimes, including murder, enforced disappearance of persons, persecution, and torture.” Thaci and the other charged suspects were accused of being “criminally responsible for nearly 100 murders” and the indictment involved “hundreds of known victims of Kosovo Albanian, Serb, Roma, and other ethnicities and include political opponents.”

Hashim Thaci’s tawdry career illustrates how anti-terrorism is a flag of convenience for Washington policymakers. Prior to becoming Kosovo’s president, Thaci was the head of the Kosovo Liberation Army (KLA), fighting to force Serbs out of Kosovo. In 1999, the Clinton administration designated the KLA as “freedom fighters” despite their horrific past and gave them massive aid. The previous year, the State Department had condemned “terrorist action by the so-called Kosovo Liberation Army.” The KLA was heavily involved in drug trafficking and had close ties to Osama bin Laden.

But arming the KLA and bombing Serbia helped Clinton portray himself as a crusader against injustice and shift public attention after his impeachment trial. Clinton was aided by many shameless members of Congress anxious to sanctify U.S. killing. Sen. Joe Lieberman (D-CT) whooped that the United States and the KLA “stand for the same values and principles. Fighting for the KLA is fighting for human rights and American values.” And since Clinton administration officials publicly compared Serb leader Slobodan Milošević to Hitler, every decent person was obliged to applaud the bombing campaign.

Both the Serbs and ethnic Albanians committed atrocities in the bitter strife in Kosovo. But to sanctify its bombing campaign, the Clinton administration waved a magic wand and made the KLA’s atrocities disappear. British professor Philip Hammond noted that the 78-day bombing campaign “was not a purely military operation: NATO also destroyed what it called ‘dual-use’ targets, such as factories, city bridges, and even the main television building in downtown Belgrade, in an attempt to terrorize the country into surrender.”

NATO repeatedly dropped cluster bombs into marketplaces, hospitals, and other civilian areas. Cluster bombs are anti-personnel devices designed to be scattered across enemy troop formations. NATO dropped more than 1,300 cluster bombs on Serbia and Kosovo and each bomb contained 208 separate bomblets that floated to earth by parachute. Bomb experts estimated that more than 10,000 unexploded bomblets were scattered around the landscape when the bombing ended and maimed children long after the ceasefire.

In the final days of the bombing campaign, the Washington Post reported that “some presidential aides and friends are describing Kosovo in Churchillian tones, as Clinton’s ‘finest hour.’” The Post also reported that according to one Clinton friend “what Clinton believes were the unambiguously moral motives for NATO’s intervention represented a chance to soothe regrets harbored in Clinton’s own conscience…The friend said Clinton has at times lamented that the generation before him was able to serve in a war with a plainly noble purpose, and he feels ‘almost cheated’ that ‘when it was his turn he didn’t have the chance to be part of a moral cause.’” By Clinton’s standard, slaughtering Serbs was “close enough for government work” to a “moral cause.”

Shortly after the end of the 1999 bombing campaign, Clinton enunciated what his aides labeled the Clinton doctrine: “Whether within or beyond the borders of a country, if the world community has the power to stop it, we ought to stop genocide and ethnic cleansing.” In reality, the Clinton doctrine was that presidents are entitled to commence bombing foreign lands based on any brazen lie that the American media will regurgitate. In reality, the lesson from bombing Serbia is that American politicians merely need to publicly recite the word “genocide” to get a license to kill.

After the bombing ended, Clinton assured the Serbian people that the United States and NATO agreed to be peacekeepers only “with the understanding that they would protect Serbs as well as ethnic Albanians and that they would leave when peace took hold.” In the subsequent months and years, American and NATO forces stood by as the KLA resumed its ethnic cleansing, slaughtering Serb civilians, bombing Serbian churches and oppressing any non-Muslims. Almost a quarter-million Serbs, Gypsies, Jews, and other minorities fled Kosovo after Mr. Clinton promised to protect them. By 2003, almost 70 percent of the Serbs living in Kosovo in 1999 had fled, and Kosovo was 95 percent ethnic Albanian.

But Thaci remained useful for U.S. policymakers. Even though he was widely condemned for oppression and corruption after taking power in Kosovo, Vice President Joe Biden hailed Thaci in 2010 as the “George Washington of Kosovo.” A few months later, a Council of Europe report accused Thaci and KLA operatives of human organ trafficking. The Guardian noted that the report alleged that Thaci’s inner circle “took captives across the border into Albania after the war, where a number of Serbs are said to have been murdered for their kidneys, which were sold on the black market.” The report stated that when “transplant surgeons” were “ready to operate, the [Serbian] captives were brought out of the ‘safe house’ individually, summarily executed by a KLA gunman, and their corpses transported swiftly to the operating clinic.”

Despite the body trafficking charge, Thaci was a star attendee at the annual Global Initiative conference by the Clinton Foundation in 2011, 2012, and 2013, where he posed for photos with Bill Clinton. Maybe that was a perk from the $50,000 a month lobbying contract that Thaci’s regime signed with The Podesta Group, co-managed by future Hillary Clinton campaign manager John Podesta, as the Daily Caller reported.

Clinton remains a hero in Kosovo where a statue of him was erected in the capital, Pristina. The Guardian newspaper noted that the statue showed Clinton “with a left hand raised, a typical gesture of a leader greeting the masses. In his right hand he is holding documents engraved with the date when NATO started the bombardment of Serbia, 24 March 1999.” It would have been a more accurate representation to depict Clinton standing on a pile of corpses of the women, children, and others killed in the U.S. bombing campaign.

In 2019, Bill Clinton and his fanatically pro-bombing former Secretary of State, Madeleine Albright, visited Pristina, where they were “treated like rock stars” as they posed for photos with Thaci. Clinton declared, “I love this country and it will always be one of the greatest honors of my life to have stood with you against ethnic cleansing (by Serbian forces) and for freedom.” Thaci awarded Clinton and Albright medals of freedom “for the liberty he brought to us and the peace to entire region.” Albright has reinvented herself as a visionary warning against fascism in the Trump era. Actually, the only honorific that Albright deserves is “Butcher of Belgrade.”

Clinton’s war on Serbia was a Pandora’s box from which the world still suffers. Because politicians and most of the media portrayed the war against Serbia as a moral triumph, it was easier for the Bush administration to justify attacking Iraq, for the Obama administration to bomb Libya, and for the Trump administration to repeatedly bomb Syria. All of those interventions sowed chaos that continues to curse the purported beneficiaries.

Bill Clinton’s 1999 bombing of Serbia was as big a fraud as George W. Bush’s conning this nation into attacking Iraq. The fact that Clinton and other top U.S. government officials continued to glorify Hashim Thaci despite accusations of mass murder, torture, and body trafficking is another reminder of the venality of much of America’s political elite. Will Americans again be gullible the next time that Washington policymakers and their media allies concoct bullshit pretexts to blow the hell out of some hapless foreign land?

Exposing Jerome Powell’s Lies About the Fed and Inequality

If the heads of the Federal Reserve are to be believed, Fed policies do not make wealth inequality worse.

When asked recently if the Fed’s policies widen inequality, San Francisco Federal Reserve President Mary Daly stated without reservation: “Not in my judgment.”

Earlier, at the end of May, Fed Chairman Jay Powell was less forceful but nevertheless danced around the question of whether Fed policy increases inequality. “Everything we do is focused on creating an environment in which those people will have their best chance to keep their job or maybe get a new job,” was his response.

Of course, we know Powell and Daly are lying.

How the Fed Benefits the Investor Class

Austrian school investor Jesse Colombo writes at his site explainingcapitalism.org, “the Fed and the ‘paper’ dollar are the main reasons for America’s growing economic inequality.”

Why is this so?

“In simple terms,” Colombo explains, “inflation benefits the rich while hurting the middle class and poor due to the way each group’s finances are structured.”

In short, the rich receive a significant share of their income from investments, while the middle class relies primarily on income from labor, and the poor on a combination of labor income and government welfare payments.

When the Fed creates new fiat money out of thin air, it isn’t distributed evenly throughout the economy. Instead, it is inserted at specific points, typically via credit to business investors. As the Fed inflates a bubble, speculation with the new money also increases—which inflates the stock market and other major asset classes like housing, benefitting the investor class.

The rise in asset values for the investor class has been acute: massive fiat money printing has helped the S&P 500 balloon by more than 360% over the last 30 years, a nearly five-fold increase, including more than a doubling in the last ten years alone.
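As a quick sanity check on those growth figures (my arithmetic, not the author’s): a 360% gain means the index ends at 4.6 times its starting level, and a doubling over ten years implies roughly 7% compound annual growth.

```python
# A 360% gain means the ending value is (1 + 3.6) = 4.6x the start,
# which is why "more than 360%" reads as "nearly five-fold."
gain_pct = 360
multiple = 1 + gain_pct / 100        # 4.6

# Doubling over 10 years implies this compound annual growth rate (CAGR):
cagr = 2 ** (1 / 10) - 1             # ~0.072, i.e. about 7.2% per year
```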

Moreover, median home values have nearly tripled over the last 30 years, far surpassing the rate of inflation.

The overwhelming majority of these benefits accrue to a small group of investors.

A June 2 article on quartz.com reported that the “wealthiest 10% of U.S. households owned about 83%” of stock market wealth, according to a 2016 Federal Reserve Bank of St. Louis report.

“The richest one percent of Americans now account for more than half the value of equities owned by U.S. households, according to Goldman Sachs,” a February 2020 Financial Post article reported. Conversely, the bottom 90 percent of households owned just 12 percent of stock market wealth.

Additionally, rapidly rising home prices put homeownership out of reach for more and more people. “Homeownership is increasingly out of reach for the typical American,” Redfin Chief Economist Daryl Fairweather said in a 2019 HousingWire.com article. “Over the last few years builders have focused on luxury homes, and there hasn’t been enough construction of affordable starter homes.”

After peaking in 2006 before the Great Recession, overall homeownership rates fell from a high of 69 percent to 63 percent in 2016. Ownership rates have been climbing again in recent years, but the gains from rising housing values accrue not just to those who can afford a home, but most acutely to those in more expensive houses. Meanwhile, non-homeowners and those with lower-priced homes fall further behind.

Racial Wealth Gap

With a sharper and more critical eye being focused on racial issues—and the racial wealth gap in particular—due to recent events, the Fed is due its fair share of blame in this realm as well.

For starters, the gains from rising home prices fueled by easy Fed money can only benefit actual homeowners. And, according to a February 2020 Urban Institute paper, “the gap between the black and white homeownership rates in the United States has increased to its highest level in 50 years” in 2017.

The white homeownership rate stood at 71.9 percent, compared to just 41.8 percent for blacks.

Furthermore, Federal Reserve data analyzed at capitalist.com shows that 61 percent of white households own publicly traded stock compared to just 31 percent of black households.

Even in middle and upper class households, the discrepancy persists. A March 2019 Investor’s Business Daily article reported that “A 2015 survey by Ariel asked Americans with household income of at least $50,000 whether they owned stocks or stock mutual funds. Eighty-six percent of whites said they did. For African-Americans, the number was 67%.”

In short, as Fed easy money policies benefit stockholders and homeowners, a disproportionate amount of those benefits are going to white households, further exacerbating the racial wealth gap.


There’s little doubt that the Federal Reserve not only increases wealth inequality overall but deepens the racial wealth gap as well. The easy money policies of the last decade, as the nation attempted to recover from the Great Recession, provide a prime example.

As a 2019 MarketWatch.com article noted, “the Fed lowered interest rates, which had the knock-on effect of pushing easy money into the hands of the already-wealthy.”

As Deutsche Bank Securities’ chief economist Torsten Sløk said, “The response to the financial crisis was for the Fed to lower interest rates which in turn pushed home prices and stock prices steadily higher over the past decade.”

Like the old state lottery ads used to say, “Lotto: You’ve got to be in it to win it.”

Similarly, to “win” benefits from Federal Reserve easy money policies, you’ve got to already be in the stock and homeownership game, i.e. the investor class.

It’s beyond disingenuous for the likes of Powell and Daly to claim the Federal Reserve doesn’t increase inequality. Any discussion of wealth inequality—be it overall or the racial wealth gap—is incomplete without a discussion of the Fed.

Bradley Thomas is creator of the website Erasethestate.com and is a libertarian activist who enjoys researching and writing on the freedom philosophy and Austrian economics. Follow him on Twitter: @erasestate

High Tide for Foreign Policy Restrainers

Ten years ago, “restraint” was considered code for “isolationism” and its purveyors were treated with nominal attention and barely disguised condescension. Today, agitated national security elites who can no longer ignore the restrainers—and the positive attention they’re getting—are trying to cut them down to size.

We saw this recently when Peter Feaver, Hal Brands, and William Inboden, who all made their mark promoting George W. Bush’s war policies after 9/11, published “In Defense of the Blob” for Foreign Affairs in April. My own pushback received an attempted drubbing in The Washington Post by national security professor Daniel Drezner (he of Twitter fame): “For one thing, her essay repeatedly contradicts itself. The Blob is an exclusive cabal, and yet Vlahos also says it’s on the wane.”

One can be both, Professor. As they say, Rome didn’t fall in a day. What we are witnessing are individuals and institutions sensing existential vulnerabilities. The restrainers have found a nerve and the Blob is feeling the pinch. Now it’s starting to throw its tremendous girth around.

Read the rest of this article at The American Conservative.

A Rebuke of ‘Modern Monetary Theory’

[Review of Stephanie Kelton, The Deficit Myth: Modern Monetary Theory and the Birth of the People’s Economy (New York: PublicAffairs, 2020).]

I’ve got good news and bad news. The good news is that Stephanie Kelton—economics professor at Stony Brook and advisor to the 2016 Bernie Sanders campaign—has written a book on modern monetary theory (MMT) that is very readable and will strike many readers as persuasive and clever. The bad news is that Stephanie Kelton has written a book on MMT that is very readable and will strike many readers as persuasive and clever.

To illustrate the flavor of the book, we can review Kelton’s reminiscences of serving as chief economist for the Democratic staff on the US Senate Budget Committee. When she was first selected, journalists reported that Senator Sanders had hired a “deficit owl”—a new term Kelton had coined. Unlike a deficit hawk or a deficit dove, Kelton’s deficit owl was “a good mascot for MMT because people associate owls with wisdom and also because owls’ ability to rotate their heads nearly 360 degrees would allow them to look at deficits from a different perspective” (Kelton 2020, p. 76).

Soon after joining the Budget Committee, Kelton the deficit owl played a game with the staffers. She would first ask if they would wave a magic wand that had the power to eliminate the national debt. They all said yes. Then Kelton would ask, “Suppose that wand had the power to rid the world of US Treasuries. Would you wave it?” This question—even though it was equivalent to asking to wipe out the national debt—“drew puzzled looks, furrowed brows, and pensive expressions. Eventually, everyone would decide against waving the wand” (Kelton 2020, p. 77).

Such is the spirit of Kelton’s book, The Deficit Myth. She takes the reader down trains of thought that turn conventional wisdom about federal budget deficits on its head. Kelton makes absurd claims that the reader will think surely can’t be true…but then she seems to justify them by appealing to accounting tautologies. And because she uses apt analogies and relevant anecdotes, Kelton is able to keep the book moving despite its dry subject matter. She promises the reader that MMT opens up grand new possibilities for the federal government to help the unemployed, the uninsured, and even the planet itself…if we would only open our minds to a paradigm shift.

So why is this bad news? Because Kelton’s concrete policy proposals would be an absolute disaster. Her message can be boiled down to two sentences (and these are my words, not an exact quotation): Because the Federal Reserve has the legal ability to print an unlimited number of dollars, we should stop worrying about how the government will “pay for” the various spending programs the public desires. If it prints too much money, we will experience high inflation, but Uncle Sam doesn’t need to worry about “finding the money” the way a household or business does.

This is an incredibly dangerous message to be injecting into the American discourse. If it were mere inflationism, we could hope that enough of the public and the policy wonks would rely on their common sense to reject it. Yet because Kelton dresses up her message with equations and thought experiments, she may end up convincing an alarming number of readers that MMT really can turn unaffordable government boondoggles into sensible investments, just by changing the way we think about them.

Precisely because Kelton’s book is so unexpectedly impressive, I would urge longstanding critics of MMT to resist the urge to dismiss it with ridicule. Although it’s fun to lambaste “magical monetary theory” on social media and to ask, “Why don’t you move to Zimbabwe?” such moves will only serve to enhance the credibility of MMT in the eyes of those who are receptive to it. Consequently, in this review I will craft a lengthy critique that takes Kelton quite seriously in order to show the readers just how wrong her message actually is, despite its apparent sophistication and even charm.

Monetary Sovereignty

In her introductory chapter, Kelton lures the reader with the promise of MMT and also sheds light on her book title:

[W]hat if the federal budget is fundamentally different than your household budget? What if I showed you that the deficit bogeyman isn’t real? What if I could convince you that we can have an economy that puts people and planet first? That finding the money to do this is not the problem? (Kelton 2020, p. 2, bold added)

The first chapter of the book makes the fundamental distinction for MMT, between currency issuers and currency users. Our political discourse is plagued, according to Kelton, with the fallacy of treating currency issuers like Uncle Sam as if they were mere currency users, like you, me, and Walmart.

We mere currency users have to worry about financing our spending; we need to come up with the money—and this includes borrowing from others—before we can buy something. In complete contrast, a currency issuer has no such constraints, and needn’t worry about revenue when deciding which projects to fund.

Actually, the situation is a bit more nuanced. To truly reap the advantages unlocked by MMT, a government must enjoy monetary sovereignty. For this, being a currency issuer is a necessary but insufficient condition. There are two other conditions, as Kelton explains:

To take full advantage of the special powers that accrue to the currency issuer, countries need to do more than just grant themselves the exclusive right to issue the currency. It’s also important that they don’t promise to convert their currency into something they could run out of (e.g. gold or some other country’s currency). And they need to refrain from borrowing…in a currency that isn’t their own. When a country issues its own nonconvertible (fiat) currency and only borrows in its own currency, that country has attained monetary sovereignty. Countries with monetary sovereignty, then, don’t have to manage their budgets as a household would. They can use their currency-issuing capacity to pursue policies aimed at maintaining a full employment economy. (Kelton 2020, pp. 18–19, bold added)

Countries with a “high degree of monetary sovereignty” include “the US, Japan, the UK, Australia, Canada, and many more” (Kelton 2020, p. 19). (And notice that even these countries weren’t “sovereign” back in the days of the gold standard, because they had to be careful in issuing currency lest they run out of gold.) In contrast, countries like Greece and France today are not monetarily sovereign, because they no longer issue the drachma and franc but instead adopted the euro as their currency.

The insistence on countries issuing debt in their own currency helps to explain away awkward cases such as Venezuela, which is suffering from hyperinflation and yet has the ability to issue its own currency. The answer (from an MMT perspective) is that Venezuela had a large proportion of its foreign-held debt denominated in US dollars, rather than the bolivar, and hence the Venezuelan government couldn’t simply print its way out of the hole. In contrast, goes the MMT argument, the US government owes its debts in US dollars, and so never need worry about a fiscal crisis.

Yes, Kelton Knows about Inflation

At this stage of the argument, the obvious retort for any postpubescent reader will be, “But what about inflation?!” And here’s where the critic of MMT needs to be careful. Kelton repeatedly stresses throughout her book—and I’ve seen her do it in interviews and even on Twitter—that printing money is not a source of unlimited real wealth. She (and Warren Mosler too, as he explained when I interviewed him on my podcast) understands and warns her readers that if the federal government prints too many dollars in a vain attempt to fund too many programs, then the economy will hit its genuine resource constraint, resulting in rapidly rising prices. As Kelton puts it:

Can we just print our way to prosperity? Absolutely not! MMT is not a free lunch. There are very real limits, and failing to identify—and respect—those limits could bring great harm. MMT is about distinguishing the real limits from the self-imposed constraints that we have the power to change. (Kelton 2020, p. 37, bold added)

In other words, when someone like Alexandria Ocasio-Cortez proposes a Green New Deal, from an MMT perspective the relevant questions are not, “Can the Congress afford such an expensive project? Will it drown us in red ink? Are we saddling our grandchildren with a huge credit card bill?” Rather, the relevant questions are, “Is there enough slack in the economy to implement a Green New Deal without reducing other types of output? If we approve this spending, will the new demand largely absorb workers from the ranks of the unemployed? Or will it siphon workers away from existing jobs by bidding up wages?”

The Fundamental Problem with MMT

Now that we’ve set the table, we can succinctly state the fundamental problem with Kelton’s vision: regardless of what happens to the “price level,” monetary inflation transfers real resources away from the private sector and into the hands of political officials. If a government project is deemed unaffordable according to conventional accounting, then it should also be denied funding via the printing press.

What makes MMT “cool” is that it’s (allegedly) based on a fresh insight showing how all of the mainstream economists and bean counters are locked in old habits of thought. Why, these fuddy-duddies keep treating Uncle Sam like a giant corporation, which has to make ends meet and always has to satisfy the bottom line. In contrast, the MMTers understand that the feds can print as many dollars as they want. It’s not revenue but (price) inflation that limits the government’s spending capacity.

I hate to break it to Kelton and the other MMT gurus, but economists—particularly those in the free market tradition—have been teaching this for decades (and perhaps centuries). For example, here’s Murray Rothbard in his 1962 treatise, Man, Economy, and State:

At this time, let us emphasize the important point that government cannot be in any way a fountain of resources; all that it spends, all that it distributes in largesse, it must first acquire in revenue, i.e., it must first extract from the “private sector.” The great bulk of the revenues of government, the very nub of its power and its essence, is taxation, to which we turn in the next section. Another method is inflation, the creation of new money, which we shall discuss further below. A third method is borrowing from the public. (Rothbard 1962, pp. 913–14, bold added)

To repeat, this is standard fare in the lore of free market economics. After explaining that government spending programs merely return resources to the private sector that had previously been taken from it, the economist will inform the public that there are three methods by which this taking occurs: taxation, borrowing, and inflation. The economist will often add that government borrowing can be considered merely deferred taxation, while inflation is merely hidden taxation.

And it’s not merely that inflation is equivalent to taxation. Because it’s harder for the public to understand what’s happening when government money printing makes them poorer, there is a definite sense in which standard taxation is “honest” whereas inflation is insidious. This is why Ludwig von Mises considered inflationary finance to be “essentially antidemocratic”: the printing press allows the government to get away with spending that the public would never agree to explicitly pay for through straightforward tax hikes.

Kelton and other MMT theorists argue that inflation isn’t a problem right now in the US and other advanced economies and so we don’t need to be shy about cranking up the printing press. But whether or not the Consumer Price Index is rising at an “unacceptably” high rate, it is a simple fact that when the government prints an extra $1 million to finance spending, then prices (quoted in US dollars) are higher than they otherwise would have been, and people holding dollar-denominated assets are poorer than they otherwise would have been. Suppose that prices would have fallen in the absence of government money printing. In this case, everybody holding dollar assets would have seen their real wealth go up because of the price deflation. If the government merely prints enough new dollars to keep prices stable, it’s still the case that those original dollar holders end up poorer relative to what otherwise would have happened.

Now to be sure, Kelton and other MMT theorists would object at this point in my argument. They claim that if there is still some “slack” in the economy, in the sense of unemployed workers and factories operating below capacity, then a burst of monetary inflation can put those idle resources to work. Even though the rising prices lead to redistribution, if total output is higher, then per capita output must be higher too. So, on average, the people still benefit from the inflation, right?

On this score, we simply have a disagreement about how the economy works, and in this dispute I think the Austrians are right while the MMTers are wrong. According to Mises’s theory of the business cycle, “idle capacity” in the economy doesn’t just fall out of the sky, but is instead the result of the malinvestments made during the preceding boom. So if we follow Kelton’s advice and crank up the printing press in an attempt to put those unemployed resources back to work, it will simply set in motion another unsustainable boom/bust cycle. In any event, in the real world, government projects financed by inflation won’t merely draw on resources that are currently idle, but will also siphon at least some workers and raw materials out of other, private sector outlets, as I elaborate in this article.

In summary, the fundamental “insight” of MMT—namely, that governments issuing fiat currencies need only fear price inflation, not insolvency—is something that other economists have acknowledged for decades. Where the MMTers do say something different is when they claim that printing money only carries an opportunity cost when the economy is at full employment. But on this point, the MMTers—like their more orthodox cousins the Keynesians—are simply wrong.

Tough Questions for MMT

A standard rhetorical move is for proponents to claim that MMT isn’t ideological, but merely describes how a financial system based on fiat money actually works. (For example, this was the lead argument Mike Norman used when he and I were dueling with YouTube videos.) Yet since so much hinges on whether a government has “monetary sovereignty,” it’s amazing that the MMTers never seem to ask why some governments enjoy this status while others don’t.

For her part, Kelton criticizes certain nonmonetarily sovereign governments for particular actions, such as joining a currency union (Kelton 2020, p. 145), but she doesn’t ask the basic question: Once an MMT economist explains its benefits, why doesn’t every government on earth follow the criteria for becoming a monetary sovereign? Indeed, why don’t all of us as individuals issue our own paper notes—in my case, I’d print RPMs, which has a nice ring to it—and furthermore only borrow from lenders in our own personal currencies? That way, if you fell behind in your mortgage payments, you could simply print up more of your own personal notes to get current with the bank.

Posed in this way, these questions have obvious answers. The reason Greece adopted the euro, why Venezuela borrows so much in US dollar–denominated debt, and why I use dollars rather than conducting transactions in RPMs, is that the rest of the financial community is very leery of the Greek drachma, the Venezuelan bolivar, or the Murphyian RPM note. Consequently, the Greek and Venezuelan governments, as well as I personally, all subordinated our technical freedom to be “monetary sovereigns” and violated one or more of Kelton’s criteria.

In short, the reason most governments in the world (including state governments in the US) aren’t “monetary sovereigns” is that members of the financial community worry that they would abuse a printing press. The Greek government knew its economy would receive more investment, and that it would be able to borrow on cheaper terms, if it abandoned the drachma and adopted the euro. The Venezuelan government knew it could obtain much larger “real” loans if they were denominated in a relatively hard currency like the USD rather than the Venezuelan currency, which could so readily be debased (as history has shown). And I personally can’t interest anybody in financial transactions involving my authentic RPM notes, and so, reluctantly, I have to join the dollar zone.

Now that we’ve covered this basic terrain, I have a follow-up question for the MMT camp: What would it take for a government to lose its monetary sovereignty? In other words, of those governments that are currently monetary sovereigns, what would have to happen in order for the governments to start borrowing in foreign currencies, or for them to tie their own currency to a redemption pledge, or even abandon their own currency and embrace one issued by a foreign entity?

Here again the answer is clear: a government that engaged too recklessly in monetary inflation—thus leading investors to shun that particular “sovereign” currency—would be forced to pursue one or more of these concessions in order to remain part of the global financial community. Ironically, current monetary sovereigns would run the risk of forfeiting their coveted status if they actually followed Stephanie Kelton’s policy advice.

MMT Is Actually Wrong about Money

For a framework that prides itself on neutrally describing the actual operation of money and banking since the world abandoned the gold standard, it’s awkward that MMT is simply wrong about money. In this section I’ll summarize three of the main errors Kelton makes about money.

Money Mistake #1: Contrary to MMT, the Treasury Needs Revenue before It Can Spend

A bedrock claim of the MMT camp is that unlike you, me, and Walmart, the US Treasury doesn’t need to have money before spending it. Here’s an example of Kelton laying out the MMT description of government financing:

Take military spending. In 2019, the House and Senate passed legislation that increased the military budget, approving $716 billion…There was no debate about how to pay for the spending…Instead, Congress committed to spending money it did not have. It can do that because of its special power over the US dollar. Once Congress authorizes the spending, agencies like the Department of Defense are given permission to enter into contracts with companies like Boeing, Lockheed Martin, and so on. To provision itself with F-35 fighters, the US Treasury instructs its bank, the Federal Reserve, to carry out the payment on its behalf. The Fed does this by marking up the numbers in Lockheed’s bank account. Congress doesn’t need to “find the money” to spend it. It needs to find the votes! Once it has the votes, it can authorize the spending. The rest is just accounting. As the checks go out, the Federal Reserve clears the payments by crediting the sellers’ account with the appropriate number of digital dollars, known as bank reserves. That’s why MMT sometimes describes the Fed as the scorekeeper for the dollar. The scorekeeper can’t run out of points. (Kelton 2020, p. 29, bold added)

For a more rigorous, technical treatment, the advanced readers can consult Kelton’s peer-reviewed journal article from the late 1990s on the same issues. Yet whether we rely on Kelton’s pop book or her technical article, the problem for the MMTers is still there: nothing in their description is unique to the US Treasury.

For example, when I write a personal check for $100 to Jim Smith, who also uses my bank, we could explain what happens like this: “Murphy instructed Bank of America to simply add 100 digital dollars to the account of Jim Smith.” Notice that this description is exactly the same thing that Kelton said about the Treasury buying military hardware in the block quotation above. (It’s true that Bank of America can’t create legal tender base money the way the Fed can; I plug that hole in the analogy a bit below with my Goldman Sachs example.)

Now of course, I can’t spend an unlimited amount of dollars, since I’m a currency user, not a monetary sovereign. In particular, if I “instruct” Bank of America to mark up Jim Smith’s checking account balance by more dollars than I have in my own checking account, the bank may ignore my instructions. Or, if my overdraft isn’t too large, the bank might go ahead and honor the transaction but then show that I have a negative balance (and charge me an insufficient funds fee on top of it).

The only difference between my situation and the US Treasury’s is that I have actually bounced checks and online payments before, whereas the US Treasury hasn’t. Indeed, Kelton’s own journal article shows that the Treasury consistently maintained (as of the time of her research) a checking account balance of around $5 billion and that the daily closing amount never dipped much below this level (Kelton 1998, p. 11, figure 4).

Indeed, the Treasury itself sure acts as if it needed revenue before it can spend. That’s why the Treasury secretary engages in all sorts of fancy maneuvers—such as postponing contributions to government employees’ retirement plans—whenever there’s a debt ceiling standoff and Uncle Sam hits a cash crunch.

The MMTers take it for granted that if the Treasury ever actually tried to spend more than it contained in its Fed checking account balance, the Fed would honor the request. Maybe it would, and maybe it wouldn’t; CNBC’s John Carney (who moderated the debate at Columbia University between MMT godfather Warren Mosler and yours truly) thinks it’s an open question in terms of the actual legal requirements, though Carney believes that in practice the Fed would go ahead and cash the check.

Yet, to reiterate, so far the Treasury has never tried to spend money that it didn’t already have sitting in its checking account. The MMT camp would have you believe that there is something special occurring day in and day out when it comes to Treasury spending, but they are simply mistaken: so far, at least, the Treasury has never dared the Fed by overdrawing its account.

Indeed, Kelton herself in her technical article from the late 1990s implicitly gives away the game when she defends the MMT worldview in this fashion:

[S]ince the government’s balance sheet can be considered on a consolidated basis, given by the sum of the Treasury’s and Federal Reserve’s balance sheets with offsetting assets and liabilities simply canceling one another out…the sale of bonds by the Treasury to the Fed is simply an internal accounting operation, providing the government with a self-constructed spendable balance. Although self-imposed constraints may prevent the Treasury from creating all of its deposits in this way, there is no real limit on its ability to do so. (Kelton 1998, p. 16, italics in original)

What Kelton writes here is true, but by the same token we can consider the Federal Reserve and Goldman Sachs balance sheets on a consolidated basis. If we do that, then Goldman Sachs can now spend an infinite amount of money. Sure, its accountants might still construct profit and loss statements and warn about bad investments, but these are self-imposed constraints; so long as the Fed in practice will honor any check Goldman Sachs writes, then all overdrafts are automatically covered by an internal loan from the Fed to the investment bank. The only reason this wouldn’t work is if the Fed actually stood up to Goldman and said no. But that’s exactly what the situation is with respect to the Treasury too.

Whenever I argue the merits of MMT, I debate whether or not to bring up this particular quibble, because wondering whether the Fed would actually cover a Treasury overdraft doesn’t get to the essence of what’s wrong with MMT. I’m actually sympathetic to the MMT claims that the Fed would be obligated to backstop the Treasury in all circumstances; it would be very naïve to think that the Fed actually enjoys “independence” from the federal government that grants the central bank its power. Furthermore, I believe that the various rounds of quantitative easing (QE) during the Obama years weren’t merely driven by a desire to minimize the output gap, but instead were necessary to help monetize the boatloads of new federal debt being issued. (Of course Trump and Powell are performing a similar dance.)

Even so, I think it’s important for the public to realize that the heroes of MMT are misleading them when they claim there is something unique to Uncle Sam in the way he interacts with his banker. So far, this is technically not the case. Even when the Fed has clearly been monetizing new debt issuance—such as during the world wars—all of the players involved have technically gone through the motions of having the Treasury first float bonds to fill its coffers with borrowed funds and only then spend the money. The innocent reader wouldn’t know this if he or she relied on the standard MMT accounts of how the world works.

Money Mistake #2: Contrary to MMT, Taxes Don’t Prop Up (Most) Currencies

Another central mistake in the MMT approach is its theory of the origin and value of money. (If you want to see the Austrian view, consult my article on the contributions of Menger and Mises.) To set the stage, here is Kelton explaining how Warren Mosler stumbled upon the worldview that would eventually be dubbed modern monetary theory:

Mosler is considered the father of MMT because he brought these ideas to a handful of us in the 1990s. He says…it just struck him after his years of experience working in financial markets. He was used to thinking in terms of debits and credits because he had been trading financial instruments and watching funds transfer between bank accounts. One day, he started to think about where all those dollars must have originally come from. It occurred to him that before the government could subtract (debit) any dollars away from us, it must first add (credit) them. He reasoned that spending must have come first, otherwise where would anyone have gotten the dollars they needed to pay the tax? (Kelton 2020, p. 24)

This MMT understanding ties in with its view of the origin of money and how taxes give money its value. Kelton explains by continuing to summarize what she learned from Mosler:

[A] currency-issuing government wants something real, not something monetary. It’s not our tax money the government wants. It’s our time. To get us to produce things for the state, the government invents taxes…This isn’t the explanation you’ll find in most economics textbooks, where a superficial story about money being invented to overcome the inefficiencies associated with bartering…is preferred. In that story, money is just a convenient device that sprang up organically as a way to make trade more efficient. Although students are taught that barter was once omnipresent, a sort of natural state of being, scholars of the ancient world have found little evidence that societies were ever organized around barter exchange.

MMT rejects the ahistorical barter narrative, drawing instead on an extensive body of scholarship known as chartalism, which shows that taxes were the vehicle that allowed ancient rulers and early nation-states to introduce their own currencies, which only later circulated as a medium of exchange among private individuals. From inception, the tax liability creates people looking for paid work…in the government’s currency. The government…then spends its currency into existence, giving people access to the tokens they need to settle their obligations to the state. Obviously, no one can pay the tax until the government first supplies its tokens. As a simple point of logic, Mosler explained that most of us had the sequencing wrong. Taxpayers weren’t funding the government; the government was funding the taxpayers. (Kelton 2020, pp. 26–27, bold added)

I have included these lengthy quotations to be sure the reader understands the superficial appeal of MMT. Isn’t that intriguing—Mosler argues that the government funds the taxpayers! And when you think through his simple point about debits and credits, it seems that he isn’t just probably correct, but that he must be correct.

Again, it’s a tidy little demonstration. The only problem is that it’s demonstrably false. It is simply not true that dollars were invented when some autocratic ruler out of the blue imposed taxes on a subject population, payable only in this new unit called “dollar.” The MMT explanation of where money comes from doesn’t apply to the dollar, the euro, the yen, the pound…Come to think of it, I don’t believe the MMT explanation applies even to a single currency issued by a monetary sovereign. All of the countries that currently enjoy monetary sovereignty have built their economic strength and goodwill with investors by relying on a history of hard money.

In a review of Kelton’s book, I’m not going to delve into the problems with the alleged anthropological evidence that purportedly shows that ancient civilizations used money that was invented by political fiat, rather than money that emerged spontaneously from trade in commodities. For that topic, I refer the interested reader to my review of David Graeber’s book.

Yet let me mention before leaving this subsection that the MMT story at best only explains why a currency has a nonzero value; it doesn’t explain the actual amount of its purchasing power. For example, if the IRS declares that every US citizen must pay $1,000 in a poll tax each year, then it’s true, US citizens will need to obtain the requisite number of dollars. But they could do so whether the average wage rate were $10 per hour or $10,000 per hour, and whether a loaf of bread cost $1 or $1,000.
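To put numbers on this point (a purely hypothetical illustration, with made-up wage figures), a fixed $1,000 tax liability is compatible with radically different price levels, so the tax can explain why dollars are demanded at all, but not what a dollar actually buys:

```python
# Hypothetical illustration: a fixed $1,000 annual poll tax creates a demand
# for dollars at ANY price level, so it cannot pin down purchasing power.
poll_tax = 1_000  # dollars owed per citizen per year (hypothetical figure)

for hourly_wage in (10, 10_000):
    hours_of_work_to_pay_tax = poll_tax / hourly_wage
    print(f"Wage ${hourly_wage}/hr -> tax costs {hours_of_work_to_pay_tax} hours")
# At $10/hr the tax costs 100 hours of labor; at $10,000/hr it costs 0.1 hours.
# In both worlds citizens need dollars, yet the dollar's real value differs
# by a factor of 1,000.
```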

Furthermore, other things equal, if the government lowers tax rates, then it strengthens the currency. That’s surely part of the reason that the US dollar rose some 50 percent against other currencies after the tax rate reductions in the early Reagan years. So the MMT claim that taxes are necessary, not to raise revenue (we have a printing press for that), but to prop up the value of the currency, is at best seriously misleading.

Money Mistake #3: MMT Confuses Debt with Money

Amazingly, even though their system claims to explain how money works, the MMTers apparently don’t know the simple difference between money and debt. Here’s Kelton trying to defuse hysteria over the national debt:

The truth is, we’re fine. The debt clock on West 43rd Street simply displays a historical record of how many dollars the federal government has added to people’s pockets without subtracting (taxing) them away. Those dollars are being saved in the form of US Treasuries. If you’re lucky enough to own some, congratulations! They’re part of your wealth. While others may refer to it as a debt clock, it’s really a US dollar savings clock. (Kelton 2020, pp. 78–79, bold added)

The part I’ve put in bold in the quotation above is simply wrong. And I don’t mean, “It’s wrong according to Austrian economics but right according to MMT.” No, even in the MMT framework, Kelton’s claim about the national debt is wrong. The outstanding federal debt would only correspond to “how many dollars [have been] added to people’s pockets without subtracting…them away” to the extent that the Federal Reserve had monetized the debt by taking the Treasury securities onto its own balance sheet. But to the extent that some of the outstanding Treasury debt is currently held by individuals and entities that aren’t the Federal Reserve, Kelton’s statement is simply wrong.

In the MMT framework, federal government spending creates new dollars, while taxing destroys them. Since a federal budget deficit refers to a situation where Uncle Sam spends more than he taxes, it’s understandable why Kelton concluded that the federal debt—which reflects the cumulative history of the net budget deficits and surpluses over time—is equal to the net number of dollars that Uncle Sam “spent into existence.”

But to repeat, this is wrong. Kelton forgot that when the Treasury floats new bonds, that action (in the MMT framework) also destroys dollars by removing them from the hands of the public. So if all of the outstanding Treasury debt were held by the public (or foreign central banks), then the cumulative federal budget deficits wouldn’t correspond to any net dollar creation, even in the MMT framework.

Stay with me; we have one more step: in the MMT framework (and the Austrian framework too, for that matter), when the Federal Reserve buys outstanding Treasury securities in the secondary market and takes them onto its balance sheet, this creates new dollars. Therefore, to the extent that the outstanding Treasury securities are sitting on the Fed’s balance sheet, then that portion of the national debt would correspond to “how many dollars [have been] added to people’s pockets without subtracting…them away.”

Does the reader see how cumbersome the MMT framework is? It led its chief proponent to make an elementary mistake in her attempt to explain the basics to the public. In contrast, coming from an orthodox background, I immediately knew Kelton’s claim was wrong, because borrowing money per se doesn’t create money. This is true whether corporations do it or whether Uncle Sam does it. (Just imagine an economy with $1 billion in actual currency, where the Treasury keeps issuing new $1 billion bonds to borrow that same pile of green paper and spend it again and again. This procedure would run up the national debt as much as we want, but at any moment there would still be only the same $1 billion in currency.)
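The parenthetical thought experiment can be traced in a toy ledger (hypothetical figures, purely for illustration):

```python
# Toy ledger for the thought experiment above: the Treasury repeatedly borrows
# the same $1 billion of currency and respends it. Debt grows without bound,
# but the money stock never changes. All figures are hypothetical.

currency_held_by_public = 1_000_000_000  # the fixed pile of green paper
national_debt = 0

for _ in range(5):  # five rounds of borrow-and-respend
    currency_held_by_public -= 1_000_000_000  # public buys a new $1B bond
    national_debt += 1_000_000_000
    currency_held_by_public += 1_000_000_000  # Treasury spends the proceeds

print(national_debt)            # debt has grown to $5 billion
print(currency_held_by_public)  # still the original $1 billion in currency
```

Borrowing merely shuffles existing dollars; only a central bank purchase of the bonds would add new ones.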

To drive home just how confused Kelton is on the difference between US Treasuries and US dollars, later in the book she writes, “Heck, I don’t even think we should be referring to the sale of US Treasuries as borrowing or labeling the securities themselves as the national debt. It just confuses the issue and causes unnecessary grief” (Kelton 2020, p. 81).

Here’s another way I can demonstrate that Kelton’s discussion is obviously missing something: if Kelton were right and the US national debt were a tally of how many dollars on net the government has “spent into existence,” then when Andrew Jackson paid off the national debt, the American people would have had no money—the last dollar would have been destroyed. And yet even Kelton doesn’t claim that dollars were temporarily banished from planet Earth; she merely claims that Jackson’s policy caused a depression. (For the Austrian take on this historical episode, see this article.)

For an even starker illustration of the MMT confusion between debt and money, consider Kelton’s approving quotations of a thought experiment from Eric Lonergan, who asked, “What if Japan monetized 100% of outstanding JGBs [Japanese government bonds]?” That is, what if the Bank of Japan issued new money in order to buy up every last Japanese government bond on earth? Lonergan argues that “nothing would change,” because the private sector’s wealth would be the same; the BOJ would merely have engaged in an asset swap. In fact, according to Lonergan, because their interest income would now be lower while their wealth stayed the same, people in the private sector would spend less after the total debt monetization (!).

In response to these claims, I make a simple point: you can’t spend Japanese government bonds in the grocery store. That’s why money and debt are different things. If Lonergan were correct, then we could also go the other way: specifically, if the Japanese government issued enough bonds to absorb every last yen on planet Earth, then apparently Lonergan would have to say that aggregate demand measured in yen would go through the roof. Yet how could it, if nobody held any yen anymore? Remember, you can’t pay your rent or buy groceries with government bonds.
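The “asset swap” can be made concrete with a toy balance sheet (the figures are my own illustrative assumptions):

```python
# Toy version of Lonergan's 100% monetization: the BOJ buys every JGB
# with newly issued yen. Measured total wealth is unchanged -- but the
# composition shifts entirely from bonds to money, and only money can
# be spent at the grocery store. That composition is what the "mere
# asset swap" framing waves away.

private_sector = {"yen": 100, "jgbs": 500}  # trillions, hypothetical

boj_purchase = private_sector["jgbs"]       # monetize 100% of JGBs
private_sector["jgbs"] -= boj_purchase
private_sector["yen"] += boj_purchase

assert sum(private_sector.values()) == 600  # total "wealth" unchanged...
print(private_sector)                       # ...but it is now all money
```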

Do Government Deficits = Private Savings?

In chapter 4, Kelton lays out the MMT case that government deficits, far from “crowding out” private sector saving, actually are the sole source of net private assets. Using simple accounting tautologies, Kelton seems to demonstrate that the only way the nongovernment sector can run a fiscal surplus is if the government sector runs a fiscal deficit.

Going the other way, when the government is “responsible” by running a budget surplus and starts paying down its debt, by sheer accounting we see that this must be reducing net financial assets held by the private sector. (This is why it should come as no surprise, Kelton argues, that every major government surplus led to a bad recession [Kelton 2020, p. 96].)

In the present review, I won’t carefully critique this particular argument, as I’ve done so in this article. Suffice it to say that you could replace “government” in the MMT argument with any other entity and achieve the same outcome. For example, if Google borrows $10 million by issuing corporate bonds and then spends the money, the net financial assets held by The World Except Google go up by precisely $10 million. (Or rather, the way you have to define terms in order to make this claim true is the same way Kelton gets the MMT claims about Uncle Sam to go through.) So did I just prove something really important about Google’s finances?
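The tautology can be spelled out mechanically; the function below is my own sketch, with the carved-out entity left as a parameter to show that nothing about the identity is specific to government:

```python
# By double-entry bookkeeping, whatever entity we carve out of the
# economy, its financial balance and the rest of the world's are
# mirror images -- they sum to zero by construction. Names and
# figures are illustrative.

def balances(carved_out_income: int, carved_out_spending: int):
    """Return (entity balance, rest-of-world balance)."""
    entity = carved_out_income - carved_out_spending
    return entity, -entity

# Carve out Uncle Sam: a $10M deficit is a $10M rest-of-world surplus.
gov, private = balances(carved_out_income=0, carved_out_spending=10_000_000)

# Carve out Google instead: same arithmetic, same "profound" result.
goog, world_except_google = balances(0, 10_000_000)

assert private == world_except_google == 10_000_000
assert gov == goog == -10_000_000
```

The identity is true but empty: it holds for Google, for a lemonade stand, or for any other borrower, which is why it cannot carry the policy weight MMT places on it.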

Obviously something is screwy here. Using standard definitions, people in the private sector can save, and even accumulate net financial wealth, without considering the government sector at all. (I spell all of this out in this article.) For example, Robinson Crusoe on his deserted island can “save” out of his coconut income in order to finance his investment of future labor hours into a boat and net. Even if we insist on a modern financial context, individuals can acquire shares of equity in new corporations, thus acquiring assets that don’t correspond to a “debit” of anyone else.

It is a contrived and seriously misleading use of terminology when MMT proponents argue that government deficits are a source of financial wealth for the private sector. Forget the accounting and look at the big picture: even if the central bank creates a new $10 million and simply hands it to Jim Smith for free, it hasn’t made the community $10 million richer—except in the nominal sense in which we could all be “millionaires” with this practice. Mere money creation doesn’t make any more houses or cars or acres of arable farmland available. Printing new money doesn’t make the community richer. At best it’s a wash with redistribution, and in fact in practice it makes the community poorer by distorting the ability of prices to guide economic decisions.

The MMT Job Guarantee

The last item I wish to discuss is the MMT job guarantee. Strictly speaking, this proposal is distinct from the general MMT framework, but in practice I believe every major MMT theorist endorses some version of it.

Under Kelton’s proposal, the federal government would have a standing offer to employ any worker at $15 per hour (Kelton 2020, p. 68). This would set a floor under all other jobs; Kelton likens it to the Federal Reserve setting the federal funds rate, which then becomes the base rate for every other interest rate in the economy.

Kelton argues that her proposal would eliminate the unnecessary slack in our economic system, where millions of workers languish in involuntary unemployment. Furthermore, she claims that her job guarantee would raise the long-term productivity of the workforce and even help people find better private sector job placement. This is because currently “Employers just don’t want to take a chance on hiring someone who has no recent employment record” (ibid., p. 68).

There are several problems with this proposal. First of all, why does Kelton assume that it would only draw workers out of the ranks of the unemployed? For example, suppose Kelton set the pay at $100 per hour. Surely even she could see the problem here, right? Workers would be siphoned out of productive private sector employment and into the government realm, providing dubious service at best at the direction of political officials.

Second, why would employers be keen on hiring someone who has spent, say, the last three years working in the guaranteed job sector? These would be, by design, the cushiest jobs in America. Kelton admits this when she says that the base wage rate would be the floor for all other jobs.

Looking at it another way, it’s not really a job guarantee if it’s difficult to maintain the position. In other words, if the people running the federal jobs program are allowed to fire employees who show up drunk or who are simply awful workers, then it’s no longer a guarantee.


Stephanie Kelton’s new book The Deficit Myth does a very good job explaining MMT to new readers. I must admit that I was pleasantly surprised at how many different topics Kelton could discuss from a fresh perspective, in a manner that was simultaneously absurd and yet apparently compelling.

The problem is that Kelton’s fun book is utterly wrong. The boring suits with their standard accounting are correct: it actually costs something when the government spends money. The fact that since 1971 we have had an unfettered printing press doesn’t give us more options, it merely gives the Fed greater license to cause boom/bust cycles and redistribute wealth to politically connected insiders.

Robert P. Murphy is a Senior Fellow with the Mises Institute. He is the author of many books. His latest is Contra Krugman: Smashing the Errors of America’s Most Famous Keynesian. His other works include Chaos Theory, Lessons for the Young Economist, and Choice: Cooperation, Enterprise, and Human Action (Independent Institute, 2015), which is a modern distillation of the essentials of Mises’s thought for the layperson. Murphy is cohost, with Tom Woods, of the popular podcast Contra Krugman, which is a weekly refutation of Paul Krugman’s New York Times column. He is also host of The Bob Murphy Show. This article was originally featured at the Ludwig von Mises Institute and is republished with permission.

Colorado Takes Action, Ends Qualified Immunity For Police


When George Floyd was killed last month, the nation was shaken out of its slumber in regard to police brutality in this country. Cities quite literally burned over the anger that has been boiling up over decades as cops kill people—who are often innocent, unarmed, and even children—and get away with it. Sadly, however, the organized groups behind the protests only appear to be pushing a single, partial solution of “defunding” the police. While this is certainly something to be considered, it is a bandaid on a sucking chest wound. To strike the root of the problem, we need bad cops held accountable. One major way to do this is by ending Qualified Immunity. Luckily, this idea is now picking up steam.

This month, Colorado Governor Jared Polis signed an omnibus reform bill into law to end qualified immunity for police officers in the state.

“This is a long overdue moment of national reflection,” Polis said at the signing ceremony. “This is a meaningful, substantial reform bill.”

A summary of the sea change from the Colorado legislature notes:

The bill allows a person who has a constitutional right secured by the bill of rights of the Colorado constitution that is infringed upon by a peace officer to bring a civil action for the violation. A plaintiff who prevails in the lawsuit is entitled to reasonable attorney fees, and a defendant in an individual suit is entitled to reasonable attorney fees for defending any frivolous claims. Qualified immunity and a defendant’s good faith but erroneous belief in the lawfulness of his or her conduct are not defenses to the civil action. The bill requires a political subdivision of the state to indemnify its employees for such a claim; except that if the peace officer’s employer determines the officer did not act upon a good faith and reasonable belief that the action was lawful, then the peace officer is personally liable for 5 percent of the judgment or $25,000, whichever is less, unless the judgment is uncollectible from the officer, in which case the officer’s employer satisfies the whole judgment.

The precedent setting law reads in part:

A peace officer…employed by a local government who, under color of law, subjects or causes to be subjected, including failing to intervene, any other person to the deprivation of any individual rights that create binding obligations on government actors secured by the bill of rights, Article II of the State Constitution, is liable to the injured party for legal or equitable relief or any other appropriate relief…

qualified immunity is not a defense to liability pursuant to this section.

“Colorado has passed historic civil rights legislation, which both allows citizens to bring civil rights claims against police officers who violate their rights under the Colorado Constitution, and also clarifies that qualified immunity will not be a defense to any such claims,” the Cato Institute’s Project on Criminal Justice policy analyst Jay Schweikert, an expert on qualified immunity, told Law&Crime. “While this law doesn’t affect the availability of qualified immunity in federal cases, it does ensure that Coloradans who are the victims of police misconduct will have a meaningful remedy in state court.”

As TFTP has reported, when it comes to police accountability, one overarching question remains. ‘Do we want to live in a society whereby law enforcement officials can completely violate a person’s constitutional rights and get away with it?’ For our society to be free, the answer to that question must be a resounding, powerful, unwavering, ‘Hell No!’

Unfortunately, however, this is the case most of the time thanks to law enforcement personnel’s use and abuse of Qualified Immunity.

Qualified immunity is a legal doctrine in United States federal law that shields government officials from being sued for discretionary actions performed within their official capacity, unless their actions violated “clearly established” federal law or constitutional rights.

The Supreme Court created qualified immunity in 1982. With that novel invention, the court granted all government officials immunity for violating constitutional and civil rights unless the victims of those violations can show that the rights were “clearly established.”

As Anya Bidwell points out, although innocuous sounding, the clearly established test is a legal obstacle nearly impossible to overcome. It requires a victim to identify an earlier decision by the Supreme Court, or a federal appeals court in the same jurisdiction, holding that precisely the same conduct under the same circumstances is illegal or unconstitutional. If none exists, the official is immune. Whether the official’s actions are unconstitutional, intentional, or malicious is irrelevant to the test.

An example of this would be the family of George Floyd attempting to seek compensation for his death. Because no court has ever “clearly established” that a cop kneeling on a man’s neck until he dies is unconstitutional, a judge in Minnesota could easily dismiss their case.

It is essentially a get out of jail free card for cops and it perpetuates the problem of police violence by giving bad cops a free pass.

Steps like this in Colorado are essential to reining in the terror of bad cops. This is why everyone needs to call their representatives and tell them to support the bill proposed by Libertarian Congressman Justin Amash (L-Michigan), H.R. 7085, which would end Qualified Immunity on a national level.

“Qualified immunity protects police and other officials from consequences even for horrific rights abuses,” said Amash. “It prevents accountability for the ‘bad apples’ and undermines the public’s faith in law enforcement. It’s at odds with the text of the law and the intent of Congress, and it ultimately leaves Americans’ rights without appropriate protection. Members of Congress have a duty to ensure government officials can be held accountable for violating Americans’ rights, and ending qualified immunity is a crucial part of that.”

If you are interested in other paradigm-shifting solutions for quelling police brutality and the deprivation of Americans’ rights, we propose five major reforms, including ending Qualified Immunity, that would bring drastic change. You can read that here.

Matt Agorist is an honorably discharged veteran of the USMC and former intelligence operator directly tasked by the NSA. This prior experience gives him unique insight into the world of government corruption and the American police state. Agorist has been an independent journalist for over a decade and has been featured on mainstream networks around the world. Agorist is also the Editor-at-Large at The Free Thought Project. Follow @MattAgorist on Twitter, Steemit, and now on Minds. This article was originally featured at The Free Thought Project and is republished with permission.

There Are Solutions Besides ‘Defund the Police’


“Defund the Police” is the latest rallying cry for protestors in many cities across the nation. Many activists, enraged by the brutal killing of George Floyd by Minneapolis police, are calling for completely disbanding the police, while others are seeking reductions in police budgets and more government spending elsewhere. However, few activists appear to be calling for a fundamental decrease in the political power that is the root cause of police abuses.

Many “Defund the Police” activists favor ending the war on drugs. That would be a huge leap forward toward making police less intrusive and oppressive. But even if police were no longer making a million plus drug arrests each year, they would still be making more than 9 million other arrests. Few protestors appear to favor the sweeping repeals that could take tens of millions of Americans out of the legal crosshairs.

How many of the “Defund the Police” protestors would support repealing mandatory seatbelt laws as a step toward reducing police power? In 2001, the Supreme Court ruled that police can justifiably arrest anyone believed to have “committed even a very minor criminal offense.” That case involved Gail Atwater, a Texas mother who was driving slowly near her home but, because her children were not wearing seatbelts, was taken away by an abusive cop whose shouting left her children “terrified and hysterical.” A majority of Supreme Court justices recognized that “Atwater’s claim to live free of pointless indignity and confinement clearly outweighs anything the City can raise against it specific to her case”—but upheld the arrest anyhow. Justice Sandra Day O’Connor warned that “such unbounded discretion carries with it grave potential for abuse.”

Unfortunately, there are endless pretexts for people to be arrested nowadays, because federal, state, and local politicians and officials have criminalized daily life with hundreds of thousands of edicts. As Gerard Arenberg, executive director of the National Association of Chiefs of Police, told me in 1996, “We have so damn many laws, you can’t drive the streets without breaking the law. I could write you a hundred tickets depending on what you said to me when I stopped you.”

What about repealing state laws that make parents criminals if they smoke a cigarette while driving little Bastian or Alison to soccer practice? What about repealing the federal law that compelled states to criminalize anyone drinking one beer in their car—or, better yet, repealing the federal law that compelled states to raise the age for drinking alcohol to twenty-one? Or would today’s enraged reformers prefer to take the risk of cops beating the hell out of any twenty-year-old caught with a Bud Light?

Would feminist zealots calling to “Defund the Police” be willing to tolerate the legalization of sex work? That would mean they could no longer howl about vast “human trafficking” conspiracies exploiting young girls every time an undercover cop is illicitly groped by a 58-year-old Chinese woman in a massage parlor.

Some Black Lives Matter activists are calling for a ban on “stop and frisk” warrantless searches for drugs, guns, or other prohibited items. But some “Defund the Police” activists also favor government prohibitions of private firearms. It is as if they were seeking to formally enact the old slogan: “When guns are outlawed, only outlaws will have guns.”

Much of the media coverage is whooping up the recent wave of protests, perhaps hoping to stir public rage to support sweeping new government edicts. According to Washington Post assistant editor Robert Gebelhoff,

It would be a mistake to try to resolve the problems with police behavior without also acknowledging and addressing America’s epidemic of gun violence. Police reform and gun reform go hand in hand. Reducing the easy availability of guns would not eliminate the problems with policing in America nor end unwarranted killings, but it would help.

After heavily armed government agents forcibly confiscate a couple hundred million privately owned guns, the police won’t worry about any resistance and can behave like perfect gentlemen. Repealing most gun laws would produce a vast increase in self-reliance, especially in urban areas where police dismally fail to protect residents. But few street protestors are making that demand.

Many “Defund the Police” advocates presume that poverty is the cause of crime and that shifting tax dollars from police budgets to social programs and handouts will automatically reduce violence. The Great Society programs launched by President Lyndon Johnson vastly increased handouts on a similar assumption. Instead, violent crime skyrocketed, especially in inner cities where dependence on government aid was highest. “The increase in arrests for violent crimes among blacks during the 1965–70 period was seven times that of whites,” as Charles Murray noted in his 1984 book Losing Ground.

Many advocates of defunding the police believe that a universal basic income, along with free housing and other services, would practically end urban strife. The history of Section 8 housing subsidies provides a stunning rebuke to such naïve assumptions. Concentrations of Section 8 recipients routinely spur crime waves that ravage both the peace and property values of their neighbors. A 2009 study published in the Homicide Studies academic journal found that in Louisville, Memphis, and other cities violent crime skyrocketed in neighborhoods where Section 8 recipients resettled after leaving public housing.

“Defund the Police” demands are already being translated by politicians into a justification for additional spending for social services or the usual sops. In Montgomery County, Maryland, police chiefs issued a statement announcing that they were “outraged” over George Floyd’s killing and then pledged to “improve training in cultural competency for our officers.” Elsewhere, politicians and police chiefs are talking about relying more on mental health workers to handle volatile situations. Radio host Austin Petersen predicted that the George Floyd protest “reforms” would result in “more social programs meant to give jobs to liberal white women.” Author and filmmaker Peter Quinones deftly captured the likely reality with a meme where Minneapolis police were renamed the Tactical Social Workers and still looking hungry to kick ass.

Politicians are claiming to have seen the light thanks to the Floyd protests. But Floyd was killed because politicians at many state and local levels have dismally failed to constrain the lethal power of police. There was nothing to stop politicians from banning the vast majority of no-knock raids, or torpedoing the perverse “qualified immunity” doctrine concocted by the Supreme Court, or repealing the even more perverse “Law Enforcement Officers’ Bill of Rights” that can convey a license to kill. One of the most powerful members of the House of Representatives, Eliot Engel (D-NY), embodied the political reality when he was caught on a hot mike: “If I didn’t have a primary, I wouldn’t care” about denouncing the George Floyd killing. It is unclear how much longer other politicians will pretend to give a damn.

Police have too much power, because politicians have too much power. There is little chance that the George Floyd protests and riots will reverse the criminalization of daily life. How many “Defund the Police” activists are also calling for a radical rollback of politicians’ prerogatives to punish almost any activity they disapprove of? There will be some reforms and plenty of promises, but as long as cops have pretexts to harass and assail millions of peaceful Americans every day, the outrages will not end. Until protestors realize that the problem is Leviathan, not the local police chief, oppression will continue.

This article was originally featured at the Ludwig von Mises Institute and is republished with permission.

Case Study: How Uruguay Resisted the Pandemic and Saved Its Economy


We are not talking enough about Uruguay. That small South American country boasts impressive results in its handling of the coronavirus. It is also signaling that it wants to prosper and that it understands more freedom might be the way to go about it.

Under President Luis Lacalle Pou, Uruguay has suffered a very low number of deaths from coronavirus (23 as of June 15) and the number of confirmed cases (848) is small. At no point did the government decree a national quarantine, preferring instead to let individual responsibility, guided by accurate and transparent information that originated from a team of scientists and experts, do the trick.

Rather than shut down the economy (80 percent of it kept going) and send the police or the military to arrest people, as was done in some other countries, the authorities, in coordination with civil society, put an emphasis on testing (proportionally, they are only behind South Korea in the number of tests as a percentage of confirmed cases) and briefly isolating those who had Covid-19. The external borders were shut, but the internal borders were kept open.

Read the rest of this article at the Independent Institute.

The Federal Reserve is Getting Desperate


In a sign that the Federal Reserve is growing increasingly desperate to jump-start the economy, the Fed’s Secondary Market Corporate Credit Facility has begun purchasing individual corporate bonds. The Secondary Market Corporate Credit Facility was created by Congress as part of a coronavirus stimulus bill to purchase as much as 750 billion dollars of corporate credit. Until last week, the facility had limited its purchases to exchange-traded funds, which are bundled groups of stocks or bonds.

The bond purchasing initiative, like all Fed initiatives, will fail to produce long-term prosperity. These purchases distort the economy by increasing the money supply and thus lowering interest rates, which are the price of money. In this case, the Fed’s purchase of individual corporate bonds enables select corporations to pursue projects for which they could not otherwise have obtained funding. This distorts signals sent by the market, making these companies seem like better investments than they actually are and thus allowing these companies to attract more private investment. This will cause these companies to experience a Fed-created bubble. Like all Fed-created bubbles, the corporate bond bubble will eventually burst, causing businesses to collapse, investors to lose their money (unless they receive a government bailout), and workers to lose their jobs.

Under the law creating the lending facilities, the Fed does not have to reveal the purchases made by the new facilities. Instead of allowing the Fed to hide this information, Congress should immediately pass the Audit the Fed bill so people can know whether a company is flush with cash because private investors determined it is a sound investment or because the Fed chose to “invest” in its bonds.

The Fed could, and likely will, use this bond buying program to advance political goals. The Fed could fulfill Chairman Jerome Powell’s stated desire to do something about climate change by supporting “green energy” companies. The Fed could also use its power to reward businesses that, for example, support politically correct causes, refuse to sell guns, require their employees and customers to wear masks, or promote unquestioning obedience to the warfare state.

Another of the new lending facilities is charged with purchasing the bonds of cash-strapped state and local governments. This could allow the Fed to influence the policies of these governments. It is not wise to reward spendthrift politicians with a federal bailout — whether through Congress or through the Fed.

With lending facilities giving the Federal Reserve the ability to hand money directly to businesses and governments, the Fed is now just one step away from implementing Ben Bernanke’s infamous suggestion that, if all else fails, the Fed can drop money from a helicopter. These interventions will not save the economy. Instead, they will make the inevitable crash more painful. The next crash could bring about the end of the fiat monetary system. The question is not if the current monetary system ends, but when. The only way Congress can avoid the Fed causing another great depression is to begin transitioning to a free-market monetary system by auditing, then ending, the Fed.

Reprinted with permission from the Ron Paul Institute.

5 Things I Learned Debating the Harvard Prof Who Called for a ‘Presumptive Ban’ on Homeschooling


It’s not just about homeschooling.

On Monday, I debated the Harvard professor who proposes a “presumptive ban” on homeschooling. Thousands of viewers tuned in to watch the live, online discussion hosted by the Cato Institute. With 1,000 submitted audience questions, the 90-minute webinar only scratched the surface of the issue about who is presumed to know what is best for children: parents or the state. Here is the replay link in case you missed it.

Last week, I outlined much of my argument against Harvard Law School professor Elizabeth Bartholet that I incorporated into our debate, but here are five takeaways from Monday’s discussion:

While this event was framed as a discussion about homeschooling, including whether and how to regulate the practice, it is clear that homeschooling is just a strawman. The real issue focuses on the role of government in people’s lives, and in particular in the lives of families and children. In her 80-page Arizona Law Review article that sparked this controversy, Professor Bartholet makes it clear that she is seeking a reinterpretation of the US Constitution, which she calls “outdated and inadequate,” to move from its existing focus on negative rights, or individuals being free from state intervention, to positive rights where the state takes a much more active role in citizens’ lives.

During Monday’s discussion, Professor Bartholet explained that “some parents can’t be trusted to not abuse and neglect their children,” and that is why “kids are going to be way better off if both parent and state are involved.” She said her argument focuses on “the state having the right to assert the rights of the child to both education and protection.” Finally, Professor Bartholet said that it’s important to “have the state have some say in protecting children and in trying to raise them so that the children have a decent chance at a future and also are likely to participate in some positive, meaningful ways in the larger society.”

It’s true that the state has a role in protecting children from harm, but does it really have a role in “trying to raise them”? And if the state does have a role in raising children to be competent adults, then the fact that two-thirds of US schoolchildren are not reading proficiently, and more than three-quarters are not proficient in civics, should cause us to be skeptical about the state’s ability to ensure competence.

I made the point on Monday that we already have an established government system to protect children from abuse and neglect. The mission of Child Protective Services (CPS) is to investigate suspected child abuse and punish perpetrators. CPS is plagued with problems and must be dramatically reformed, but the key is to improve the current government system meant to protect children rather than singling out homeschoolers for additional regulation and government oversight. This is particularly true when there is no compelling evidence that homeschooling parents are more likely to abuse their children than non-homeschooling parents, and some research to suggest that homeschooling parents are actually less likely to abuse their children.

Additionally, and perhaps most disturbingly, this argument for more state involvement in the lives of homeschoolers ignores the fact that children are routinely abused in government schools by government educators, as well as by school peers. If the government can’t even protect children enrolled in its own heavily regulated and surveilled schools, then how can it possibly argue for the right to regulate and monitor those families who opt out?

Of all the recommendations included in the Harvard professor’s proposed presumptive ban on homeschooling, the one that caused the most uproar among both homeschoolers and libertarians was the call for regular home visits of homeschooling families, with no evidence of wrongdoing.

In my remarks during Monday’s debate, I included a quote from a Hispanic homeschooling mother in Connecticut who was particularly angry and concerned about imposing home visits on homeschooling families. (According to federal data, Hispanics make up about one-quarter of the overall US homeschooling population, mirroring their representation in the general US K-12 school-age population.) She made the important point that minority families are increasingly choosing homeschooling to escape discrimination and an inadequate academic environment in local schools. She also pointed out that, tragically, it is often minorities who are most seriously impacted by these seemingly well-meaning government regulations. Writing to me about Professor Bartholet’s recommendation, she said:

“To state that they want to have surveillance into our homes by having government officials visit, and have parents show proof of their qualified experience to be a parent to their own child is yet another way for local and federal government to do what they have done to native Americans, blacks, the Japanese, Hispanics, etc in the past. Her proposal would once again interfere and hinder a certain population from progressing forward.”

Anyone who cares about liberty and a restrained government should be deeply troubled by the idea of periodic home visits by government agents on law-abiding citizens.

Despite the landmark 1925 US Supreme Court decision in Pierce v. Society of Sisters, which ruled it unconstitutional to ban private schools, there remains lingering support for limiting or abolishing private education and forcing all children to attend government schools. Homeschooling is just one form of private education.

In her law review article, Professor Bartholet recommends “private school reform,” suggesting that private schools may have similar issues to homeschooling but saying that this topic is “beyond the scope” of her article. Still, she concludes her article by stating that “to the degree public schools are seriously deficient, our society should work on improving them, rather than simply allowing some parents to escape.”

The government should work to improve its own schools, where academic deficiencies and abuse are pervasive. But it should have no role in deciding whether or not parents are allowed to escape.

Some advocates of homeschooling regulation suggest that requiring regular standardized testing of homeschoolers would be a reasonable compromise. In her law review article, Professor Bartholet recommends: “Testing of homeschoolers on a regular basis, at least annually, to assess educational progress, with tests selected and administered by public school authorities; permission to continue homeschooling conditioned on adequate performance, with low scores triggering an order to enroll in school.”

During Monday’s debate, I asked the question: By whose standard are we judging homeschoolers’ academic performance? Is it by the standard of the government schools, where so many children are failing to meet the very academic standards the government has created? I pointed out that many parents choose homeschooling because they disapprove of the standards set by government schools. For example, in recent years schools have pushed literacy expectations onto younger and younger children, with kindergartners now expected to read. Many children who fail to meet this arbitrary standard are labeled with a reading deficiency when it may simply be that they are not yet developmentally ready to read.

Indeed, as The New York Times reported in 2015: “Once mainly concentrated among religious families as well as parents who wanted to release their children from the strictures of traditional classrooms, home schooling is now attracting parents who want to escape the testing and curriculums that have come along with the Common Core, new academic standards that have been adopted by more than 40 states.”

A key benefit of homeschooling is avoiding standardization in learning and allowing for a much more individualized education. And it seems to be working. Most of the research on homeschooling families conducted over the past several decades, including a recent literature review by Dr. Lindsey Burke of the Heritage Foundation, finds positive academic outcomes of homeschooling children.

There are very few movements today that bring together such a diverse group of people as homeschooling does. Families of all political persuasions, from all corners of the country, reflecting many different races, ethnicities, classes, cultures, values, and ideologies, and representing a multitude of different learning philosophies and approaches choose homeschooling for the educational freedom and flexibility it provides. Homeschoolers may not agree on much, but preserving the freedom to raise and educate their children as they choose is a unifying priority. In times of division, homeschoolers offer hope and optimism that liberty will prevail.

Reprinted from FEE.

Democrats & Jim Crow: A Century of Racist History the Democratic Party Prefers You’d Forget


In the last presidential election, Donald Trump was lauded for his performance among black voters – he scored 4 percent of black female voters and a whopping 13 percent of black male voters, the highest since Richard Nixon. This isn’t shocking. Black voters have voted en masse for the Democratic Party since the mid-60s and the passage of the 1964 Civil Rights Act, the Voting Rights Act, and the social welfare programs of the Great Society. These solidified black voters behind the Democratic Party, but they had been moving there since the New Deal.

However, it’s a historical anomaly in the United States. The traditional home of the black voter was the Republican Party, due to its historical role in ending slavery and introducing Reconstruction Acts and Amendments to the Constitution. It also did not help that the Democratic Party was the party of Jim Crow, a system of legally enforced segregation present throughout the American South in the aftermath of the Civil War.

What Do We Mean When We Say “Jim Crow?”

Before delving further into the topic, it is important to define precisely what we mean by Jim Crow and why it is a distinct form of legal code in United States history. While Northern and Western cities were by no means integrated, their segregation was de facto, not de jure. In many cases, the discrimination in the North was a discrimination of custom and preference, discrimination that could not be removed without highly intrusive government action ensuring equality of outcome. Northerners and Westerners were not required to discriminate, but nor were they forbidden from doing so.

Compare this to the series of laws in the American South known for mandating segregation at everything from public schools to water fountains.

No one is entirely sure where the term “Jim Crow” came from, but it’s suspected that it comes from an old minstrel show song and dance routine called “Jump Jim Crow.” Curiously, the first political application of the term “Jim Crow” was applied to the white populist supporters of President Andrew Jackson. The history of the Jim Crow phenomenon we are discussing here goes back to the end of Reconstruction in the United States.

The Reconstruction Era

Briefly, Reconstruction was the means by which the federal government reasserted control over the Southern states that had previously seceded to form the Confederate States of America. This involved military occupation and the disenfranchisement of the bulk of the white population of the states. The results of the Reconstruction Era were mixed. Ultimately, Reconstruction ended as part of a bargain to put President Rutherford B. Hayes into the White House after the 1876 election. The lasting results of Reconstruction are best enumerated for our purposes as the Reconstruction Amendments:

  • The 13th Amendment abolished involuntary servitude except as punishment for a crime. It was once voted down and passed only through extensive political maneuvering on the part of President Abraham Lincoln himself and the approval of dubious Reconstruction state governments in the South. It became law in December 1865.
  • The 14th Amendment includes a number of provisions often assumed to be part of the Bill of Rights, such as the Equal Protection Clause and the Due Process Clause, which are, in fact, later innovations. Advocates of birthright citizenship claim that its Constitutional justification can be found in this sprawling Amendment, which also includes provisions barring former Confederate officials from office and addressing Confederate war debts. This Amendment became law in July 1868.
  • The 15th Amendment prevents discrimination against voters on the basis of race or skin color. It was quickly circumvented by a number of laws discriminating against all voters on the basis of income (poll taxes) or education (literacy tests). The Southern states eventually figured out how to prevent black citizens from voting while allowing white ones to do so through grandfather clauses.

The Reconstruction Amendments were the first amendments to the Constitution passed in almost 60 years, and represented a significant expansion of federal power.

Perhaps the most important thing to know about the Reconstruction Amendments is that they were largely ineffective. Ranking public officials of the Confederacy were elected to federal government, blacks were disenfranchised as quickly as they were elected to the Senate, and Jim Crow, an entire system of legal discrimination, was erected to return black Americans to their subservient status. With the exception of citizenship for blacks and an end to involuntary servitude, the substance of the rest of the Amendments was largely discarded.

Black Disenfranchisement as a Prologue to Jim Crow

The process of black disenfranchisement at the end of the war is important historical context for understanding the rise of Jim Crow. It’s impossible to discuss this period without discussing the role of the Ku Klux Klan and other Democratic Party-allied white supremacist terrorist organizations. You can read more about this in our lengthy and exhaustive history of American militias and paramilitary organizations.

The first attempt to roll back the gains black Americans had made thanks to the Reconstruction Amendments was a poll tax introduced by Georgia in 1877. However, the legal rollback of voting rights in particular did not really ramp up until the turn of the century, when Republicans ran on joint tickets with the insurgent People’s Party, also known as the Populist Party. This threat to entrenched Democratic Party political power (and all of the patronage that came with it), while certainly related to the racial question, was arguably a bigger motivator than race. In many cases, such as with poll taxes, there were explicit attempts to exclude white voters sympathetic to the Republican cause alongside black voters. The specter of unity between poor blacks and poor whites loomed large.

Mississippi drafted a new constitution in 1890, which required payment of a poll tax as well as the passing of a literacy test as qualifications to vote. This passed Constitutional muster in 1898, with Williams v. Mississippi. Other Southern states quickly drafted new constitutions modeled on that of Mississippi, an approach known as “the Mississippi Plan.” By 1908, every Southern state had either drafted a new constitution or passed a suffrage amendment to better craft the state electorate to its liking. In 1903, Giles v. Harris strengthened federal court support of such laws.

Another method of maintaining control was the white primary. In 1923, Texas became the first state to establish primary voting for whites only. This was quickly deemed unconstitutional, so the state simply drafted a new law saying that the Democratic Party could determine its own voters for the primary. The state party quickly moved to exclude all non-white voters, which was entirely legal and Constitutional, because the Democratic Party was a private organization.

This caught the eye of some Congressmen. By 1900, there was discussion of stripping Southern states of some of their Congressional representation in accordance with provisions contained within the Reconstruction Amendments. Not only was the “Solid South” a large voting bloc, due to the one-party nature of many Southern elections, but they were also in charge of a goodly number of committee chairs, meaning that any attempt to strip Southern states of seats was probably going to go precisely nowhere.

Reliable statistics from the era are few and far between, but historians believe that somewhere between one and five percent of eligible black voters were registered by the late 1930s. Very few of these actually voted in general elections, which were a foregone conclusion. In many states, prior to the first and second Great Migrations, the black population approached 50 percent.

Several border states, including Delaware, Maryland, West Virginia, and Kentucky, attempted to pass legislation similar to “the Mississippi Plan,” but failed to do so.

Redeemer Governments and the Election of 1876

The disenfranchisement of black Americans in the South was the political precursor for Jim Crow. In addition to legal disenfranchisement, there were also paramilitary actions against both black Americans and Republican voters and candidates. It was not uncommon for Democratic Party-allied paramilitary groups to simply force the Republican candidate or even office holder out of town. Voter fraud was also a tool. As elections became closer, violence against blacks and Republicans increased to keep them away from the polls.

These Southern governments are known collectively as the Redeemer governments. They ruled over most of the South from 1870 until 1910. As we discuss in our history of militias in the United States, the white, pro-Democratic Party militias of the South were largely obsolete by the end of the 19th Century – Democratic Party state governments were doing their jobs for them.

All of this was facilitated by the Compromise of 1877, or the Corrupt Bargain of 1877, depending on one’s point of view. In exchange for Southern Democrats allowing the certification of electoral votes for Republican candidate Rutherford B. Hayes, Hayes agreed to:

  • Remove federal troops from Florida, South Carolina, and Louisiana, the final states where they remained. Hayes had campaigned on doing this prior to the bargain.
  • Appoint one or more Southerners to his Cabinet. This was fulfilled by appointing David M. Key of Tennessee as Postmaster General.
  • Support a transcontinental railroad passing through the South, using the Texas and Pacific line.
  • Support legislation to industrialize the Southern economy.
  • Keep Northern hands off the South when it came to racial questions.

The first two are often emphasized; however, they are probably the least important parts of the compromise. As stated above, Hayes had already planned to withdraw the remaining troops from the South. The cabinet appointment of Postmaster General was certainly a bigger deal then than it would be today, when far fewer people rely upon the mail; the post has not even been a cabinet-level position since the 1970s. The next two provisions were arguably beneficial to everyone in the South, black or white, and in any event were never enacted.

It’s also worth noting that the Compromise was seen as a way to avoid a potential new wave of bloodshed. At the time, it was widely feared that American politics were going to go the way of Mexico – meaning military strongmen and state violence would resolve closely contested elections. In this context, the Compromise is rather shrewd as Hayes gave up very little with regard to the first two provisions and never enacted points three and four.

The final provision, however, is the one that makes Jim Crow possible. This makes it, historically speaking, perhaps the most significant of the Compromise provisions.

The Democratic Party Coup d’Etat in Wilmington, North Carolina

The end of Reconstruction and the subsequent disenfranchisement of blacks and poor whites by Southern Democratic state governments were not entirely without resistance. However, this resistance was met with sharp and swift reprisal. For example, in November 1898, when the brother of a Republican candidate tried to collect affidavits from black voters attesting that they were being prevented from voting, he was savagely beaten by cronies of the local Democratic Party leader. Four days of rioting followed, leaving 13 blacks and at least one white dead and hundreds injured.

The same month and year in Wilmington, North Carolina, there was an orgy of violence and an effective coup d’etat against the duly elected Fusionist government (blacks represented by Republicans and whites represented by Populists) of the city. Here the Democratic Party explicitly called themselves “The White Man’s Party,” forcing whites to join political and labor organizations. People were literally marched out of their homes in the middle of the night and forced to sign membership forms under threat of death.

Following a speech by former Democratic Party Congressman Alfred Moore Waddell, Red Shirts in attendance left the convention hall and began terrorizing black citizens. The eventual election was rife with fraud. The local black newspaper, The Daily Record, was burned to the ground, along with many others around the state. Waddell led a group to confront the Republican mayor of the town, forcing him and the entire city council to resign at gunpoint. The new city council was installed and elected Waddell as mayor.

The organizer of the coup, Charles Aycock, became the 50th Governor of North Carolina as a Democrat. Other participants became the first female Senator (Rebecca Felton), Secretary of the Navy (Josephus Daniels), a state Senator and U.S. Congressman (John Bellamy), a state Senator and Governor of North Carolina (Robert Glenn), House Majority Leader and Ways and Means Committee Chair (Claude Kitchin), Congressman and Governor of North Carolina (W.W. Kitchin), another Governor of North Carolina (Cameron Morrison) and a Lieutenant Governor (Francis Winston).

The disenfranchisement leading up to Jim Crow went deeper than simply stripping the right to vote. It also included removing black citizens from juries and preventing them from being eligible to run for public office.

What Were the Jim Crow Laws?

Jim Crow laws, for the most part, are relatively simple. Most states (over 30), including those outside of the South, had laws against interracial marriage, known as anti-miscegenation laws. The last of these were overturned by the Supreme Court in 1967, in the Loving v. Virginia case. Some states went further, such as Florida, which banned both interracial dating and cohabitation. Schools, restaurants, theaters and cinemas, hotels, and train stations were commonly separated by law, and in some states, such as Georgia, even baseball teams and prisons were segregated by law. Mississippi criminalized anti-Jim Crow propaganda. North Carolina banned the sharing of books between black and white schools.

The Supreme Court Upholds Jim Crow in Plessy v. Ferguson

The landmark Supreme Court case that upheld Jim Crow was Plessy v. Ferguson. The law in question was an 1890 Louisiana law requiring separate train cars for black and white passengers. Homer Plessy, a mixed-race Louisianan (an “octoroon” – ⅞ white and ⅛ black) and registered Republican (as most Southern blacks were at the time), deliberately forced the issue as a test case.

He bought a first-class ticket and boarded the white train car. The train company knew about this in advance. They opposed the law on the grounds that it would require them to effectively double the number of train cars without a corresponding increase in passengers. The committee looking to challenge the law hired a private detective with arrest powers to ensure that Plessy would be arrested for violating the train car segregation law, and not vagrancy or something else.

Plessy and his backers lost every decision on the way up to the Supreme Court, which upheld earlier court rulings. This led to the doctrine of “separate but equal.” The legal theory was that public accommodations must be provided to all citizens, but they could be separate, provided that they were of equal quality. The argument among critics was that the separate facilities were never equal.

This was a turning point in the history of American jurisprudence. The Supreme Court made Jim Crow the law of the land in any state that wished to do so. It is important to again reiterate what Jim Crow was and was not. Jim Crow was a state policy demanding segregation from private business. It was not a law allowing for businesses to discriminate if they wished to. In this respect, it was like a continuation of the slave patrols of old, whereby private citizens were drafted into the militia to hunt for runaway slaves. As we can see above, the private business in question did not want to enforce the law, if only for purely economic reasons, but was forced to by the state Democratic Party apparatus.

A lesser known phenomenon of the Jim Crow Era is the Sundown Town. These were towns where black Americans were simply not allowed to be out after dark. For the most part, such ordinances were struck down once courts ruled that laws barring black residents from a town, or even from areas of a town, did not pass Constitutional muster. Some municipalities got around this through city planning decisions and the practices of real estate professionals.

Democrat Woodrow Wilson: The Segregation President

Democrat Woodrow Wilson was arguably one of the most transformational presidents in American history. Not only did he pioneer the aggressive interventionist foreign policy that later came to characterize American international relations, he was also the segregation president. While he was previously the Governor of New Jersey, his personal background was in the American South – Virginia to be exact.

Wilson basically walked into the White House during the 1912 election. The Republican Party was split between supporters of President William Taft and former President Theodore Roosevelt, who ran on his own Progressive Party ticket. Taft all but bowed out of the race. Historians have argued that he was more interested in his eventual job as Chief Justice of the Supreme Court than he was in being president. Wilson won 40 states, while the incumbent president won two. TR carried the balance. The 1916 election was much closer in the popular vote, with Wilson eking out a victory in what was effectively a referendum on whether or not the United States would enter the Great War on the side of Great Britain.

Ironically, Wilson was the dove in the race.

(Once in power, Wilson’s dovish outlook led him to change American foreign policy. He argued against using self-interest as a basis for foreign alliances and instead said foreign policy should be based upon collective security and moral judgement. He led the creation of the League of Nations, the precursor to the United Nations.)

Wilson was the first Southerner to occupy the White House since fellow Virginian Zachary Taylor in 1848. While some have argued that the so-called “Nadir of Race Relations” in America was either earlier or later than Wilson, it’s hard to not see the Wilson Administration as a turning point and consolidation of segregation on a national scale.

Segregation at the federal level became the law of the land under Wilson. While Wilson rejected proposals to segregate every department, he allowed cabinet heads to segregate their own departments as they wished. The military was segregated, with all-black units serving underneath white officers at equal pay. The First Great Migration led to a number of race riots, and the Wilson Administration, on the advice of his attorney general, did not intervene.

It’s worth noting that Wilson personally shared the “Lost Cause” view of the War Between the States. He also saw the Redeemer governments as an understandable response to radical Republican rule. Neither of these views are particularly controversial nor, in and of themselves, racist. However, they are worth bringing up due to Wilson’s role as the most segregationist president since the end of the Civil War.

The Wilson Era was also the period of the resurgence of the KKK and the popularity of the film The Birth of a Nation. Wilson, for his part, denounced both of these. They’re mentioned here only to paint a picture of what race relations were like in America at this time.

The summer after Wilson left the White House was known as “Red Summer,” due to the number of race riots. Over 165 people were killed. All told, there were 39 race riots throughout the United States between February and October 1919. The African Blood Brotherhood was formed for the sake of black self-defense during this period, but was quickly taken over by the nascent Communist Party.

Jim Crow Begins to Fall Apart

After the Second World War, however, the system of segregation began to show cracks. Northern liberals in the Democratic Party, such as Hubert Humphrey, were unwilling to continue to look the other way. President Harry S. Truman, a Democrat from the border state of Missouri, made inroads against segregation. He appointed the President’s Commission on Civil Rights, which drafted a 10-point plan to advance civil rights.

Part of the change in racial relations had to do with the service of black Americans during the war. Truman himself said, “My forebears were Confederates…but my very stomach turned over when I learned that Negro soldiers, just back from overseas, were being dumped out of Army trucks in Mississippi and beaten.” Truman aggressively pursued a policy of integration and fairness in the military and federal employment, banning racial discrimination in the civil service.

There was, of course, a predictable backlash. Southern Democrats were unhappy with Truman’s actions on civil rights. They split from the party in 1948, nominating then-Democrat Strom Thurmond on the States’ Rights Democratic Party ticket. They even managed to secure his place on the main Democratic Party line in four states, all of which Thurmond won. The only non-Confederate states where Thurmond appeared on the ballot were Kentucky, Maryland, California and North Dakota.

Curiously, Thurmond was considered something of a progressive during his tenure as Governor of South Carolina. He personally fought for the arrest of the men who lynched Willie Earle. While no one was ever convicted for the crime, Thurmond was personally thanked for his efforts by the NAACP and the ACLU.

The purpose of the Thurmond campaign was not to win, but rather to force the election into the House of Representatives where, it was thought, the Dixiecrats could force concessions from either President Truman or New York Governor Thomas Dewey.

Thurmond fought a vigorous campaign, challenging President Truman to a debate on the civil rights question. He claimed until the day he died that the impetus for his campaign against civil rights was not based on racism, but on opposition to Communism. After the campaign, the original plan was to continue the States’ Rights Democratic Party as an alternative to the regular Democratic Party, but it fell apart when Thurmond realized he could not fully abandon the party.

Jim Crow began to fall apart completely in 1954, with the landmark Supreme Court decision Brown v. Board of Education of Topeka. At the time of the ruling, 20 states forbade racial segregation in public schools, with another three making it optional or limited. Only in the old Confederacy, as well as Oklahoma, Missouri, Kentucky, West Virginia, Maryland, and Delaware, was it mandatory.

This was not the end of the story. Democratic Party state governments and politicians, led by Virginia’s Sen. Harry Byrd, coordinated a campaign of “massive resistance.” In some municipalities, public schools were shuttered entirely rather than allowing black children to attend school there. This led to the rise of the segregation academy, private schools exclusively for white students, which continued legally until 1976.

A series of bills passed between 1957 and 1965 meant that segregation was all but over, except for the shouting. The first, the Civil Rights Act of 1957, was shepherded to passage by future President Lyndon Baines Johnson, at that time the most powerful figure in the United States Senate. Strom Thurmond tried to block it with the longest one-man filibuster in history – 24 hours and 18 minutes. The bill sought to secure voting rights but was largely ineffective.

The Civil Rights Act of 1960 followed, further seeking to strengthen minority voting rights in the Southern states, as well as to reinforce desegregation. The Civil Rights Act of 1964 and the Voting Rights Act of 1965, along with the Brown v. Board decision, can be seen as the effective end of Jim Crow legislatively.

George Wallace bolted from the Democratic Party and ran for the presidency in 1968, explicitly to preserve segregation. This was after famously proclaiming “segregation now, segregation tomorrow, segregation forever,” in his 1963 inauguration address after he’d been elected Governor of Alabama.

It is worth noting that Wallace’s own support for segregation might have been entirely opportunistic. He initially ran for Governor of Alabama as a pan-racial populist. After losing, he declared, “I will never be outn*ggered again” and later reflected that, “You know, I tried to talk about good roads and good schools and all these things that have been part of my career, and nobody listened. And then I began talking about n*ggers, and they stomped the floor.” Wallace, like Thurmond before him, sought to throw the presidential election to the House.

Wallace had previously run for President in 1964 as a Democrat (as he did again in 1972), but this time he ran as the standard bearer of the American Independent Party, on a law-and-order and pro-segregation platform. His unofficial slogan was “there’s not a dime’s worth of difference between the two major parties.”

Wallace was arguably far more successful than Thurmond. First, he got 46 electoral votes to Thurmond’s 39, making him the last third-party candidate to get electoral votes. He also got nearly 10 million votes (9.9 million) compared to Thurmond’s 1.1 million. For Thurmond, this was a scant 2.4 percent of the vote, compared to Wallace’s 13.5 percent. While Wallace ran first in the Confederacy with 45 percent of the vote, he succeeded where Thurmond did not by making a direct appeal to Northern working-class voters. Fully one in three AFL-CIO members supported Wallace in the summer of ’68. In September of 1968, Wallace was the preferred candidate of Chicago steelworkers, taking 44 percent of their support.

Soon-to-be President Richard Nixon, however, with the help of a young Patrick Buchanan, turned the tide. Nixon argued that a vote for Wallace was ultimately a vote for Humphrey. He also hammered away at Alabama being a right-to-work state. While it is often argued that Nixon pursued a “Southern strategy,” this is patently false. Nixon’s strategy was twofold: he sought to win on the basis of massive turnout from both Northern Catholics and Southern Protestants, the latter category including Southern black voters.

While some argue that de facto segregation still exists, it is not debatable that de jure segregation was ended in the 1950s and 60s. This was done through an alliance of liberal Democrats and Republicans. Debate does continue, however, as to whether some of the legislation championed by Lyndon B. Johnson was merely opportunistic in order to secure the black vote for Democrats for years to come. While it’s hard to say what Johnson’s motives were, Johnson’s strategy has paid off – black Americans continue to vote en masse for Democrats who have conveniently scrubbed major portions of their party’s history from the record books.

“Democrats & Jim Crow: A Century of Racist History the Democratic Party Prefers You’d Forget” originally appeared in Ammo.com’s Resistance Library.

If I Were a Racist…

Protests across the nation following the murder of George Floyd have inspired discussions beyond just police brutality, shining a spotlight on issues like “social justice” and “systemic racism.”

But the divisive rhetoric on racism serves to distract from the statism.

If I were a racist, I would support policies that negatively impact minorities. Anything that winds up making their lives and socioeconomic condition worse off would get my approval. On that score, big government could serve as a shining example with a record of harming minorities any racist would envy.

The Welfare State

For starters, if I were a racist I would look at the results of the huge expansion of the welfare state with glee. It’s rumored that President Lyndon Johnson bragged, “I’ll have those niggers voting Democratic for the next 200 years” as a result of his passage not only of civil rights legislation but his massive ratcheting up of the welfare state known as the “Great Society.”

Whether or not Johnson uttered those actual words is immaterial; they were consistent with his racist tendencies. More important is that the results have reflected the sentiment behind the alleged quote: a growing dependency of the black community on government programs, leading to a devastating destruction of the black family and, in turn, a deepening cycle of poverty.

Poor and dependent people will reliably vote for the party promising to continue and increase the flow of benefits.
Welfare programs championed by Johnson and progressives break up families by replacing a father’s paycheck with a government check and benefits. Nationally, since LBJ’s Great Society ratcheted up government welfare programs in the mid-1960s, the rate of unmarried births has tripled.

This effect has been especially acute in black families, as more than 70 percent of all black children today are born to an unmarried mother, a three-fold increase.

According to 2017 American Community Survey data produced by the U.S. Census Bureau, only 5.3% of families with a married couple live in poverty nationally, compared to 28.8% of households with a “female householder, no husband present.”
In other words, single mother households are five times as likely to be in poverty compared to households with both parents. Largely as a result of the breakdown of the black family, 20 percent of blacks live in poverty, more than twice the rate of whites (8%).

As economist Thomas Sowell once wrote, “The black family survived centuries of slavery and generations of Jim Crow, but it has disintegrated in the wake of the liberals’ expansion of the welfare state.”

Indeed, if a group of racist Klan members had conspired to develop a plan to impoverish black households, they could not have done much better than the exploding welfare state.

The Minimum Wage

If I were a racist, I would want to see to it that young black people coming from broken, low-income homes have a harder time entering the workforce, making it more difficult to escape poverty.

The minimum wage accomplishes that.

The economic lesson is obvious: artificially raising the wage employers must pay reduces the demand for low-skilled workers, even as it draws more prospective workers to compete for the remaining positions. Low-skilled labor is priced out of the workforce as a result.

History has shown that black teenagers are hit the hardest by minimum wage hikes.

Research by Sowell underscores this point: “Unemployment among 16 and 17-year-old black males was no higher than among white males of the same age in 1948. It was only after a series of minimum wage escalations began that black male teenage unemployment rates not only skyrocketed but became more than double the unemployment rates among white male teenagers.”

Indeed, there is ample research showing that the minimum wage’s origin was inspired by racism. Such historic facts led economist Walter E. Williams to label the minimum wage “one of the most effective tools in the arsenal of racists everywhere in the world.”

Putting the first rung of the career ladder out of reach to young blacks is a great way to frustrate them and push them towards either a life of crime or government dependency. Far too many end up hopeless in prison or in the ghetto—right where racists want them.

Gun Control

Having seen to it that more blacks are stuck in a cycle of government dependency and hopelessness, and packed into close quarters in inner cities, I’d be pretty confident that those inner cities would have high rates of violent crime.

So if I were a racist, I’d want to take away the right to legally defend oneself by imposing strict gun control laws. This way, the honest citizens living in the violent inner cities would have no way to defend themselves against the criminals.

As Maj Toure of Black Guns Matter says, “All gun control is racist.”

Research on the history of gun control laws strongly suggests racist motives compelling these restrictions for hundreds of years. According to the website firearmsandliberty.com, “The historical record provides compelling evidence that racism underlies gun control laws—and not in any subtle way. Throughout much of American history, gun control was openly stated as a method for keeping blacks and Hispanics ‘in their place,’ and to quiet the racial fears of whites.”

One of the top priorities of the Ku Klux Klan after the Civil War was to enact laws barring gun ownership by the freedmen, making it all the easier to terrorize them.

Today, however, there’s no need to put on a white hood and lynch anybody, just see to it that blacks are defenseless and let the criminals handle the rest.

School Choice

If I were a racist, I’d want to block any attempt to make better educational opportunities available for minorities. The government indoctrination centers known as public schools are not only systemically incapable of providing high quality education for children, they have especially failed minority kids.

As Walter E. Williams has written, “the average black 12th-grader has the academic achievement level of the average white seventh- or eighth-grader. In some cities, there’s an even larger achievement gap.”

The ultimate goal, of course, is to separate school and state (and eliminate the state altogether). But short of that, we need to shift more control over educational choices out of the hands of politicians and bureaucrats and into the hands of parents and families.

Such policies are highly popular among minority families. Indeed, a 2018 national survey by Education Next found that Hispanic (62%) and black (56%) respondents expressed far higher support for school choice initiatives targeted to low-income families than whites (35%).

Results like this suggest that low-income, minority families recognize the status quo is not working, and they are craving policies that would enable them to access other educational options.

Those opposing policies that would provide minorities such options may not be motivated by racism, but one would be hard pressed to say how their actions would be different if they were.

War on Drugs

If I were a racist, I would no doubt enjoy the results of the government’s failed “war on drugs.” The war on drugs has put thousands of minorities in prison for crimes emerging from the government’s attempt to dictate to citizens what they can or cannot put in their own bodies. According to the Drug Policy Alliance, “Nearly 80% of people in federal prison and almost 60% of people in state prison for drug offenses are black or Latino.”

Moreover, the Drug Policy Alliance notes “2.7 million children are growing up in U.S. households in which one or more parents are incarcerated. Two-thirds of these parents are incarcerated for nonviolent offenses, including a substantial proportion who are incarcerated for drug law violations.” The drug war, like the war on poverty, is a major factor in fatherless homes in the black community.

The war on drugs has devastated minority communities, and its enforcement has greatly increased the number of confrontations between police and minorities, which in turn increases the opportunity for police brutality cases.


Big government has arguably been the biggest enemy of minorities. Indeed, an examination of the results of government control and intervention looks an awful lot like something racists would support.

To achieve more justice for minorities, we must stop pitting white against black to divide us and instead focus our energy on dismantling the state.

Bradley Thomas is creator of the website Erasethestate.com and is a libertarian activist who enjoys researching and writing on the freedom philosophy and Austrian economics. Follow him on Twitter, @erasestate.

The Irredeemable Racism of the Death Penalty

When confronted with overwhelming evidence of a discriminatory state practice, a decent society responds in one of two ways: by trying to remove discrimination from the practice, or by scrapping the practice altogether. In the context of capital punishment, the Supreme Court has opted affirmatively for the former course of action. In 1987, the Court in McCleskey v. Kemp expressed its hope and conviction that, even without a wholesale abolition of capital punishment, any troubling racism in executions was destined to end through Court-facilitated adjustments to the ultimate punishment.

Nearly 35 years later, that conviction has proved unfounded. As we maneuver our way through a political moment pregnant with possibility—in which the foundations of our criminal justice system are under heightened scrutiny, and in which “abolitionists” debate “reformists” about the best path forward—we should be mindful of what the results of our national experiment with the death penalty suggest. As long as it retains tremendous power, the government will be tremendously dangerous. If government officials are in a position to discriminate in life-or-death situations, Americans will continue to die because of discrimination. If our history with the death penalty is any indication, successfully taming the governmental beast cannot mean simply regulating (that is, making regular) the government’s exercise of all of its awesome powers. Instead, it must mean taking many of those powers away from the government outright.

Efforts on the high court to excise racism from the administration of the death penalty date back to 1963 (at the latest). In Rudolph v. Alabama, Frank Lee Rudolph, a black man in Alabama, petitioned the Supreme Court for review of his death sentence for raping a white woman. Although the Court ultimately declined to hear the case, three (outnumbered) justices argued that executions for rape raise important constitutional questions and that the Court, therefore, had good reason to weigh in. We now know that Justice Arthur Goldberg, who authored this dissenting opinion, was largely concerned about the death penalty’s disproportionate impact on black men convicted of raping white women. It was only at the insistence of Chief Justice Earl Warren—who apparently felt it necessary for the Court to sidestep the charged issue of black crime—that Goldberg did not mention race in his dissent to the denial of review of Rudolph’s case.

Those seeking to circumscribe the racialized system of capital punishment by ending executions for rape (of grown women) got their victory in 1977, when the Court ruled in Coker v. Georgia that executing people for rape was so disproportionate as to violate the 8th Amendment. Given the sordid history of the death penalty for rape as a mechanism of racial terrorism in the United States, this was a remarkable achievement. Even so, capital punishment (for other crimes) stayed in place, as did the plague of racism that infected it.

A decade after Coker, the Court addressed the racial issue head-on in McCleskey, where the majority suggested that it was possible to administer the death penalty in a sufficiently race-neutral way. Crucial to the majority opinion was the Court’s “Batson Doctrine,” named for a then-recent case in which the Court made it more difficult for prosecutors to strike potential jurors on racial grounds. The availability of Batson-based relief, the Court suggested, minimized the odds of unfair capital trials, thereby casting doubt on death penalty abolitionists’ contention that the death penalty was irredeemably racist in its application.

As Carol and Jordan Steiker have pointed out, the Court in McCleskey overstated its case. To be sure, Batson has made it easier to thwart prosecutorial attempts to strike jurors because of their race. However, the fact that prosecutorial teams can almost effortlessly concoct and claim benign, non-racial reasons for striking potential jurors of color means that many race-based peremptory challenges probably go unpunished. In any event, Batson failed to address other ways that racial prejudice can surface in death penalty cases. For example, Batson did nothing to remove prosecutors’ vast discretion over whether to seek the death penalty in the first place. Insofar as their implicit racial biases affect prosecutors’ assessment of the heinousness of various crimes, the exercise of this discretion can affect who lives and who dies.

The failure of the Court’s efforts to cleanse the death penalty of its racism is apparent in our own time. In 2016, the State of Georgia executed Kenneth Fults after one of the jurors in Fults’s case claimed post-trial that the “nigger got just what should have happened,” no matter if Fults “ever killed anybody.” Similarly shocking, after Andre Thomas, an African-American man, was found guilty of killing his “white estranged wife” and their child, he was sentenced to death by exclusively white jurors—three of whom claimed to disfavor interracial romance. (The state’s conduct is even more shocking in light of the fact that Thomas is so mentally ill as to have removed and eaten his own eye.)

Those who believe that the death penalty has no appreciable race problem may consider these sorts of cases simple aberrations in a system otherwise designed to withstand attempts at racist infiltration. But the reality is that as long as powerful actors exercise (an inevitable) discretion over the death penalty’s application—by deciding whether to seek the death penalty, whether to grant clemency, and how to weigh mitigating factors in defendants’ individual cases—discrimination is likely to rear its ugly head in the penalty’s administration.

The clear lesson for those seeking to address abuses of American state power in other contexts is to eschew the utopianism, either sincere or feigned, that the Court has embraced in its retention of the death penalty. Reasonable people can disagree about the propriety of hard drug laws, the deployment of armed police officers in response to 911 calls, the placement of officers in public schools, and—for that matter—the death penalty. However, those who would embrace an extensive state presence in people’s lives should not be allowed to claim that the mammoth state—through diversity and sensitivity training, for example, or through peremptory challenge reforms—can be made nondiscriminatory. Simply put, efforts to rid a human institution of the apparently ineradicable human vulnerability to prejudice are doomed to failure. The only way to stop the state from abusing its power is to eliminate the power that the state would abuse.

Tommy Raskin is pursuing a J.D. at Harvard Law School. Readers are encouraged to research “Courting Death: The Supreme Court and Capital Punishment” by Jordan and Carol Steiker for more information.

Getting the Police Issue Right

Now that folks are coming around on the idea that law enforcement needs serious structural transformation in this country, let’s make our argument a little more robust.

Only the tiniest fraction of people are ever killed by police. It is not useful to think of this problem as one in which there is any real likelihood of being gunned down, at least from the perspective of intellectual integrity (whether it is useful for the masses to see it that way is another question).

The problem with the term “police brutality” is that it has multiple meanings. From a police officer’s perspective, something that you regard as police brutality is, in fact, just them doing their job. They believe they are doing the right thing, the best they can. And they probably carry around a moral justification not dissimilar to yours. I am not talking about instances that even other LEOs would regard as excessive force. It is far less likely for law enforcement to act with impunity than it is for them to do what they think is right.

The most egregious issue is mass incarceration. We have by far the largest prison population in the world, any way you slice it: in absolute numbers, per capita, and so on. This comes from over-policing and an emphasis on the enforcement of prohibition.

Among ways of slicing the demographics, the most vulnerable to over-policing and mass incarceration are the poor.

So how does race come into play? Well, roughly a third of the prison population is black, close to triple the black share of the general population. So notwithstanding socio-economic class as the most accurate predictor of vulnerability to over-policing, it is not unreasonable to view this issue as one of race. Particularly since, historically, race has been a central way of slicing demographics (the civil rights movement is, after all, only a few decades old). In America today, the poor are not a community. Black people and African-Americans largely see themselves as one. To add to that, there is clear evidence suggesting a cyclical relationship between over-policing and further impoverishment, and there are numerous other factors that suggest a particular causal relationship between fitting a certain profile (namely: being black) and being a target of over-policing.

Therefore, it is fitting that the loudest voices are the communities (actual, not theoretical) most impacted by the most egregious issue.

The jury is out on whether the relatively few cases of needless killings by police officers will prove the most effective primary motivation for political change in this area. At the moment, however, they are pointing toward the optimal solution: end over-policing by getting most police off the streets.

Finally, the police are only Sauron’s physical form. The laws criminalizing poverty are the Ring of Power. They must be thrown into Mount Doom.

The ‘Thorny Question’ of Public Property

In a recent Mises Wire article, Jeff Deist commented on the squatting of Capitol Hill in Seattle. Contrasting Hans-Hermann Hoppe’s and Walter Block’s respective libertarian approaches to public property, Deist asked if the residents of the ‘CHAZ’ (Capitol Hill Autonomous Zone) are illegal squatters or homesteaders, and concluded that it is a “thorny question.”

Walter Block then responded in a rejoinder on the Power & Market blog elaborating on his view that public property is open for homesteading because, while stolen, it is, in effect, unowned.

The problem here is that both arguments introduce ambiguity and, in fact, shy away from the core issue: private property rights and, specifically, what is owed when something was stolen. Let’s first summarize the arguments.

The Arguments

Hoppe’s argument is summarized by Deist as “the streets of Seattle are not virgin territory available to homesteaders, but rather akin to land held in trust by (admittedly unworthy) state agents on behalf of taxpayers.” This is the same argument Hoppe has famously used in defense of government-imposed restrictions on immigration, since open immigration ‘subsidizes’ foreign nationals by allowing them to take advantage of stolen property in the form of public roads, the welfare system, etc. (The same restriction does not seem to apply to out-migration by those who are net beneficiaries of the state.)

Block’s argument is instead based on the question of whether (non-criminal) homesteaders of public facilities would be legitimate owners. States Block: “It is difficult to see why not. After all, according to strict Rothbardianism, these amenities are not—cannot be—legitimately owned by a coercive government. If this is so, then they are unowned and therefore available for the taking by the next homesteader to come down the pike.”

There are problems in both cases, of course. I have previously addressed some of Hoppe’s argument as it applies to immigration, but will here focus on the problematic assumption that what is stolen from taxpayers is specific property and/or that what the perpetrator owes is specific. The perpetrator here, of course, is the state, and abstractions or aggregates do not have rights—only individuals do.

The Real Issue

The real issue here is not how one can (or wants to) conceive of the state with respect to a citizenry. As both Hoppe and Block would agree (I think), the state is not simply about theft. States inflict immense harm on people, their liberty, and their property. It is insufficient to consider the state only in terms of the property it has acquired (by rights violation). Its actual guilt, and thus the rightful claim by its victims, is the total scope of the abuses and rights violations it has perpetrated. And the real guilt lies, of course, with the acting individuals who perpetrated these acts while hiding behind their titles, positions, badges, uniforms, and the pretend-authority the state provides them with.

Now, libertarians already have an answer to how rights violations must be dealt with. The answer is that the perpetrator must make the victim whole again. To the extent specific property has been stolen, this property must be returned (along with damages and cost for enforcing the return). But if the loss is not of a specific (and unspoiled) item, libertarian principles demand restitution in the form of the value lost (or some reasonable estimation thereof, e.g. its market price). This is simple enough.

The “thorny issue” is who is the rightful owner of the streets, national parks, government buildings, etc. that are part of the “public property.”

What Is Owed?

Making the rightful owner whole is different for fungible property (such as money). Certainly, taxpayers have a right to get ‘their money’ back in the same sense as a property owner has the right to receive their property back. But just as one can pay one’s debt in full using completely different bills and coins than those one borrowed, taxpayers are not reasonably owed the exact bills and coins (if any) they paid to the IRS. What is lost is not the specific bills and coins but their function; the property here consists in the value the money represents.

It is similar in the mixed or in-between case of specific property that has been destroyed, spoiled, changed, or transformed. It is the function that is lost that one has a right to get back. If one night Anne steals trees from Ben’s private forest, makes planks and then builds a house—Ben does not have a right to (and probably doesn’t want) the actual matter stolen: the planks made from those trees or that section of the wall that they make up (plus the waste material discarded). No, Ben has a right to be made whole. That is, Anne must get Ben back to where he was before the crime. (Exactly what this means in any specific case is a matter of negotiation, market standards, etc.)

The same goes for public property. A government building was not made from bricks that were stolen from a specific mason; the windows were not stolen from a specific glazier. The building cannot be returned in pieces to the rightful owners to make them whole. And even if the nails were outright stolen, they cannot be pried out of the walls and returned to make the victim whole again. The same is true for the land on which the building was built. It was likely stolen long ago from (let’s assume) a legitimate owner. This land is in many cases not the same as it was then but has been changed in numerous ways. Perhaps it was farmland, but the fertile topsoil has been transported elsewhere to make room for the building’s foundation or basement. Returning the land is unlikely to make the previous owner whole. And good luck finding that topsoil again to return it to the owner of the land where there is now a building. (And even if that were possible, all of this assumes the actual owner is still alive or that whoever would be the rightful heir or present owner can be traced.)

The Solution

All of this suggests that the solution is recompense rather than the return of specific property to the victim(s). What makes this a “thorny issue” is that we rely on the (inapplicable) assumption that property stolen by the state should (and can) be returned to make the legitimate owner whole. This is more often than not an impossibility because the specific property in the form it was when stolen no longer exists—it has typically been mixed with other stolen specific property. And in all cases the state has wasted plenty of stolen money (taxes) to support this process, maintain control over the property, and keep others out.

The only reasonable conclusion is that in almost all cases the state is not a ‘steward’ of the specific property it stole, but rather an illegitimate occupier and controller of valuable assets that are well beyond what can be returned to rightful owners. In that sense Block is right: public property, because it is ‘owned’ by the State, is unowned and can therefore be justly homesteaded. Simultaneously, all those victimized by the individuals acting as the State have a right to be fully compensated for the harm inflicted on them.

Supreme Court Refuses To Reconsider Its Doctrine of ‘Qualified Immunity’ for Police

The U.S. Supreme Court today refused to hear eight separate cases that had presented opportunities to reconsider its doctrine of “qualified immunity.” That doctrine, created by the Supreme Court in 1982, holds that government officials can be held accountable for violating the Constitution only if they violate a “clearly established” constitutional rule. In practice, that means that government officials can only be held liable if a federal court of appeals or the U.S. Supreme Court has already held that someone violated the Constitution by engaging in precisely the same conduct under precisely the same circumstances.

“Qualified immunity means that government officials can get away with violating your rights as long as they violated them in a way nobody thought of before,” explained Institute for Justice (IJ) Attorney Anya Bidwell. “And that means that the most egregious abuses are frequently the ones for which no one can be held to account.”

Qualified immunity has come in for harsh criticism from the left and the right alike. And the outrageous facts of the cases rejected today help illustrate why: In them, lower courts had granted immunity to a group of officers who took an Idaho mom’s consent to “get inside” her home as consent to stand outside, bombarding it with tear-gas grenades; to Texas medical regulators who showed up at a doctor’s office and, without warning or a warrant, rifled through confidential patient files; and to a deputy sheriff who (while in pursuit of an unrelated, unarmed suspect) held a group of young children at gunpoint and then shot a ten-year-old in the leg while firing at a non-threatening family pet.

“Qualified immunity is a failure as a matter of policy, as a matter of law, and as a matter of basic morality,” said IJ Senior Attorney Robert McNamara, who was counsel of record in West v. Winfield, one of the cases denied review today. “It is past time for the Supreme Court to admit as much and start expecting government officials to follow the Constitution.”

The Court’s rejection of the petitions was not unanimous. Justice Clarence Thomas issued a dissent in the longest-pending petition, Baxter v. Bracey, calling for the Court to reevaluate the doctrine entirely: “I continue to have strong doubts about our §1983 qualified immunity doctrine,” Justice Thomas’s dissent concludes. “Given the importance of this question, I would grant the petition for certiorari.”

The drumbeat of voices calling for an end to qualified immunity and a return to basic government accountability has only grown louder in the wake of the killing of George Floyd by Minneapolis police officers. Articles in outlets ranging from USA Today to Fox News Channel to the New York Times editorial page all pointed to the slaying as a symptom of a broader culture of official impunity and called upon the Supreme Court to rethink its qualified immunity rules. Today’s decision means those cries will, at least for now, go unanswered.

“There is no shortage of outrageous qualified immunity cases for the Supreme Court to take,” said IJ Attorney Patrick Jaicomo. “It has refused to hear a case this year, but it can only avoid the issue for so long. The skewed incentives of qualified immunity guarantee that lower courts will continue to generate more examples of injustice, and we will keep bringing those examples back to the courthouse steps until we break through.”

The Institute for Justice, through its Project on Immunity and Accountability, actively litigates to remove barriers to meaningful enforcement of constitutional rights. Today’s decision denied review in one of IJ’s Immunity and Accountability cases, but a second, Brownback v. King, has already been granted review and will be heard by the justices next term. A third case, brought on behalf of a Colorado family whose home was destroyed by police in pursuit of a suspect who had no connection to them, will be considered later this month.

“The principle at stake is simple: If citizens must obey the law, then government officials must obey the Constitution,” concluded IJ President and General Counsel Scott Bullock. “The Constitution’s promises of freedom and individual rights are important only to the extent that they are actually enforced—and the Institute for Justice will work tirelessly to ensure that they are.”

John Kramer is Vice President for Communications at the Institute for Justice. This article was originally featured at the Institute for Justice and is republished with permission. 

Women: Reject Victimhood, Embrace Your Individualism

Women: Reject Victimhood, Embrace Your Individualism

As a teenager I lived on the streets for as short a period as I could manage. This one experience brought more violence into my life than I care to remember, let alone describe, but it did not define me. I mention the experience for one reason: it is not ignorance or a lack of empathy that makes me flatly state that sexual violence against women has declined over time. Almost everything is better for North American women now than it was decades ago.

Yet North America is widely decried as a “rape culture,” and the condemnation is passionate to the point of sounding almost religious. “Rape culture thrives in the dark,” Anthony Zenkus writes in Common Dreams. “It hates the light. It tries to tell victims and future victims that the dark is normal…It succeeds in hiding the sun, keeping the light away and changing the subject. It makes excuses for the dark. Boys will be boys.”

A “rape culture” is defined as one in which the dominant attitudes normalize or trivialize the sexual abuse of women. 1970s feminism coined the term to describe the allegedly pervasive threat of sexual violence by which men oppress women. For fifty years, women have been taught to be afraid of men and angry with them.

It is time for the fear and rage to stop. A social revolution has overturned sexual attitudes since then, and the results have been remarkable. According to the Rape, Abuse & Incest National Network—America’s largest anti-sexual violence organization—such attacks have fallen by more than half since 1993 even though the definition of sexual abuse has expanded. Yet, according to most voices in the media and politics, the danger to women never seems to diminish. Quite the contrary.

Campus rape is called “epidemic” and “getting worse under Betsy DeVos” despite colleges having been empty for months. The abuse of indigenous women in North America is denounced as “genocide,” but the fact that indigenous men are abused at almost the same rate is rarely mentioned. Domestic abuse is said to be soaring due to COVID-19 lockdowns even though there is evidence it is not. And why is April considered Sexual Violence Awareness Month rather than Sexual Celebration Month?

The explanation lies in two words: funding and status. Ask yourself: who gains the most, in money or power, from stoking ubiquitous fear and anger? “The sexual violence industry” benefits by garnering the support that comes from terrified, outraged, and intimidated people. This industry consists of politicians who get elected on “women’s issues,” in much the same manner as Hillary Clinton almost did. It consists of bureaucrats whose wealth and prestige rest upon a gendered narrative of women’s oppression. And their pay-offs can be huge. The FY 2020 budget request for the Office on Violence Against Women, for example, totals $492.5 million. The industry also sustains academics, researchers, and other “experts” whose careers flourish…that is, if they take the sanctioned line.

Facts do not seem to matter. If they did, then male survivors of domestic violence would be recognized; again, by most accounts, they experience domestic violence (DV) at roughly the same rate as women. Or, rather, facts do start to matter when they become obstacles that need to be overshadowed by emotions, like fear.

Fear can be incredibly valuable. Adrenaline flashes through the body and increases the chance a person will survive in the face of immediate danger, like a careening car. But a pervasive fear—one that dominates everyday life—hurts people by shutting down their ability to assess what is real and what is a reasonable response.

It also makes them easier to control. In his novel 1984, George Orwell depicted a dystopian society called Oceania in which “no emotion was pure, because everything was mixed up with fear and hatred.” The fear made people obedient to authority. The hatred bonded them into a collective that assembled for the regular Two Minutes Hate, screaming sessions that divided the world into “them” versus “us.” Combined, the two emotions killed individuality and the willingness to question. For much of the novel, for example, Oceania is at war with Eurasia and allied with Eastasia; then the alliances abruptly reverse, and the earlier truth goes unacknowledged because people are unwilling to contradict the now-official history.

Pitting the sexes against one another—“them” versus “us”—creates a sexual and social dystopia. It lumps all women into one class and denies their ability to think as individuals who can assess their own risks in life. But when a woman defines her identity as that of an automatic victim, she is far less likely to question authority and more likely to be hostile to those who do.

Women need to strenuously reject victimhood and to realize that they share a common humanity with men. Otherwise, they will be used by those who create or worsen a crisis in order to profit from it. This process will go on indefinitely because there will never be enough law or cash for the sexual violence industry to declare “mission accomplished.” Its real mission is to continue the flow of power and money.

News Roundup

News Roundup 7/3/20

US News Jeffrey Epstein associate Ghislaine Maxwell has been arrested in New Hampshire by the FBI on charges related to the dead sex trafficker. [Link] Large US corporations are threatening to pull ads from Facebook in an effort to get the social network to censor...


Cops Kill Man

Well, first they entrapped him on some drug bullshit. Then they killed him. Unarmed, no chance. Then they got away with it. https://youtu.be/CPeufhOU6OQ

Thank Goodness for Matt Taibbi

What this society needs is men and women who can write: It’s the Fourth of July, and revolution is in the air. Only in America would it look like this: an elite-sponsored Maoist revolt, couched as a Black liberation movement whose canonical texts are a corporate...

Hydroxychloroquine Works

Democrats across America will be sad to hear the news that fewer people are dying from COVID-19 due to the use of hydroxychloroquine. They will not, however, question how wrong and smug they are and have been on this topic.

Abolish the CDC, Zika Example

From the Post: Four years before the federal Centers for Disease Control and Prevention fumbled the nation’s chance to begin effective early testing for the novel coronavirus, the agency similarly mishandled its efforts to detect another dreaded pathogen. Amid a...

The Scott Horton Show

7/3/20 Tom Woods on How the Pentagon Makes Us Poorer

Scott talks to Tom Woods about his latest free ebook, The Pentagon vs. The Economy. Woods examines the many ways America's military-industrial complex distorts the economy, often applying Frédéric Bastiat's principle of the "seen vs. unseen" to demonstrate just how many...

Free Man Beyond the Wall

Foreign Policy Focus

Congress Wants Forever Wars

On FPF #513, I discuss the efforts in Congress to prevent Trump from bringing US troops home. After 19 years, there is a chance the US could end our longest war and bring troops home. However, members of Congress are working to pass provisions to the 2021 National...

Elijah McClain, Excited Delirium, and Ketamine

On FPF #512, John Dangelo returns to the show to talk about the murder of Elijah McClain. McClain was walking down the street when someone called the police because he was wearing a ski mask. A police officer confronted him and put him in a chokehold. An EMT gave his...

Trump’s Russia Policy Failed the American People

On FPF #511, Will Porter returns to the show to discuss Trump's failing Russia policy. During the campaign, Trump made it clear that he would be able to get along and make deals with Putin. However, once Trump took office, his policy towards Russia became more and more...

The Anti-War War Vet guest John Dangelo

On FPF #510, John Dangelo returns to the show to discuss being an anti-war war vet. John explains that he was a Marine reservist when he was deployed to Afghanistan. His experiences there, along with reading libertarian and antiwar material, led him to become antiwar. We talk about...

Don't Tread on Anyone

The Morality of Consent – Murray Rothbard and Lady A

https://youtu.be/-4NwihBCl6g Once concede the power of the people to consent as well as the natural law of “equal freedom from subjection,” and the logical consequence must be anarchism. Murray N. Rothbard Economic Thought Before Adam Smith, p. 279 Minds.com:...

What is Anarchy?

  ***Watch here***: https://www.minds.com/newsfeed/1121253455002247168?referrer=KeithKnightDontTreadOnAnyone     ... anarchism [is] a simple matter of libertarian logic. Murray N. Rothbard Betrayal of the American Right, p. 145   Once concede the...

Year Zero

120: The Strategy of Silence

Tommy discusses a few current events in order to show how the stories are framed to create a binary and keep citizens at each other's throats. https://traffic.libsyn.com/secure/strangerencounterspodcast/Strategy_of_Silence.mp3

119: Project Manticore w/Ryan Bunting

Ryan Bunting is a newly published author of the dystopian novel Project Manticore. Tommy invited him on to discuss the book, his writing process, and his future endeavors. The conversation progresses into policing in the US, and how the left and right are missing the...
