A User's Guide to History
The end of war?
Or not? Sixth law, final part
For almost the first two centuries of American history, there was a roughly generational pattern to war. The Revolutionary War (1775-83) was followed by the War of 1812 (to 1815), which was followed by the Mexican War (1846-48), the Civil War (1861-65), the Spanish-American War (1898), World War I (1917-18), and World War II (1941-45).
The pattern cracked slightly with the Korean War, which started just five years after the end of World War II. But it was an exception that proved the rule. It was an undeclared war, lacking the political legitimacy that comes from an endorsement by the people’s representatives in Congress. And it was a distinctly limited war, fought for narrow purposes under the auspices of the United Nations.
After the Korean War, the Vietnam War got things back on the generational track. The Persian Gulf War of 1991 came roughly a generation after Vietnam. The wars in Afghanistan and Iraq starting in 2001 and 2003 came sooner than the generational model forecast, but they were small-scale conflicts and were a response to a specific event, the terrorist attacks of September 11, 2001.
The generational aspect of America’s wars seems to reflect the fact that generations of young men have often felt a need to prove themselves, but once having done so, they don’t need to do it again. This point became clear in the arguments between Theodore Roosevelt, a child at the time of the Civil War, and William McKinley, a veteran of that war and a participant in its bloodiest single day, the battle of Antietam. “I’ve been through one war,” McKinley said. “I’ve seen the dead piled up, and I don’t want to see another.”
Roosevelt, after passing his test at San Juan Hill, became more like McKinley. The year before the war, Roosevelt had given a speech at the Naval War College in which he lauded the martial virtues. “All the great masterful races have been fighting races,” he said. “No triumph of peace is quite so great as the supreme triumphs of war.” But after his day of glory, which provided the springboard that landed him in the presidency, he became the opposite of a warmonger. He conspicuously avoided sending American troops into battle, elevating diplomacy above war—so successfully that he won the Nobel Peace Prize in recognition of his role in mediating an end to the Russo-Japanese War.
For any student of world affairs during the last several decades, one of the most important questions that can be asked is why there has been no World War III. Plausible answers include the existence of nuclear weapons, which have made the cost of general war prohibitive; the deterrent effect of alliances like NATO, which promise swift punishment of breakers of the peace; and economic globalization, which has dramatically reduced world poverty and made partners of erstwhile enemies.
Another possibility is that World War III was simply never in the cards. Maybe humans really do learn from their mistakes. Maybe the species is becoming more peaceful. Maybe the old rallying cries have lost their purchase on the imagination. Even if the default setting of human nature didn’t use to be peace, maybe it is now.
Will we ever know? How will we know?
The empirical evidence is mounting that something has changed. Not since World War II have great powers gone to war against each other. (At the time of the Korean War, China was still far from a great power.) A standard measure of war-induced deaths—battle deaths per 100,000 of world population—has declined from 300 per year in World War II to less than 1 in the twenty-first century.
There is no way of telling if the change is permanent. As noted above, at the beginning of 1914 a persuasive case could be made that war had become anachronistic. The next three decades blew that argument to smithereens. Yet alcoholics sometimes do stay on the wagon, even if they have to treat each day as presenting another risk of falling off.
But other human habits have changed, and no one worries much about their return. Human sacrifice had a long history in various cultures before disappearing, and there seems little chance of its return. Chattel slavery—the open, legal buying and selling of humans for labor—vanished more recently, and though its black-market scion, human trafficking, remains a scourge, the consensus of disapproval of slavery appears certain to keep trafficking in the shadows.
Assuming war is on the wane, one has to wonder what will take its place. Another way of asking the question is to consider what substitutes for war will keep armed conflict at bay. The American psychologist William James considered the matter a century ago in an essay titled “The Moral Equivalent of War.” James wrote against the current of militarism that had produced America’s overseas empire after the Spanish-American War. He opposed war and imperialism, but he knew he and those who thought like him had their work cut out for them. “The war against war is going to be no holiday excursion or camping party,” he said. “The military feelings are too deeply grounded to abdicate their place among our ideals until better substitutes are offered.” James had lived through the Civil War; two of his brothers had fought, while he, like his third brother, the novelist Henry James, had claimed physical disability. William James had observed the evolution of American attitudes toward the Civil War, and drawn conclusions. “There is something highly paradoxical in the modern man’s relation to war,” he said. “Ask all our millions, North and South, whether they would vote now (were such a thing possible) to have our War for the Union expunged from history, and the record of a peaceful transition to the present time substituted for that of its marches and battles, and probably hardly a handful of eccentrics would say yes. Those ancestors, those efforts, those memories and legends, are the most ideal part of what we now own together, a sacred spiritual possession worth more than all the blood poured out. Yet, ask those same people whether they would be willing, in cold blood, to start another civil war now, to gain a similar possession, and not one man or woman would vote for the proposition. In modern eyes, precious though wars may be, they must not be waged solely for the sake of the ideal harvest. Only when forced upon one, only when an enemy’s injustice leaves us no alternative, is a war now thought permissible.”
James admired the martial virtues of honor, efficiency and service. The question for the modern age was how to separate them from the bloodshed. He thought this would be difficult but possible. “The martial type of character can be bred without war,” he asserted. “Strenuous honor and disinterestedness abound elsewhere. Priests and medical men are in a fashion educated to it, and we should all feel some degree of it imperative if we were conscious of our work as an obligatory service to the state.” James rejected common arguments against the transition. “It would be simply preposterous if the only force that could work ideals of honor and standards of efficiency into English or American natures should be the fear of being killed by the Germans or the Japanese. Great indeed is fear; but it is not, as our military enthusiasts believe, and try to make us believe, the only stimulus known for awakening the higher ranges of men's spiritual energy.” James’s formula was simple, if not necessarily easy to follow: “The only thing needed henceforward is to inflame the civic temper as past history has inflamed the military temper.”
Jimmy Carter picked up on James’s idea after four more American wars. Carter became president the year after the Vietnam War finally ended in a defeat that took the shine off the appeal of actual war. Meanwhile the oil shocks of the 1970s wreaked havoc on the American economy. Carter specified his version of a moral equivalent of war as a campaign to end America’s addiction to imported petroleum. “Many of these proposals will be unpopular,” he said in a speech from the Oval Office. “Some will cause you to put up with inconveniences and to make sacrifices.” But they were necessary. “The alternative may be a national catastrophe.” Carter called on Americans to gird up. “This difficult effort will be the ‘moral equivalent of war,’ except that we will be uniting our efforts to build and not to destroy.”
Carter’s program got no traction against the defenders of America’s energy status quo, who dismissed it by the acronym MEOW. They added insult to injury when they dubbed a subsequent Carter appeal for sacrifice in the national interest the “malaise speech,” though he never uttered that word.
Which left the question of what would replace war in the national psyche, if anything at all. Sports seemed a possibility. During the same period when war was offering fewer and fewer opportunities for emotional solidarity in pursuit of a common goal, spectator sports became enormously popular. In football stadiums across America and on soccer pitches around the world, people in groups of a hundred thousand or more cheered their champions on. In smaller arenas for other sports they did the same thing. Annually as many as a billion people watched the American Super Bowl football championship. A comparable number hung on the outcome of soccer’s World Cup. For the World Cup, the Olympic Games and assorted other world championships, the athletes and teams were explicitly identified with their countries, bringing patriotism into play.
Some of this growth, at a time of decline in the practice of war, might have been merely coincidental—an artifact of the post-World War II popularization of television. But the spread of television might itself have contributed to the decline of war. Television brought the violence and destructiveness of war into homes across America and much of the rich world by the 1960s, and the visibility of war only increased with the emergence of cable television and the internet. War is easier to honor and indulge in when the victims are unseen.
Ironically—or maybe not, given the paradoxes of human nature—the bear market for war itself was accompanied in America by a bull market in the reputation of the country’s warriors. When the GIs came home at the end of World War II, they were respected but in a businesslike way: their country had called, they had answered, and now it was time to get back to regular life. Veterans of the Korean War—especially the repeaters from World War II—likewise thought of themselves as nothing special, and for the most part were treated that way. Veterans of the Vietnam War sometimes sought to hide their service, so unpopular had that conflict become by its end.
But as America’s wars got fewer and smaller, the treatment of American soldiers and veterans became more conspicuously admiring. Simple numbers had something to do with it. The 16 million veterans of World War II were so many that one met them everywhere in the decades after the war; familiarity bred, if not contempt, then matter-of-factness. In the early twenty-first century the number of Americans under arms was less than a tenth as many, in a nation of twice as many people. America’s warriors were a more special group by the mere fact of being far fewer. (Strikingly, the World War II generation was dubbed the “greatest generation” only by their descendants, as the generation itself began to die out.)
America’s twenty-first-century warriors were also a more select group—self-selected, in fact. The end of the draft in the 1970s meant that thereafter every man and woman in the American military had volunteered to be there. And volunteering to put one’s life on the line for one’s country usually inspires greater respect than doing so to avoid prison for draft-dodging.
There was also a bit of conscience-assuaging in the treatment of every soldier as a hero. Some liberals felt bad, or politically vulnerable, for having blamed the American soldiers in Vietnam for the war policies the liberals disliked. And some conservatives were sensitive to the fact that as much as they supported the wars in Afghanistan and Iraq, their own kids weren’t fighting there; it was the kids of other people.
If sports offered a substitute for war, so did politics, which increasingly resembled a blood sport. Again, the timing might have been coincidental. The modern polarization in American politics followed the embrace of civil rights reform in the 1960s by the Democratic party. Suddenly white Southern Democrats, since the Civil War the strongest single bloc within the party and the most devoted to segregation, found themselves without a home. Older Southern Democrats started voting Republican, and many formally changed parties; younger white Southerners joined the Republicans when they first registered to vote. The transition took a generation, but it was largely complete by the 1990s.
The Republican party made room for the newcomers, who were more conservative socially than most of the traditional party of Lincoln. Liberal Republicans of the Northeast, upper Midwest and West Coast found themselves as adrift as the Southern Democrats had been. Many of them gravitated to the Democratic party.
The result was a sifting that left each party more ideologically coherent than it had ever been. Nearly all the liberals were Democrats, and nearly all the conservatives Republicans. The bipartisan coalitions that had made possible the reforms of the Progressive era of the early twentieth century, the New Deal programs of the 1930s, the civil rights revolution and Great Society of the 1960s, and the Reagan Revolution of the 1980s were a thing of the past. Taking their place was the scorched-earth partisanship of Newt Gingrich and his counterparts and successors in both parties.
Facilitating the polarization was the rise of the new media: first, cable television news networks that let viewers choose their channels to suit their tastes, and then social media that made the cocoon still more comforting.
Cementing the new system in place was the increasing precision of partisan gerrymandering, which allowed state legislatures, with the imprimatur of the Supreme Court, to design congressional districts that were safe for one party or the other. The result was the elevation of primary elections to the status of main event in congressional races. Republican incumbents worried about challenges only from their right, Democrats only from their left. For incumbents of either party, a vote for a measure sponsored by the other party could be called a betrayal and often was, leading to the near-disappearance of any such aisle-crossing compromises.
The new dispensation triggered a flipping of the Clausewitzian principle that war was politics by other means. Now politics was war by other means. Political foes were often treated as the enemy; insufficient enthusiasm for the party cause was branded as treason.
The bloody-mindedness of American politics had chiefly domestic roots. But there was a foreign-policy element essential to its growth: the absence of a major war. Woodrow Wilson was said to have commented, in his days as president of Princeton University, that academic politics were so bitter because the stakes were so small. To some extent that described America as the twentieth century turned into the twenty-first. The issues over which the parties fought were hardly immaterial—tax rates, immigration, health care—but they lacked the mortal immediacy of a world war, and so produced no comparable unifying effect.
When Americans at the end of the century looked back on World War II as the “good war,” they were thinking not of the fifty million men, women and children who lost their lives in that horrendous conflict, but of the unifying effect the war had on the American nation. After a decade of the Great Depression, which produced a politics as bitter in its own way as that of the end of the century, Pearl Harbor snapped Americans into line. Republicans didn’t become Democrats, nor Democrats Republicans, but both understood that there were matters more important than which side won the next election.
The attacks of 9/11 briefly had a similar effect. The approval ratings of George W. Bush shot up; legislation designed to secure the country against terrorism sailed through Congress. Yet the effect didn’t last, mostly because al-Qaeda wasn’t Nazi Germany or imperial Japan, but rather a small gang of criminals. Iraq was a regular country with a genuine army, but Saddam Hussein was toppled within weeks. There and in Afghanistan the Bush administration took pains to keep the wars from becoming centerpieces of American attention. Americans remained edgy about terrorism, but their lives went on much as before.
Was there anything that could restore the lost unity of the World War II years? Another world war, presumably. Or even a serious tussle with China or Russia, say, that didn’t go nuclear. It wasn’t the sort of thing any responsible leader would risk deliberately, but the temptation to turn foreign affairs to domestic ends is at least as old as Shakespeare, who had his Henry IV explain that it had been his purpose “to lead out many to the Holy Land, lest rest and lying still might make them look too near unto my state.” The dying king goes on to urge his son “to busy giddy minds with foreign quarrels.”
Short of actual war, American leaders frequently employed war metaphorically, declaring wars on poverty, cancer and drugs. The war on terror after 9/11 was closer to the real deal, but it lacked a coherent enemy. This proved a serious problem strategically—the terrorists were hard to locate and pin down—but also psychologically. Few things are more unifying among humans than a common enemy. The enemy can be a person—the German Kaiser for Americans in World War I, Hitler in World War II. It can be a group of people unlike one’s own. Here the list is endless, in that humans have long determined who they are by declaring who they are not. Race, religion, ethnicity and political ideology are but a few of the markers by which people distinguish their own group from others, and no distinction has been so fine that it didn’t drive some to murderous effort to maintain it.
Different groups can be driven together by a shared enemy. The United States allied with the Soviet Union during World War II against Nazi Germany. If somehow all the peoples on earth, or at least most of them, could be allied against a common enemy, war might fall more fully out of favor. Science-fictionists imagine an extraterrestrial invasion; science-realists—and many others—point to global problems like climate change, which threatens if not all of humanity, at least very large portions of it. Such a common threat might finally summon the moral equivalent of war.
It hadn’t done so as of the early 2020s. Neither had the covid pandemic, which produced finger-pointing between nations and lethally bizarre partisanship within them.
If climate change and covid couldn’t elicit the moral equivalent of war, one had to wonder whether even war itself would have the same effect it once had. If America suffered a modern Pearl Harbor—say, a cyberattack from China—would the country rally together? Or would the parties simply toss blame-bombs at each other? Would one or the other deny the attack had taken place?
This was strange new territory. Yet though the ground might have changed, human nature probably hadn’t. At least not enough to render people immune to the appeal of heroes. If anything, the tawdriness of politics might have made heroes—not the manufactured kind from sports or other entertainment, but the real thing—more necessary than ever. War might yet make a comeback.