Does the West still exist? Most American politicians, journalists, and policy intellectuals seem to think so, or at least they pretend to. But what if, like the Baltimore Catechism and St. Joseph Missal of my boyhood, the West has surreptitiously vanished, without anyone taking much notice of its disappearance? As with the old church of incense, ritual, and mystery, we can argue about whether what has replaced it represents progress, but there’s no point in pretending that what once was still is. It’s not.
Some place names all but quiver with historical resonance: Athens, Rome, Jerusalem, and not least of all, America. For a time during the second half of the twentieth century, the West, too, merited a place on that roster.
In its heyday, the West—used more or less interchangeably with the phrase “free world”—was much more than a conglomeration of countries. The term itself conjured up a multiplicity of images: peoples sharing a devotion to freedom and democracy; nations mustering the political and cultural cohesion to stand firm in a common cause; sacrifice and steadfastness in the face of evil. The West was Rick and Ilsa, Winston and Franklin, Jack and Ron at the Berlin Wall. It was Greer Garson as Mrs. Miniver and Tom Hanks as army ranger Captain John Miller.
For several decades after 1945, the West imparted legitimacy to U.S. claims of global leadership. Nations said to make up the West endorsed, or played along with, the notion that the United States was exceptional and indispensable. Endlessly reiterated in stump speeches and newspaper editorials, this proposition came to seem self-evidently true—or at least expedient.
Today, it is neither. Seven decades after World War II and three decades after the end of the Cold War, to pretend that something called the West, taking its cues from Washington, continues to play an organizing role in international politics is to indulge in a vast self-deception.