
The Game of Everything, Part 9: Civilization and Economics

If the tailor goes to war against the baker, he must henceforth bake his own bread.

— Ludwig von Mises

There’s always the danger that an analysis of a game spills into over-analysis. Some aspects of Civilization reflect conscious attempts by its designers to model the processes of history, while some reflect unconscious assumptions about history; some aspects represent concessions to the fact that it first and foremost needs to work as a playable and fun strategy game, while some represent sheer random accidents. It’s important to be able to pull these things apart, lest the would-be analyzer wander into untenable terrain.

Any time I’m tempted to dismiss that prospect, I need only turn to Johnny L. Wilson and Alan Emrich’s ostensible “strategy guide” Civilization: or Rome on 640K a Day, which is actually far more interesting as the sort of distant forefather of this series of articles — as the very first attempt ever to explore the positions and assumptions embedded in the game. Especially given that it is such an early attempt — the book was published just a few months after the game, being largely based on beta versions of same that MicroProse had shared with the authors — Wilson and Emrich do a very credible job overall. Yet they do sometimes fall into the trap of seeing what their political beliefs make them wish to see, rather than what actually existed in the minds of the designers. The book doesn’t explicitly credit which of the authors wrote what, but one quickly learns to distinguish their points of view. And it turns out that Emrich, whose arch-conservative worldview is on the whole more at odds with that of the game than Wilson’s liberal-progressive view, is particularly prone to projection. Among the most egregious and amusing examples of him using the game as a Rorschach test is his assertion that the economy-management layer of Civilization models a rather dubious collection of ideas that have more to do with the American political scene in 1991 than they do with any proven theories of economics.

We know we’re in trouble as soon as the buzzword “supply-side economics” turns up prominently in Emrich’s writing. It burst onto the stage in a big way in the United States in 1980 with the election of Ronald Reagan as president, and has remained to this day one of his Republican party’s main talking points on the subject of economics in general. Its central, counter-intuitive claim is that tax revenues can often be increased by cutting rather than raising tax rates. Lower taxes, goes the logic, provide such a stimulus to the economy as a whole that people wind up making a lot more money. And this in turn means that the government, even though it brings in less taxes per dollar, ends up bringing in more taxes in the aggregate.

In seeing what he wanted to see in Civilization, Alan Emrich decided that it hewed to contemporary Republican orthodoxy not only on supply-side economics but also on another subject that was constantly in the news during the 1980s and early 1990s: the national debt. The Republican position at the time was that government deficits were always bad; government should be run like a business in all circumstances, went their argument, with an orderly bottom line.

But in the real world, supply-side economics and a zero-tolerance policy on deficits tend to be, shall we say, incompatible with one another. Since the era of Ronald Reagan, Republicans have balanced these oil-and-water positions against one another by prioritizing tax cuts when in power and wringing their hands over the deficit — lamenting the other party’s supposedly out-of-control spending on priorities other than their own — when out of power. Emrich, however, sees in Civilization‘s model of an economy the grand unifying theory of his dreams.

Let’s quickly review the game’s extremely simplistic handling of the economic aspects of civilization-building before we turn to his specific arguments, such as they are. The overall economic potential of your cities is expressed as a quantity of “trade arrows.” As leader, you can devote the percentage of trade arrows you choose to taxes, which add money to your treasury for spending on things like the maintenance costs of your buildings and military units and tributes to other civilizations; research, which lets you acquire new advances; and, usually later in the game, luxuries, which help to keep your citizens content. There’s no concept of deficit spending in the game; if ever you don’t have enough money in the treasury to maintain all of your buildings and units at the end of a turn, some get automatically destroyed. This, then, leads Emrich to conclude that the game supports his philosophy on the subject of deficits in general.
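
Abstract as this model is, its bookkeeping is easy to spell out. Here, as a minimal sketch in Python, is my own paraphrase of the rules just described; the function names and the numbers in the example are illustrative assumptions, not anything taken from the game’s actual code.

    # A paraphrase of the economic bookkeeping described above. Everything here
    # is illustrative; it is not the game's actual code or its actual numbers.

    def split_trade(trade_arrows, tax_pct, luxury_pct):
        """Divide a city's trade arrows among taxes, luxuries, and research."""
        taxes = trade_arrows * tax_pct // 100
        luxuries = trade_arrows * luxury_pct // 100
        science = trade_arrows - taxes - luxuries   # whatever remains funds research
        return taxes, luxuries, science

    def end_of_turn(treasury, tax_income, upkeep_items):
        """Collect taxes, then pay maintenance; with no deficit spending
        allowed, anything that can't be paid for is simply lost."""
        treasury += tax_income
        kept = []
        for name, upkeep in upkeep_items:
            if treasury >= upkeep:
                treasury -= upkeep
                kept.append(name)
            # else: the building or unit is destroyed for lack of funds
        return treasury, kept

    # Twenty trade arrows at the default 50% tax rate, with nothing on luxuries:
    print(split_trade(20, 50, 0))                    # (10, 0, 10)

    # A nearly empty treasury can't cover three items of upkeep; one is lost:
    print(end_of_turn(0, 2, [("Temple", 1), ("Phalanx", 1), ("Barracks", 1)]))
    # (0, ['Temple', 'Phalanx'])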

But the more entertaining of Emrich’s arguments are the ones he deploys to justify supply-side economics. At the beginning of a game of Civilization, you have no infrastructure to support, and thus you have no maintenance costs at all — and, depending on which difficulty level you’ve chosen to play at, you may even start with a little bit of money already in the treasury. Thus it’s become standard practice among players to reduce taxes sharply from their default starting rate of 50 percent, devoting the bulk of their civilization’s economy early on to research on basic but vital advances like Pottery, Bronze Working, and The Wheel. With that in mind, let’s try to follow Emrich’s thought process:

To maximize a civilization’s potential for scientific and technological advancement, the authors recommend the following exercise in supply-side economics. Immediately after founding a civilization’s initial city, pull down the Game Menu and select “Tax Rate.” Reduce the tax rate from its default 50% to 10% (90% Science). This reduced rate will allow the civilization to continue to maintain its current rate of expenditure while increasing the rate at which scientific advancements occur. These advancements, in turn, will accelerate the wealth and well-being of the civilization as a whole.

In this way, the game mechanics mirror life. The theory behind tax reduction as a spur to economic growth is built on two principles: the multiplier and the accelerator. The multiplier effect is abstracted out of Sid Meier’s Civilization because it is a function of consumer spending.

The multiplier effect says that each tax dollar cut from a consumer’s tax burden and actually spent on consumer goods will net an additional 50 cents at a second stage of consumer spending, an additional 25 cents at a third stage, an additional 12.5 cents at a fourth stage, etc. Hence, economists claim that the full progression nets a total of two dollars for each extra consumer dollar spent as a result of a tax cut.

The multiplier effect cannot be observed in the game because it is only presented indirectly. Additional consumer spending causes a flash point where additional investment takes place to increase, streamline, and advance production capacity and inventory to meet the demands of the increased consumption. Production increases and advances, in turn, have an additional multiplier effect beyond the initial consumer spending. When the scientific advancements occur more rapidly in Sid Meier’s Civilization, they reflect that flash point of additional investment and allow civilizations to prosper at an ever accelerating rate.

Wow. As tends to happen a lot after I’ve just quoted Mr. Emrich, I’m not quite sure where to start. But let’s begin with his third paragraph, in particular with a phrase which is all too easy to overlook: that for this to work, the dollar cut must “actually be spent on consumer goods.” When tax rates for the wealthy are cut, the lucky beneficiaries don’t tend to go right out and spend their extra money on consumer goods. The most direct way to spur the economy through tax cuts thus isn’t to slash the top tax bracket, as Republicans have tended to do; it’s to cut the middle and lower tax brackets, which puts more money in the pockets of those who don’t already have all of the luxuries they could desire, and thus will be more inclined to go right out and spend their windfall.
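
Incidentally, the “full progression” Emrich invokes is nothing more exotic than a geometric series. If consumers spend a fraction c of every extra dollar they receive (his example assumes c = 0.5), then the total new spending generated per dollar of tax cut is

    1 + c + c^{2} + c^{3} + \cdots \;=\; \frac{1}{1-c}, \qquad \text{so for } c = 0.5:\quad 1 + 0.5 + 0.25 + 0.125 + \cdots \;=\; \frac{1}{1-0.5} \;=\; 2.

That’s where the figure of “two dollars” comes from. The entire argument thus turns on how large c actually is for the people receiving the cut, and, as just noted, it tends to be largest for those who aren’t already wealthy.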

But, to give credit where it’s due, Emrich does at least include that little phrase about the importance of spending on consumer goods, even if he does rather bury the lede. His last paragraph is far less defensible. To appreciate its absurdity, we first have to remember that he’s talking about “consumer spending” in a Stone Age economy of 4000 BC. What are these consumers spending on? Particularly shiny pieces of quartz? And for that matter, what are they spending, considering that your civilization hasn’t yet developed currency? And how on earth can any of this be said to justify supply-side economics over the long term? You can’t possibly maintain your tax rate of 10 percent forever; as you build up your cities and military strength, your maintenance costs steadily increase, forcing you back toward that starting default rate of 50 percent. To the extent that Civilization can be said to send any message at all on taxes, said message must be that a maturing civilization will need to steadily increase its tax rate as it advances toward modernity. And indeed, as we learned in an earlier article in this series, this is exactly what has happened over the long arc of real human history. Your economic situation at the beginning of a game of Civilization isn’t some elaborate testimony to supply-side economics; it just reflects the fact that one of the happier results of a lack of civilization is the lack of a need to tax anyone to maintain it.

In reality, then, the taxation model in the game is a fine example of something implemented without much regard for real-world economics, simply because it works in the context of a strategy game like this one. Even the idea of such a centralized system of rigid taxation for a civilization as a whole is a deeply anachronistic one in the context of most societies prior to the Enlightenment, for whose people local government was far more important than some far-off despot or monarch. Taxes, especially at the national level, tended to come and go prior to AD 1700, depending on the immediate needs of the government, and lands and goods were more commonly taxed than income, which in the era before professionalized accounting was hard for the taxpayer to calculate and even harder for the tax collector to verify. In fact, a fixed national income tax of the sort on which the game’s concept of a “tax rate” seems to be vaguely modeled didn’t come to the United States until 1913. Many ancient societies — including ones as advanced as Egypt during its Old Kingdom and Middle Kingdom epochs — never even developed currency at all. Even in the game, Currency is an advance which you need to research; the cognitive dissonance inherent in earning coins for your treasury when your civilization lacks the concept of money is best just not thought about.

Let’s take a moment now to see if we can make a more worthwhile connection between real economic history and luxuries, that third category toward which you can devote your civilization’s economic resources. You’ll likely have to begin doing so only if and when your cities start to grow to truly enormous sizes, something that’s likely to happen only under the supercharged economy of a democracy. When all of the usual bread and circuses fail, putting resources into luxuries can maintain the delicate morale of your civilization, keeping your cities from lapsing into revolt. There’s an historical correspondence that actually does seem perceptive here; the economies of modern Western democracies, by far the most potent the world has ever known, are indeed driven almost entirely by a robust consumer market in houses and cars, computers and clothing. Yet it’s hard to know where to really go with Civilization‘s approach to luxuries beyond that abstract statement. At most, you might put 20 or 30 percent of your resources into them, leaving the rest to taxes and research, whereas in a modern developed democracy like the United States those proportions tend to be reversed.

Ironically, the real-world economic system to which Civilization‘s overall model hews closest is actually a centrally-planned communist economy, where all of a society’s resources are the property of the state — i.e., you — which decides how much to allocate to what. But Sid Meier and Bruce Shelley would presumably have run screaming from any such association — not to mention our friend Mr. Emrich, who would probably have had a conniption. It seems safe to say, then, that what we can learn from the Civilization economic model is indeed sharply limited, that most of it is there simply as a way of making a playable game.

Still, we might usefully ask whether there’s anything in the game that does seem like a clear-cut result of its designers’ attitudes toward real-world economics. We actually have seen some examples of that already in the economic effects that various systems of government have on your civilization, from the terrible performance of despotism to the supercharging effect of democracy. And there is one other area where Civilization stakes out some clear philosophical territory: in its attitude toward trade between civilizations, a subject that’s been much in the news in recent years in the West.

In the game, your civilization can reap tangible benefits from its contact with other civilizations in two ways. For one, you can use special units called caravans, which become available after you’ve researched the advance of Trade, to set up “trade routes” between your cities and those of other civilizations. Both then receive a direct boost to their economies, the magnitude of which depends on their distance from one another — farther is better — and their respective sizes. A single city can set up such mutually beneficial arrangements with up to five other cities, and see them continue as long as the cities in question remain in existence.
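
The manual doesn’t spell out the exact formula, so the little Python sketch below is purely hypothetical (invented numbers, invented scaling), but it captures the shape of the incentive just described: the payoff from a trade route grows with the distance between the two cities and with their combined size.

    # A hypothetical illustration of the trade-route incentive, NOT the game's
    # real formula: more distant and larger partners yield a bigger recurring bonus.

    def trade_route_bonus(distance, size_a, size_b):
        return (distance + size_a + size_b) // 4

    # Two small neighboring cities versus two large cities an ocean apart:
    print(trade_route_bonus(distance=5, size_a=3, size_b=4))     # 3
    print(trade_route_bonus(distance=32, size_a=12, size_b=10))  # 13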

In addition to these arrangements, you can horse-trade advances directly with the leaders of other civilizations, giving your counterpart one of your advances in exchange for one you haven’t yet acquired. It’s also possible to take advances from other civilizations by conquering their cities or demanding tribute, but such hostile approaches have obvious limits to which a symbiotic trading relationship isn’t subject; fighting wars is expensive in terms of blood and treasure alike, and you’ll eventually run out of enemy cities to conquer. If, on the other hand, you can set up warm relationships with four or five other civilizations, you can positively rocket up the Advances Chart.

The game’s answer to the longstanding debate between free trade and protectionism — between, to put a broader framing on it, a welcoming versus an isolationist attitude toward the outside world — is thus clear: those civilizations which engage economically with the world around them benefit enormously and get to Alpha Centauri much faster. Such a position is very much in line with the liberal-democratic theories of history that were being espoused by thinkers like Francis Fukuyama at the time Meier and Shelley were making the game — thinkers whose point of view Civilization knowingly or unknowingly adopts.

As has become par for the course by now, I believe that the position Civilization and Fukuyama alike take on this issue is quite well-supported by the evidence of history. To see proof, one doesn’t have to do much more than look at where the most fruitful early civilizations in history were born: near oceans, seas, and rivers. Egypt was, as the ancient historian Herodotus so famously put it, “the gift of the Nile”; Athens was born on the shores of the Mediterranean; Rome on the east bank of the wide and deep Tiber river. In ancient times, when overland travel was slow and difficult, waterways were the superhighways of their era, facilitating the exchange of goods, services, and — just as importantly — ideas over long distances. It’s thus impossible to imagine these ancient civilizations reaching the heights they did without this access to the outside world. Even today port cities are often microcosms of the sort of dynamic cultural churn that spurs civilizations to new heights. Not for nothing does every player of the game of Civilization want to found her first city next to the ocean or a river — or, if possible, next to both.

To better understand how these things work in practice, let’s return one final time to the dawn of history for a narrative of progress involving one of the greatest of all civilizations in terms of sheer longevity.

Egypt was far from the first civilization to spring up in the Fertile Crescent, that so-called “cradle of civilization.” The changing climate that forced the hunter-gatherers of the Tigris and Euphrates river valleys to begin to settle down and farm as early as 10,000 BC may not have forced the peoples roaming the lands near the Nile to do the same until as late as 4000 BC. Yet Egyptian civilization, once it took root, grew at a crazy pace, going from primitive hunter-gatherers to a culture that eclipsed all of its rivals in grandeur and sophistication in less than 1500 years. How did Egypt manage to advance so quickly? Well, there’s strong evidence that it did so largely by borrowing from the older, initially wiser civilizations to its east.

Writing is among the most pivotal advances for any young civilization; it allows the tallying of taxes and levies, the inventorying of goods, the efficient dissemination of decrees, the beginning of contracts and laws and census-taking. It was if anything even more important in Egypt than in other places, for it facilitated a system of strong central government that was extremely unusual in the world prior to the Enlightenment of many millennia later. (Ancient Egypt at its height was, in other words, a marked exception to the rule about local government being more important than national prior to the modern age.) Yet there’s a funny thing about Egypt’s famous system of hieroglyphs.

In nearby Sumer, almost certainly the very first civilization to develop writing, archaeologists have traced the gradual evolution of cuneiform writing by fits and starts over a period of many centuries. But in Egypt, by contrast, writing just kind of appears in the archaeological record, fully-formed and out of the blue, around 3000 BC. Now, it’s true that Egypt didn’t simply take the Sumerian writing system; the two use completely different sets of symbols. Yet many archaeologists believe that Egypt did take the idea of writing from Sumer, with whom they were actively trading by 3000 BC. With the example of a fully-formed vocabulary and grammar, all translated into a set of symbols, the actual implementation of the idea in the context of the Egyptian language was, one might say, just details.

How long might it have taken Egypt to make the conceptual leap that led to writing without the Sumerian example? Too long, one suspects, for it to have built the Pyramids of Giza by 2500 BC. Further, we see other diverse systems of writing spring up all over the Mediterranean and Middle East at roughly the same time. Writing was an idea whose time had come, thanks to trading contacts. Trade meant that every new civilization wasn’t forced to reinvent every wheel for itself. It’s since become an axiom of history that an outward-facing civilization is synonymous with youth and innovation and vigorous growth, an inward-turning civilization synonymous with age and decadence and decrepit decline. It happened in Egypt; it happened in Greece; it happened in Rome.

But, you might say, the world has changed a lot since the heyday of Rome. Can this reality that ancient civilizations benefited from contact and trade with one another really be applied to something like the modern debate over free trade and globalization? It’s a fair point. To address it, let’s look at the progress of global free trade in times closer to our own.

In the game of Civilization, you won’t be able to set up a truly long-distance, globalized trading network with other continents until you’ve acquired the advance of Navigation, which brings with it the first ships that are capable of transporting your caravan units across large tracts of ocean. In real history, the first civilizations to acquire such things were those of Europe, in the late fifteenth century AD. Economists have come to call this period “The First Globalization.”

And, tellingly, they also call this period “The Great Divergence.” Prior to the arrival of ships capable of spanning the Atlantic and Pacific Oceans, several regions of the world had been on a rough par with Europe in terms of wealth and economic development. In fact, at least one great non-European civilization — that of China — was actually ahead; roughly one-third of the entire world’s economic output came from China alone, outdistancing Europe by a considerable margin. But, once an outward-oriented Europe began to establish itself in the many less-developed regions of the world, all of that changed, as Europe surged forward to the leading role it would enjoy for the next several centuries.

How did the First Globalization lead to the Great Divergence? Consider: when the Portuguese explorer Vasco da Gama reached India in 1498, he found he could buy pepper there, where it was commonplace, for a song. He could then sell it back in Europe, where it was still something of a delicacy, for roughly 25 times what he had paid for it, all while still managing to undercut the domestic competition. Over the course of thousands of similar trading arrangements, much of the rest of the world came to supply Europe with the cheap raw materials which were eventually used to fuel the Industrial Revolution and to kick the narrative of progress into overdrive, making even tiny European nations like Portugal into deliriously rich and powerful entities on the world stage.

And what of the great competing civilization of China? As it happens, it might easily have been China instead of Europe that touched off the First Globalization and thereby separated itself from the pack of competing civilizations. By the early 1400s, Chinese shipbuilding had advanced enough that its ships were regularly crisscrossing the Indian Ocean between established trading outposts on the east coast of Africa. If the arts of Chinese shipbuilding and navigation had continued to advance apace, it couldn’t have been much longer until its ships crossed the Pacific to discover the Americas. How much different would world history have been if they had? Unfortunately for China, the empire’s leaders, wary of supposedly corrupting outside influences, made a decision around 1450 to adopt an isolationist posture. Existing trans-oceanic trade routes were abandoned, and China retreated behind its Great Wall, leaving Europe to reap the benefits of global trade. By 1913, China’s share of the world’s economy had dropped to 4 percent. The most populous country in the world had become a stagnant backwater in economic terms. So, we can say that Europe’s adoption of an outward-facing posture at this critical juncture, just as China did the opposite, became one of the great difference-makers in world history.

We can already see in the events of the late fifteenth century the seeds of the great debate over globalization that rages as hotly as ever today. While it’s clear that the developed countries of Europe got a lot out of their trading relationships, it’s far less clear that the less-developed regions of the world benefited to anything like the same extent — or, for that matter, that they benefited at all.

This first era of globalization was the era of colonialism, when developed Europe freely exploited the non-developed world by toppling or co-opting whatever forms of government already existed among its new trading “partners.” The period brought a resurgence of the unholy practice of slavery, along with forced religious conversions, massacres, and the theft of entire continents’ worth of territory. Much later, over the course of the twentieth century, Europe gradually gave up most of its colonies, allowing the peoples of its former overseas possessions their ostensible freedom to build their own nations. Yet the fundamental power imbalances that characterized the colonial period have never gone away. Today the developing world of poor nations trades with the developed world of rich nations under the guise of being equal sovereign entities, but the former still feeds raw materials to the industrial economies of the latter — or, increasingly, developing industrial economies feed finished goods to the post-industrial knowledge economies of the ultra-developed West. Proponents of economic globalization argue that all of this is good for everyone concerned, that it lets each country do what it does best, and that the resulting rising economic tide lifts all their boats. And they argue persuasively that the economic interconnections globalization has brought to the world have been a major contributing factor to the unprecedented so-called “Long Peace” of the last three quarters of a century, in which wars between developed nations have not occurred at all and war in general has become much less frequent.

But skeptics of economic globalism have considerable data of their own to point to. In 1820, the richest country in the world on a per-capita basis was the Netherlands, with an inflation-adjusted average yearly income of $1838, while the poorest region of the world was Africa, with an average income of $415. In 2017, the Netherlands had an average income of $53,582, while the poorest country in the world for which data exists was in, you guessed it, Africa: it was the Central African Republic, with an average income of $681. The richest countries, in other words, have seen exponential economic growth over the last two centuries, while some of the poorest have barely moved at all. This pattern is by no means entirely consistent; some countries of Asia in particular, such as Taiwan, South Korea, Singapore, and Japan, have done well enough for themselves to join the upper echelon of highly-developed post-industrial economies. Yet it does seem clear that the club of rich nations has grown to depend on at least a certain quantity of nations remaining poor in order to keep down the prices of the raw materials and manufactured goods they buy from them. If the rising tide lifted these nations’ boats to equality with those of the rich, the asymmetries on which the whole world economic order runs today wouldn’t exist anymore. The very stated benefits of globalization carry within them the logic for keeping the poor nations’ boats from rising too high: if everyone has a rich, post-industrial economy, who’s going to do the world’s grunt work? This debate first really came to the fore in the 1990s, slightly after the game of Civilization, as anti-globalization became a rallying cry of much of the political left in the developed world, who pointed out the seemingly inherent contradictions in the idea of economic globalization as a universal force for good.
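
To put rough numbers on “exponential growth” versus “barely moved”: compounding those income figures over the 197 years from 1820 to 2017 (a back-of-the-envelope calculation that ignores every difficulty of comparing income estimates across two centuries) implies average annual growth rates of roughly

    \left(\frac{53{,}582}{1838}\right)^{1/197} \approx 1.017 \quad \text{(about 1.7 percent per year for the Netherlands)}

    \left(\frac{681}{415}\right)^{1/197} \approx 1.0025 \quad \text{(about 0.25 percent per year for the Central African Republic)}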

Do note that I referred to “economic globalization” there. We should do what we can to separate it from the related concepts of political globalization and cultural globalization, even as the trio can often seem hopelessly entangled in the real world. Still, political globalization, in the form of international bodies like the United Nations and the International Court of Justice, is usually if not always supported by leftist critics of economic globalization.

But cultural globalization is decried to almost an equal degree, being sometimes described as the “McDonaldization” of the world. Once-vibrant local cultures all over the world, goes the claim, are being buried under the weight of an homogenized global culture of consumption being driven largely from the United States. Kids in Africa who have never seen a baseball game rush out to buy the Yankees caps worn by the American rap stars they worship, while gangsters kill one another over Nike sneakers in the streets of Chinese cities. Developing countries, the anti-globalists say, first get exploited to produce all this crap, then get the privilege of having it sold back to them in ways that further eviscerate their cultural pride.

And yet, as always with globalization, there’s also a flip side. A counter-argument might point out that at the end of the day people have a right to like what they like (personally, I have no idea why anyone would eat a McDonald’s hamburger, but tastes evidently vary), and that cultures have blended with and assimilated one another from the days when ancient Egypt traded with ancient Sumer. Young people in particular in the world of today have become crazily adept at juggling multiple cultures: getting married in a traditional Hindu ceremony on Sunday and then going to work in a smart Western business suit on Monday, listening to Beyoncé on their phone as they bike their way to sitar lessons. Further, the emergence of new forms of global culture, assisted by the magic of the Internet, has already fostered the sorts of global dialogs and global understandings that can help prevent wars; it’s very hard to demonize a culture which has produced some of your friends, or even just creative expressions you admire. As the younger generations who have grown up as members of a sort of global Internet-enabled youth culture take over the levers of power, perhaps they will become the vanguard of a more peaceful, post-nationalist world.

The debate about economic globalization, meanwhile, has shifted in some surprising ways in recent years. Once a cause associated primarily with the academic left, cosseted in their ivory towers, the anti-globalization impulse has now become a populist movement that has spread across the political spectrum in many developed countries of the West. Even more surprisingly, the populist debate has come to center not on globalization’s effect on the poor nations on the wrong side of the power equation but on those rich nations who would seem to be its clear-cut beneficiaries. In just the last couple of years as of this writing, blue-collar workers who feel bewildered and displaced by the sheer pace of an ever-accelerating narrative of progress in an ever more multicultural world were a driving force behind the Brexit vote in Britain and the election of Donald Trump to the presidency of the United States. The understanding of globalization which drove both events was simplistic and confused — trade deficits are no more always a bad thing for any given country than is a national budget deficit — but the visceral anger behind them was powerful enough to shake the established Western world order more than any event since the World Trade Center attack of 2001. It should become clearer in the next decade or so whether, as I suspect, these movements represent a reactionary last gasp of the older generation before the next, more multicultural and internationalist younger generation takes over, or whether they really do herald a more fundamental shift in geopolitics.

As for the game of Civilization: to attempt to glean much more from its simple trading mechanisms than we already have would be to fall into the same trap that ensnared Alan Emrich. A skeptic of globalization might note that the game is written from the perspective of the developed world, and thus assumes that your civilization is among the privileged ranks for whom globalization on the whole has been — sorry, Brexiters and Trump voters! — a clear benefit. This is true even if the name of the civilization you happen to be playing is the Aztecs or the Zulus, peoples for whom globalization in the real world meant the literal end of their civilizations. As such examples prove, the real world is far more complicated than the game makes it appear. Perhaps the best lesson to take away — from the game as well as from the winners and arguable losers of globalization in our own history — is that it really does behoove a civilization to actively engage with the world. Because if it doesn’t, at some point the world will decide to engage with it.

(Sources: the books Civilization, or Rome on 640K a Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Economics by Paul Samuelson, The Rise and Fall of Ancient Egypt by Toby Wilkinson, Enlightenment Now: The Case for Reason, Science, Humanism, and Progress by Steven Pinker, Global Economic History: A Very Short Introduction by Robert C. Allen, Globalization: A Very Short Introduction by Manfred B. Steger, Taxation: A Very Short Introduction by Stephen Smith, and Guns, Germs, and Steel: The Fates of Human Societies by Jared Diamond.)



The Game of Everything, Part 8: Civilization and Government II (Democracy, Communism, and Anarchy)

Democracy is like a raft. It never sinks, but, damn it, your feet are always in the water.

— Fisher Ames

What can we say about democracy, truly one of the most important ideas in human history? Well, we can say, for starters, that it’s yet another Greek word, a combination of “demos” — meaning the people or, less favorably, the mob — with “kratos,” meaning rule. Rule by the people, rule by the mob… the preferred translations have varied with the opinion of the translator.

The idea of democracy originated, as you might expect given the word’s etymology, in ancient Greece, where Plato detested it, Aristotle was ambivalent about it, and the citizens of Athens were intrigued enough to actually try it out for a while in its purest form: that of a government in which every significant decision is made through a direct vote of the people. Yet on the whole it was regarded as little more than an impractical ideal for many, many centuries, even as some countries, such as England, developed some mechanisms for sharing power between the monarch and elected or appointed representatives of other societal interests. It wasn’t until 1776 that a new country-to-be called the United States declared its intention to make a go of it as a full-blown representative democracy, thereby touching off the modern era of government, in which democracy has increasingly come to be seen as the only truly legitimate form of government in the world.

Like the Christianity that had done so much to lay the groundwork for its acceptance, democracy was a meme with such immediate, obvious mass appeal that it was well-nigh impossible to control once the world had a concrete example of it to look at in the form of the United States. Over the course of the nineteenth century, responding to the demands of their restive populations, remembering soberly what had happened to Louis XVI in France when he had tried to resist the democratic wave, many of the hidebound old monarchies of Europe found ways to democratize in part if not in total; in Britain, for example, about 40 percent of adult males were allowed to vote by 1884. When the drift toward democracy failed to prevent the carnage of World War I, and when that war was followed by a reactionary wave of despotic fascism, many questioned whether democracy was really all it had been cracked up to be. Yet even as the pundits doubted, the slow march of democracy continued; by 1930, almost all adult citizens of Britain, including women, were allowed to vote. By the time the game of Civilization was made near the end of the twentieth century, any doubts about democracy’s ethical supremacy and practical efficacy had been cast aside, at least in the developed West. In missives like Francis Fukuyama’s The End of History, it was once again being full-throatedly hailed as the natural endpoint of the whole history of human governance.

We may not wish to go as far as calling democracy the end of history, but there’s certainly plenty of historical data in its favor. There’s been an undeniable trend line from the end of the eighteenth century to today, in which more and more countries have become more and more democratic. And, equally importantly, over the last century or so virtually all of the most successful countries in terms of per-capita economic performance have been democracies. A few interrelated factors likely explain why this should be the case.

One of them is the reality that as societies and economies develop they inevitably become more and more complex, a confusing mosaic of competing and cooperating interests which seemingly only democracy is equipped to navigate. “Democracies permit participation and therefore feedback,” writes Francis Fukuyama.

Another factor is the way that democracies manage to subsume within them the seemingly competing virtues of stability and renewal. As anyone who’s observed the worldwide stock markets after one of President Donald Trump’s more unhinged tweets can attest, business in particular loves stability and hates the uncertainty that’s born of political change. Yet often change truly is necessary, and often an aged, rigid-thinking despot or monarch is the very last person equipped to push it through. An election every fixed number of years provides a country with the ability to put new blood in power whenever it’s needed, without the chaos of revolution.

The final factor is another reality disliked by despots everywhere: the reality that education and democracy go hand in hand. A successful economy requires an educated workforce, but an educated workforce has a disconcerting tendency to demand a greater role in civic life. Francis Fukuyama:

Economic development demonstrates to the slave the concept of mastery, as he discovers he can master nature through technology, and master himself as well through the discipline of work and education. As societies become better educated, slaves have the opportunity to become more conscious of the fact that they are slaves and would like to be masters, and to absorb the ideas of other slaves who have reflected on their condition of servitude. Education teaches them that they are human beings with dignity, and that they ought to struggle to have that dignity recognized.

When making the game of Civilization, Sid Meier and Bruce Shelley clearly understood the longstanding relationship between a stable democracy and a strong economy — a relationship which is engendered by all of the factors I’ve just described. Switching your government to democracy in the game thus supercharges your civilization’s economic performance, dramatically increasing the number of “trade” units your cities collect.

But the game isn’t always so clear-sighted; the Civilopedia describes democracy as “fragile” in comparison to other forms of government. I would argue that in many ways just the opposite is the case. It’s true that democracies can be incredibly difficult to start in a country with little tradition of same, as the multiple false starts that we’ve seen in places like Russia and much of sub-Saharan Africa will attest. Yet once they’ve taken root they can be extremely difficult if not impossible to dislodge. Having, as we’ve already seen, the means of self-correction baked into them in a way that no other form of government does, mature democracies are surprisingly robust things. In fact, examples of mature, stable democracies falling back into autocracy simply don’t exist in history to date. (The collapsed democracies of places like Venezuela and Sri Lanka, which managed on paper to survive several decades before their downfall, could never be described as mature or stable, having been plagued throughout those decades with constant coup attempts and endemic corruption. Ditto Turkey, which has sadly embraced Putin-style sham democracy in the last few years after almost a century of intermittent crises, including earlier coups or military interventions in civilian government in 1960, 1971, 1980, and 1997.) Of course, we have to be wary of straying into the logical fallacy of simply defining any democracy which collapses as never having been stable to begin with. Still, I think the evidence, at least as of this writing, justifies the claim that a mature, stable democracy has never yet collapsed back into blatant authoritarianism. History would seem to indicate that, if a new democracy can survive thirty or forty years without coups or civil wars — long enough, one might say, for democracy to put down roots and become an inviolate cultural tradition — it can survive for the foreseeable future.

Ironically, Civilization portrays its dubious assertion of democratic “fragility” using methods that actually do feel true to history. The ease with which democracies can fall into unrest means that you must pay much closer attention to public opinion — taking the form of your population’s proportion of “unhappy” to “happy” citizens — than under any other system of government. Any democratic politician in the real world, forced to live and die by periodic opinion polls that take the form of elections, would no doubt sympathize with your plight. It’s particularly difficult in the game to prosecute a foreign war as a democracy, both because sending military units abroad sends your population’s morale into the toilet and because the game forces you to always accept peace overtures from your enemies as a matter of public policy.

In light of this last aspect of the game, the intersection of democracy and war in the real world merits digging into a bit further. Earlier in this series of articles, I wrote about the so-called “Long Peace” in which we’ve been living since the end of World War II, in which the great powers of the world have ceased to fight one another directly even when they find themselves at odds politically, and in which war in general has been on a marked decline in the world. I introduced theories about why that might be, such as the fear of nuclear annihilation and the emergence of global peacekeeping institutions like the United Nations. Well, another strong theory comes down to the advance of democracy. It’s long been an accepted rule among historians that mature, stable democracies simply don’t go to war with one another. Thus, as democracies multiply in the world, the possibilities for war decrease in rhythm, thanks to the incontrovertible logic of statistics. For this reason, some historians prefer to call the Long Peace the “Democratic Peace.”

Civilization reflects this democratic aversion to war through the draconian disadvantages that make its version of democracy, although the best government you can have in peacetime, the absolute worst you can have during war. As demonstrated not least by the United States’s many and varied military interventions since 1945, the game if anything overstates the case for democracy as a force for peace. Yet, as I also noted in that earlier article, the crippling need the United States military now feels to make its wars (covered as they are by legions of journalists and shown every night on television) such clean affairs says much about its citizens’ unwillingness to accept the full, ugly toll of the country’s voluntary “police actions” and “liberations.”

But what of wars that have bigger stakes? Civilization‘s mechanics actually vastly understate the case for democracy here. They fail to account for the fact that, once the people of a democracy have firmly committed themselves to fighting an all-out war, history gives us little reason to believe that they can’t prosecute that war as well as they could under any other form of government. In reality, the strong economies that usually accompany democracies are an immense military advantage; the staggering economic might of the United States is undoubtedly the primary reason the Allied Powers were able to reverse the tide of Nazi Germany and Imperial Japan and win World War II in, all things considered, fairly short order.

There’s one final element of the game of Civilization‘s take on democracy that merits discussion: its complete elimination of corruption. Under other forms of government, the corruption mechanic causes cities other than your capital to lose a portion of their economic potential to this most insidious of social forces, with how much they lose depending on their distance from your capital. You can combat it only by building courthouses in some of your non-capital cities; they’re fairly expensive in both purchase and maintenance costs, but reduce corruption within their sphere of influence. Or you can eliminate all corruption at a stroke by making your civilization a democracy.
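
Mechanically, the rule is simple enough to sketch in a few lines of Python. Only the general shape comes from the description above (losses growing with distance from the capital, reduced by a courthouse, eliminated outright under democracy); the specific scaling, including the courthouse’s one-half reduction, is an assumption made for the sake of the example.

    # Illustrative sketch of the corruption rule described above; the scaling
    # constant and the courthouse's 50% reduction are assumptions, not figures
    # taken from the game.

    def corrupted_trade(trade, distance_from_capital, has_courthouse, government):
        if government == "democracy":
            return 0                                    # democracy: no corruption at all
        loss = trade * distance_from_capital // 20      # worse the farther from the capital
        if has_courthouse:
            loss //= 2                                  # a courthouse blunts the loss
        return min(loss, trade)

    # A distant city without a courthouse, with one, and then under democracy:
    print(corrupted_trade(12, 15, False, "monarchy"))   # 9
    print(corrupted_trade(12, 15, True,  "monarchy"))   # 4
    print(corrupted_trade(12, 15, True,  "democracy"))  # 0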

At first blush, this sounds both hilarious and breathtakingly naive. It would seem to indicate, as Johnny L. Wilson and Alan Emrich note in Civilization: or Rome on 640K a Day, that Meier and Shelley’s research into the history of democracy neglected such icons of its American version as Tammany Hall and Teapot Dome, not to mention Watergate. Yet when we really stop to consider, we find that this seemingly naive mechanic may actually be one of the most historically and sociologically perceptive in the whole game.

If you’ve ever traveled independently in a non-democratic, less-developed country, you’ve likely seen a culture of corruption first-hand. Personal business there is done through wads of cash passed from pocket to pocket, and every good and service tends to have a price that fluctuates from customer to customer, based on a reading of what that particular market will bear. Most obviously from your foreigner’s perspective, there are tourist prices and native prices.

The asymmetries that lead to the rampant “cheating” of foreign customers aren’t hard to understand. You can pay twenty times the going rate for that bottle of soda and never think about it again, while your shopkeeper can use the extra money to put some meat on his family’s table tonight; the money is far more important to him than it is to you because you are rich and he is poor. This reality will probably cause you to give up quibbling about petty (to you) sums in fairly short order. But the mindset behind it is deadly to a country’s economic prospects — not least to its tax base, which could otherwise be used to institute the programs of education and infrastructure that can lead a country out of the cycle of poverty. High levels of corruption are comprehensively devastating to a country’s economy — witness, to take my favorite whipping boy again, Vladimir Putin’s thoroughly corrupt Russia with its economy 7 percent the size of the United States’s — while a relative lack of corruption allows it to thrive.

As it happens, corruption levels across government, business, and personal life in the real world correlate incredibly well with the presence or absence of democracy. When we look at the ten least-corrupt countries in the world according to the Corruption Perceptions Index for 2017, we find that nine of them are among the nineteen countries that are given the gold star of Full Democracy by The Economist‘s latest Democracy Index. (Singapore, the sole exception among the top ten, is classified as a Hybrid Regime.) Meanwhile none of the ten most-corrupt countries qualify as Full or even Flawed Democracies, with seven of the ten classified as full-on authoritarian states. When we further consider that levels of corruption are inversely correlated with a country’s overall economic performance, we add to our emerging picture of just why democracy has accrued so much wealth and power to the developed West since the beginning of the great American experiment back in 1776.

And there may be yet another, more subtle inverse linkage between democracy and corruption. As I noted at the beginning of this pair of articles on Civilization‘s systems of government, I’ve tried to arrange them in an order that reflects the relative stress they place on the individual leader versus the institutions of leadership. Thus the despotic state and the monarchy are so defined by their leaders as to be almost indistinguishable as entities apart from them, while the republic and the democracy mark the emergence of the concept of the state as a sovereign entity unto itself, with its individual leaders mere stewards of a legacy greater than themselves. I don’t believe that this shift in thinking is reflected only in a country’s leadership; it rather extends right through its society. A culture of corruption emphasizes personal, transactional relationships, while its opposite places faith in strong, stable institutions with a lifespan that will hopefully transcend that of the people who staff them at any given time.

So, let’s turn back now to the game’s once-laughable assertion that democracy eliminates corruption, which now seems at least somewhat less laughable. It is, of course, an abstraction at best; a country can no more eliminate corruption than it can eliminate poverty or terrorism (to name a couple of other non-proper nouns on which our politicians like to declare war). Yet a country can sharply de-incentivize it by bringing it to light when it does appear, and by shaming and punishing those who engage in it.

Given the times in which I’m writing this article, I do understand how strange it may sound to argue that Civilization‘s optimistic take on corruption in democracy is at bottom a correct one. Just a couple of years ago in the Full Democracy of Germany, the twelfth least-corrupt country on the planet according to the Corruption Perceptions Index, executives in the biggest of the country’s auto manufacturers were shown to have concocted a despicable scheme to cheat emissions standards worldwide in the name of profit, ignoring the environmental consequences to future generations. And as I write these words the Trump administration in the Flawed Democracy of the United States, sixteenth least-corrupt country on the planet, has so many ongoing scandals that the newspapers literally don’t have enough reporters to cover them all. But the fact that we know about these scandals — that we’re reading about them and arguing about them and in some cases taking to the streets to protest them — is proof that liberal democracy is still working rather than the opposite. Compare the anger and outrage manifested by opponents and defenders alike of Donald Trump with the sullen, defeated acceptance of an oligarchical culture of corruption that’s so prevalent in Russia.

Which isn’t to say that democracy is without its disadvantages. From the moment the idea of rule by the people was first broached in ancient Athens, it’s had fierce critics who have regarded it as inherently dangerous. Setting aside the despots and monarchs who have a vested interest in other philosophies of government, thoughtful criticisms of democracy have almost always boiled down to the single question of whether the great unwashed masses can really be trusted to rule.

Plato was the first of the great democratic skeptics, describing it as the victory of opinion over knowledge. Many of the great figures of American history have ironically taken his point of view to heart, showing considerable ambivalence toward this supposedly greatest of American virtues. The framers of the Constitution twisted themselves into knots over a potential tyranny of the ignorant over the educated, and built into it mechanisms to hopefully prevent such a scenario — mechanisms that still determine the direction of American politics to this day. (The electoral college, which has awarded the presidency twice in the course of the last five elections to someone who didn’t win the popular vote, was among the results of the Founding Fathers’ terror of the masses; in amplifying the votes of the country’s generally less-educated rural areas in recent years, it has arguably had exactly the opposite of its intended effect.) Even the great progressive justice Oliver Wendell Holmes could disparage democracy as merely “what the crowd wants.”

In the cottage industry of American political punditry as well, there’s a long tradition of lamenting the failure of the working class to vote their own self-interest on economic matters, of earnest hand-wringing over the way they supposedly fall prey instead to demagogic appeals to cultural identity and religion. One of the best-selling American nonfiction books of 2011 was The Myth of the Rational Voter, which deployed reams of sociological data to reveal that (gasp!) the ballot-box choices of most people have more to do with emotion, prejudice, and rigid ideology than rationality or enlightened self-interest. Recently, such concerns have been given new urgency among the intellectual elite all over the West by events like the election of Donald Trump in the United States, the Brexit vote in Britain, and the wave of populist political victories and near-victories across Europe — all movements that found the bulk of their support among the less educated, a fact that was lost on said elite not at all.

Back in 1872, the British journalist Walter Bagehot wrote of the dangers of rampant democracy in the midst of another conflicted time in British history, as the voting franchise was being gradually expanded through a series of controversial so-called “Reform Bills.” His writing rings in eerie accord with the similar commentaries from our own time, warning as it does of “the supremacy of ignorance over instruction and of numbers [of voters] over knowledge”:

In plain English, what I fear is that both our political parties will bid for the support of the working man; that both of them will promise to do as he likes if he will only tell them what it is. I can conceive of nothing more corrupting or worse for a set of poor ignorant people than that two combinations of well-taught and rich men should constantly defer to their decision, and compete for the office of executing it. “Vox populi” [“the voice of the people”] will be “Vox diaboli” [“the voice of the devil”] if it is worked in that manner.

Consider again my etymology of the word “democracy” from the beginning of this article. “Demos” in the Greek can variously mean, as I explained, the people or the mob. It’s the latter of these that is instinctively feared, by no means entirely without justification, by democratic skeptics like the ones whose views I’ve just been describing. In The Origins of Totalitarianism, Hannah Arendt defines the People as a constructive force, citizens acting in good faith to improve their country’s society, while the Mob is a destructive force, citizens acting out of hate and fear against rather than for the society from which they feel themselves excluded. We often hear it suggested today that we may have reached the tipping point where the People become a Mob in many places in the West. We hear frequently that the typical Brexit or Trump voter feels so disenfranchised and excluded that she just doesn’t care anymore, that she wants to throw Molotov cocktails into the middle of the elites’ most sacred institutions and watch them burn — that she wants to blow up the entire postwar world order that progressives like me believe have kept us safe and prosperous for all these decades.

I can’t deny that the sentiment exists, sometimes even with good reason; modern democracies all remain to a greater or lesser degree flawed creations in terms of equality, opportunity, and inclusivity. I will, however, offer three counter-arguments to the Mob theory of democracy — one drawing from history, one from practicality, and one from a thing that seems in short supply these days, good old idealistic humanism.

My historical argument is that democracies are often messy, chaotic things, but, once again — and this really can’t be emphasized enough — a mature, stable democracy has never, ever collapsed back into a more retrograde system of government. If it were to happen to a democracy as mature and stable as the United States, as is so often suggested by alarmists in the Age of Trump, it would be one of the more shockingly unprecedented events in all of history. As things stand today, there’s little reason to believe that the institutions of democracy won’t survive President Donald Trump, as they have 44 other good, bad, and indifferent presidents before him. Ditto with respect to many of the other reactionary populist waves in other developed democracies.

My practical argument is the fact that, while democracies sometimes go down spectacularly misguided paths at the behest of their citizenry, they’re also far better at self-correcting than any other form of government. The media in the United States has made much of the people who were able to justify voting for Donald Trump in 2016 after having voted for Barack Obama in 2008 and 2012. It’s become fashionable on this basis to question whether the ebbing of racial animus the latter’s election had seemed to represent was all an illusion. Yet there’s grounds for hope as well as dismay there for the current president’s opponents — grounds for hope in the knowledge that the pendulum can swing back in the other direction just as quickly. The anonymity of the voting booth means that people have the luxury of changing their minds entirely with the flick of a pen, without having to justify their choice to anyone, without losing face or appearing weak. Many an autocratic despot or monarch has doubtless dreamed of the same luxury. This unique self-correcting quality of democracy does much to explain why this form of government that the Civilopedia describes as so “fragile” is actually so amazingly resilient.

Finally, my argument from principle comes from the same idealistic place as those famous opening paragraphs of the American Declaration of Independence (“We hold these truths to be self-evident…”). The Enlightenment philosophy that led to that document said, for the first time in the history of the world, that every man was or ought to be master of his own life. If we believe wholeheartedly in these metaphysical principles, we must believe as well that even a profoundly misguided democracy is superior to Plato’s beloved autocracy — even an autocracy under a “philosopher king” who benevolently makes all the best choices for the good of his country’s benighted citizens. For rule by the people is itself the greatest good, and one which no philosopher king can ever provide. Perhaps the best way to convert a Mob back into a People is to let them have their demagogues. When it doesn’t work out, they can just vote them out again come next election and try something else. What other form of government can make that claim?

Most people in the West during most of the second half of the twentieth century would have agreed that the overarching historical question of their times was whether the world’s future lay with democracy or communism. This was, after all, the question over which the Cold War was being fought (or, if you prefer, not being fought).

For someone studying the period from afar, however, the whole formulation is confusing from the get-go. Democracy has always been seen as a system of government, while communism, in theory anyway, has more to do with economics. In fact, the notion of a “communist democracy,” oxymoronic as it may sound to Western sensibilities, is by no means incompatible with communist theory as articulated by Karl Marx. Plenty of communist states once claimed to be exactly that, such as the German Democratic Republic — better known as East Germany. It’s for this reason that, while people in the West spoke of a Cold War between the supposed political ideologies of communism and democracy, people in the Soviet sphere preferred to talk of a conflict between the economic ideologies of communism and capitalism. And yet accepting the latter’s way of framing the conflict is giving twentieth-century communism far too much credit — as is, needless to say, accepting communism’s claim to have fostered democracies. By the time the Cold War got going in earnest, communism in practice was already a cynical lie.

This divide between communism as it exists in the works of Karl Marx and communism as it has existed in the real world haunts every discussion of the subject. We’ll try to pull theory and practice apart by looking first at Marx’s rosy nineteenth-century vision of a classless society of the future, then turning to the ugly reality of communism in the twentieth century.

One thing that makes communism unique among the systems of government we’ve examined thus far is how very recent it is. While it has roots in Enlightenment thinkers like Henri de Saint-Simon and Charles Fourier, in its complete form it’s thoroughly a product of the Industrial Revolution of the nineteenth century. Observing the world around him, Karl Marx divided society in the new industrial age into two groups. There were the “bourgeoisie,” a French word meaning literally “those who live in a borough,” or more simply “city dwellers”; these people owned the means of industrial production. And then there were the “proletariat,” a Latin word meaning literally “without property”; these people worked the means of production. Casting his eye further back, Marx articulated a view of all of human history as marked by similar dualities of class; during the Middle Ages, for instance, the fundamental divide was between the aristocrats who owned the land which was that era’s wellspring of wealth and the peasants who worked it. “The history of all hitherto existing societies,” he wrote, “is the history of class struggles.” As I mentioned in a previous article, his economic theory of history divided it into six phases: pre-civilized “primitive communism,” the “slave society” (i.e., despotism), feudalism (i.e., monarchy), pure laissez-faire capitalism (the phase the richest and most developed countries were in at the time he wrote), socialism (a mixed economy, not all that different from most of the developed democracies of today), and mature communism. Importantly, Marx believed that the world had to work through these phases in order, each one laying the groundwork for what would follow.

But, falling victim perhaps to a tendency that has dogged many great theorists of history, Marx saw his own times’ capitalist phase as different from all those that had come before in one important respect. Previously, class conflicts had been between the old elite and a new would-be elite that sought to wrest power from them — most recently, the landed gentry versus the new capitalist class of factory owners. But now, with the industrial age in full swing, he believed the next big struggles would be between the bourgeois elites and the proletarian masses as a whole. The proletariat would eventually win those struggles, resulting in a new era of true equality and majority rule. (Here, the eagerness of so many of the later communist states to label themselves democracies starts to become more clear.)

In light of what would follow in the name of Karl Marx, it’s all too easy to overlook the fact that he didn’t see himself as the agent who would bring about this new era; his communism was a description of what would happen in the future rather than a prescription for what should happen. Many of the direct calls to action in 1848’s The Communist Manifesto, by far his most rabble-rousing document, would ironically be universally embraced by the liberal democracies which would become the ideological enemy of communism in the century to come: things such as a progressive income tax, the abolition of child labor, and a basic taxpayer-funded education for everyone. The literary project he considered his most important life’s work, the famously dense three volumes of Capital, is, as the name would indicate, almost entirely concerned with capitalism and its discontents as Marx understood them to already exist, saying almost nothing about the communist future. Written later in his life and thus reflecting a more mature form of his philosophy, Capital shies away from even such calls to action as are found in The Communist Manifesto, saying again and again that the contradictions inherent in capitalism itself will inevitably bring it down when the time is right.

By this point in his life, Marx had become a thoroughgoing historical determinist, and was deeply wary of those who would use his theories to justify premature revolutions of the proletariat. Even The Communist Manifesto’s calls to action had been intended not to force the onset of the last phase of history — communism — but to prod the world toward the penultimate phase of socialism. True communism, Marx believed, was still a long, long way off. Not least because he wrote so many more words about capitalism than he did about communism, Marx’s vision of the latter can be surprisingly vague for what would later become the ostensible blueprint for dozens upon dozens of governments, including those of two of the three biggest nations on the planet.

With this very basic understanding of Marxist theory, we can begin to understand the intellectual rot that lay at the heart of communism as it was actually implemented in the twentieth century. Russia in 1917 hadn’t even made it to Marx’s fourth phase of industrialized capitalism; as an agrarian economy, more feudal than capitalist, it was still mired in the third phase of history. Yet Vladimir Lenin proposed to leapfrog both of the intervening phases and take it straight to communism — something Marx had explicitly stated was not possible. Similarly ignoring Marx’s description of the transition to communism as a popular revolution of the people, Lenin’s approach hearkened back to Plato’s philosopher kings; he stated that he and his secretive cronies represented the only ones qualified to manage the transition. “It is an irony of history,” remarks historian Leslie Holmes dryly, “that parties committed to the eventual emergence of highly egalitarian societies were in many ways among the most elitist in the world.”

When Lenin ordered the cold-blooded murder of Czar Nicholas II and his entire family, he sketched the blueprint of communism’s practical future as little more than amoral despotism hiding behind a facade of Marxist rhetoric. And when capitalist systems all over the world didn’t collapse in the wake of the Russian Revolution, as he had so confidently predicted, there was never a question of saying, “Well, that’s that then!” and moving on. One of the most repressive governments in history was now firmly entrenched, and it wouldn’t give up power easily. “Socialism in One Country” became Josef Stalin’s slogan, as nationalism became an essential component of the new communism, again in direct contradiction to Marx’s theory of a new world order of classless equality. The guns and tanks parading through Red Square every May Day were a yearly affront to everything Marx had written.

Still, communist governments did manage some impressive achievements. Universal free healthcare, still a pipe dream throughout the developed West at the time, was achieved in the new Soviet Union in the 1920s. Right through the end of the Cold War, average life expectancy and infant-mortality rates weren’t notably worse in most communist countries than they were in Western democracies. Their educational systems as well were often competitive with those in the West, if sometimes emphasizing rote learning over critical thinking to a disturbing degree. Illiteracy was virtually nonexistent behind the Iron Curtain, and fluency in multiple languages was at least as commonplace as in Western Europe. Women were not just encouraged but expected to join the workforce, and were given a degree of equality that many of their counterparts in the West could only envy. The first decade or even in some cases several decades after the transition to communism would often bring an economic boom, as women entered the workforce for the first time and aged infrastructures were wrenched toward modernity, arguably at a much faster pace than could have been managed under a government more concerned about the individual rights of its citizens. Under these centrally planned economies, unemployment and the pain it can cause were literally unknown, as was homelessness. In countries where cars were still a luxury reserved for the more equal among the equal, public transport too was often surprisingly modern and effective.

In time, however, economic stagnation inevitably set in. Corruption in the planning departments — the root of the oligarchical system that still holds sway in the Russia of today — caused some industries to be favored over others with no regard to actual needs; the growing complexity of a modernizing economy overwhelmed the planners; a lack of personal incentive led to a paucity of innovation; prices and demand seemed to have no relation to one another, distorting the economy from top to bottom; the quality of consumer goods remained notoriously terrible. By the late 1970s, the Soviet Union, possessed of some of the richest farmland in the world, was struggling and failing just to feed itself, relying on annual imports of millions of tons of wheat and other raw foodstuffs. The very idea of the shambling monstrosity that was the Soviet economy competing with the emerging post-industrial knowledge economies of the West, which placed a premium on the sort of rampant innovation that can only be born of free press, free speech, and free markets, had become laughable. Francis Fukuyama:

The failure of central planning in the final analysis is related to the problem of technological innovation. Scientific inquiry proceeds best in an atmosphere of freedom, where people are permitted to think and communicate freely, and more importantly where they are rewarded for innovation. The Soviet Union and China both promoted scientific inquiry, particularly in “safe” areas of basic or theoretical research, and created material incentives to stimulate innovation in certain sectors like aerospace and weapons design. But modern economies must innovate across the board, not only in hi-tech fields but in more prosaic areas like the marketing of hamburgers and the creation of new types of insurance. While the Soviet state could pamper its nuclear physicists, it didn’t have much left over for the designers of television sets, which exploded with some regularity, or for those who might aspire to market new products to new consumers, a completely non-existent field in the USSR and China.

Marx had dreamed of a world where everyone worked just four hours per day to contribute her share of the necessities of life to the collective, leaving the rest of her time free to pursue hobbies and creative endeavors. Communism in practice did manage to get half of that equation right; few people put in more than four honest hours of labor per day. (As a popular joke said, “they pretend to pay me and I pretend to work.”) But these sad, ugly gray societies hardly encouraged a fulfilling personal life, given that the tools for hobbies were almost impossible to come by and so many forms of creative expression could land you in jail.

If there’s one adjective I associate more than any other with the communist experiments of the twentieth century, it’s “corrupt.” Born of a self-serving corruption of Marx’s already questionable theories, their economies functioned so badly that corruption on low and on high, of forms small and large, was the only way they could muddle through at all. Just as the various national communist parties were vipers’ nests of intrigue and backstabbing in the name of very non-communist personal ambitions, ordinary citizens had to rely on an extensive black market that lived outside the planned economy in order to simply survive.

So, in examining the game of Civilization‘s take on communism, one first has to ask which version of same is being modeled, the idealistic theory or the corrupt reality. It turns out pretty obviously to be the reality of communism as it was actually practiced in the twentieth century. In another of their crazily insightful translations of history to code, Meier and Shelley made communism’s effect on the game’s mechanic of corruption its defining attribute. A communist economy in the game performs up to the same mediocre baseline standard as a monarchy — which is probably being generous, on the whole. Yet it has the one important difference that economy-draining corruption, rather than increasing in cities located further from your capital, is uniform across the entirety of your civilization. While the utility of this is highly debatable in game terms, it’s rather brilliant and kind of hilarious as a reflection of the way that corruption and communism have always been so inseparable from one another — essential to one another, one might even say — in the real world. After all, when your economy runs on corruption, you need to make sure you have it everywhere.
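
To make the mechanical contrast concrete, here’s a minimal toy sketch of the idea in Python. The formula and constants are my own invention for illustration only, not the actual rules of the 1991 game; the point is simply the shape of the difference between distance-based corruption and communism’s flat rate.

```python
# A toy model of Civilization-style corruption. Purely illustrative:
# the formula and constants are invented, not the game's real ones.

def corruption(trade: int, distance_from_capital: int, government: str) -> int:
    """Trade lost to corruption in a single city under a given government."""
    if government == "democracy":
        return 0                                   # assume democracies suffer none
    if government == "communism":
        return trade // 5                          # the same flat rate in every city
    # despotism, monarchy, etc.: the further from the capital, the worse it gets
    return min(trade, trade * distance_from_capital // 20)

# Two cities with identical trade, one near the capital and one on the frontier:
for gov in ("monarchy", "communism"):
    near = corruption(trade=10, distance_from_capital=2, government=gov)
    far = corruption(trade=10, distance_from_capital=18, government=gov)
    print(f"{gov:>9}: near capital loses {near}, frontier city loses {far}")
```

Under the distance-based scheme the frontier city is crippled while the capital barely notices; under communism every city pays the same modest toll, which is exactly the uniformity the game uses to characterize the system.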

For all that history since the original Civilization was made has had plenty of troubling aspects, it hasn’t seen any resurgence of communism; even Russia hasn’t regressed quite that far. The new China, while still ruled by a cabal who label themselves the Communist Party, gives no more than occasional lip service to Chairman Mao, having long since become something new to history: a joining of authoritarianism and capitalism that’s more interested in doing business with the West than fomenting revolutions there, and has been far more successful at it than anyone could have expected, enough to challenge some of the conventional wisdom that democracy is required to manage a truly thriving economy. (I’ll turn back to the situation in China and ask what it might mean in the last article of this series.) Meanwhile the last remaining hard-line communist states are creaky old relics from another era, just waiting to take their place in hipster living rooms between vinyl record albums and lava lamps; a place like North Korea would almost be kitschy if its chubby man-child of a leader wasn’t killing and torturing so many of his own people and threatening the world with nuclear war.

When those last remaining old-school communist regimes finally collapse in one way or another, will that be that for Karl Marx as well? Probably not. There are still Marxists among us, many of whom say that the real, deterministic communist revolution is still ahead of us, who claim that the communism of the twentieth century was all a misguided and tragic false start, an attempt to force upon history what history was not yet ready for. They find grist for their mill in the fact that so many of the most progressive democracies in the world have embraced socialism, providing for their citizens much of what Marx asked for in The Communist Manifesto. If this vanguard has thus reached the fifth phase of history, can the sixth and final be far behind? We shall see. In the meantime, though, liberal democracy continues to provide something communism has never yet been able to: a practical, functional framework for a healthy economy and a healthy government right here and now, in the world in which we actually live.

I couldn’t conclude this survey without saying something about anarchy, Civilization‘s least desirable system of government — or, in this case, system of non-government. You fall into it only as a transitional phase between two other forms of government, or if you let your population under a democracy get too unhappy. Anarchy is, as the Civilopedia says, “a breakdown in government” that brings “panic, disruption, waste, and destruction.” It’s comprehensively devastating to your economy; you want to spend as little time in anarchy as you possibly can. And that, it would seem, is just about all there is to say about it.

Or is it? It’s worth noting that the related word “anarchism” in the context of government has another meaning that isn’t acknowledged by the game, one born from many of the same patterns of thought that spawned Karl Marx’s communism. Anarchism’s version of Marx could be said to be one Pierre-Joseph Proudhon, who in 1840 applied what had hitherto been a pejorative term to a new, positive vision of social organization characterized not by yet another new system of government but by government’s absence. Community norms, working in tandem with the natural human desire to be accepted and respected, could according to the anarchists replace government entirely. By 1905, they had earned themselves an entry in the Encyclopædia Britannica:

[Anarchism is] the name given to a principle or theory of life and conduct under which society is conceived without government — harmony in such a society being obtained, not by submission to law, or by obedience to any authority, but by free agreements, concluded between the various groups, territorial and professional, freely constituted for the sake of production and consumption, as also for the satisfaction of the infinite variety of needs and aspirations of a civilised being.

As a radical ideology advocating a classless society, anarchism has often seemed to walk hand in hand with communism. As an ideology advocating the absolute supremacy of individual freedom, it’s sometimes seemed most at home in right-wing libertarian circles. Yet its proponents insist it to be dramatically different from either of these philosophies, as described by the American anarchist activist and journalist Dwight Macdonald in 1957:

The revolutionary alternative to the status quo today is not collectivised property administered by a “workers’ state,” whatever that means, but some kind of anarchist decentralisation that will break up mass society into small communities where individuals can live together as variegated human beings instead of as impersonal units in the mass sum. The shallowness of the New Deal and the British Labour Party’s postwar regime is shown by their failure to improve any of the important things in people’s lives — the actual relationships on the job, the way they spend their leisure, and child-rearing and sex and art. It is mass living that vitiates all these today, and the State that holds together the status quo. Marxism glorifies “the masses” and endorses the State [the latter is not quite true in terms of Marx’s original theories, as we’ve seen]. Anarchism leads back to the individual and the community, which is “impractical” but necessary — that is to say, it is revolutionary.

As Macdonald tacitly admits, it’s always been difficult to fully grasp how anarchism would work in theory, much less in practice; if you’ve always felt that communism is too practical a political ideology, anarchism is the radical politics for you. Its history has been one of constant defeat — or rather of never even getting started — but it never seems to entirely go away. Like Rousseau’s vision of the “noble savage,” it will always have a certain attraction in a world that only continues to get more complicated, in societies that continue to remove themselves further and further from what feels to some like their natural wellspring. For this reason, we’ll have occasion to revisit some anarchist ideas again in the last article of this series.


 

What, then, should we say in conclusion about Civilization and government? The game has often been criticized for pointing you toward one type of government — democracy — as by far the best for developing your civilization all the way to Alpha Centauri. That bias is certainly present in the game, but it’s hard for me to get as exercised about it as some simply because I’m not at all sure it isn’t also present in history. At least if we define progress in the same terms as Civilization, democracy has proved itself to be more than just an airy-fairy ideal; it’s the most effective means for organizing a society which we’ve yet come up with.

Appeals to principle aside, the most compelling argument for democracy has long been the simple fact that it works, that it’s better than any other form of government at creating prosperous, peaceful countries where, as our old friend Georg Wilhelm Friedrich Hegel would put it, the most people have the most chance to fulfill their individual thymos. Tellingly, many of the most convincing paeans to democracy tend to come in the form of backhanded compliments. “Democracy is the worst form of government,” famously said Winston Churchill, “except for all those other forms that have been tried from time to time.” Or, as the theologian Reinhold Niebuhr wrote, “Man’s inclination to justice makes democracy possible, but man’s capacity for injustice makes it necessary.” Make no mistake: democracy is a messy business. But history tells us that it really does work.

None of this is to say that you should be sanguine about your democracy’s future, assuming you’re lucky enough to live in one. Like videogames, democracy is an interactive medium. Protests and bitter arguments are a sign that it’s working, not the opposite. So, go protest and argue and all the rest, but remember as you do so that this too — whatever this happens to be — shall pass. And, at least if history is any guide, democracy shall live on after it does.

(Sources: the books Civilization, or Rome on 640K a Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, The Republic by Plato, Politics by Aristotle, Plough, Sword, and Book: The Structure of Human History by Ernest Gellner, Aristocracy: A Very Short Introduction by William Doyle, Democracy: A Very Short Introduction by Bernard Crick, Plato: A Very Short Introduction by Julia Annas, Political Philosophy: A Very Short Introduction by David Miller, The Myth of the Rational Voter by Bryan Caplan, Anarchism: A Very Short Introduction by Colin Ward, Communism: A Very Short Introduction by Leslie Holmes, Corruption: A Very Short Introduction by Leslie Holmes, The Communist Manifesto by Karl Marx and Friedrich Engels, Capital by Karl Marx, The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, What’s the Matter with Kansas: How Conservatives Won the Heart of America by Thomas Frank, The Origins of Totalitarianism by Hannah Arendt.)

Footnotes
1 The collapsed democracies of places like Venezuela and Sri Lanka, which managed on paper to survive several decades before their downfall, could never be described as mature or stable, having been plagued throughout those decades with constant coup attempts and endemic corruption. Ditto Turkey, which has sadly embraced Putin-style sham democracy in the last few years after almost a century of intermittent crises, including earlier coups or military interventions in civilian government in 1960, 1971, 1980, and 1997. Of course, we have to be wary of straying into the logical fallacy of simply defining any democracy which collapses as never having been stable to begin with. Still, I think the evidence, at least as of this writing, justifies the claim that a mature, stable democracy has never yet collapsed back into blatant authoritarianism.
 


The Game of Everything, Part 7: Civilization and Government I (Despotism, Monarchy, and the Republic)

Monarchy is like a splendid ship. With all sails set it moves majestically on, but then it hits a rock and sinks forever.

— Fisher Ames

In The Republic, that most famous treatise ever written on the slippery notions of good and bad government, Plato describes what first motivated people to willingly cede some of their personal freedoms to others. He writes that “a state arises, as I conceive, out of the needs of mankind; no one is self-sufficing, but all of us have many wants. Can any other origin of a state be imagined?”

Even in relatively primitive civilizations, the things that need doing outstrip the ability of any single individual to learn how to do them. From this comes specialization, that key marker of civilization. But for specialization to work, a civilization needs a marketplace — a central commons where goods and services can be bought, sold, and traded for. Maintaining such a space, and resolving any disputes that arise in it, requires a central authority. And then, as the fruits of specialization cause a civilization to rise in the world, outsiders inevitably begin to think about taking what it has. Thus a standing army needs to be created. So, already we have the equivalents of a Department of Justice, a Department of Commerce, and a Department of Defense. But now we have another problem: the people staffing all of these bureaucracies, not to mention the soldiers in our army, don’t themselves produce goods and services which they can use to sustain themselves. Thus we now need an Internal Revenue Service of some sort, to collect taxes from the rest of the people — by force, if necessary — so that the bureaucrats and the soldiers have something to live on. And so it continues.

I want to point out a couple of important features of this snippet of the narrative of progress I’ve just outlined. The first is that, of all forms of progress, the growth of government is greeted with the least enthusiasm; for most people, government is the very definition of a necessary evil. At bottom, it becomes necessary because of one incontrovertible fact: that what is best for the individual in a vacuum is almost never what is best for the collective. Government addresses this fundamental imbalance, but in doing so it’s bound to create resentment in the individuals over whom it asserts control. Even if we understand that it’s necessary, even if we agree in principle with the importance of regulating commerce, protecting our country’s borders, even collecting funds to help the young, the old, the sick, and the disadvantaged, how many of us don’t cringe a little when we have to pay the taxman out of our own hard-earned wages? Not for nothing do the people of almost all times and all countries feel a profound ambivalence toward their entire political class, those odd personality types willing to baldly seek power over their peers. Will Durant:

If the average man had had his way there would probably never have been any state. Even today he resents it, classes death with taxes, and yearns for that government which governs least. If he asks for many laws it is only because he is sure that his neighbor needs them; privately he is an unphilosophical anarchist, and thinks laws in his own case superfluous.

The other thing that bears pointing out is that, even though they make up two separate departments in a university, political science and economics are very difficult to pull apart in the real world. Certainly one doesn’t have to be a Marxist to acknowledge that it was commerce that gave rise to government in the first place. In Plough, Sword, and Book, his grand sociological theory of history, Ernest Gellner writes that “property and power are correlative notions. The agricultural revolution gave birth to the exchange and storage of both necessities and wealth, thereby turning power into an unavoidable aspect of social life.” The interconnections between government and economics can be tangled indeed, as in the case of a descriptor like “communism,” technically an economic system but one which, in modern usage at least, presumes much about government as well; a phrase like “communist democracy” rings as an oxymoron to ears brought up in the Western tradition of liberal democracy.

In this light, we can probably forgive the game of Civilization for lumping communism into its list of possible systems of “government” for your civilization, as I hope you’ll be able to forgive me for discussing it in this pair of articles on those systems. (The article that follows this pair will address other aspects of economics in Civilization.) Each of Civilization‘s broadly-drawn governments provides one set of answers to the eternally fraught questions of who should have power in a society, what the limits of that power should be, and whence the power should be derived. As such, they lend themselves better than just about any other aspect of the game to systematic step-by-step analysis. So, that’s what I want to do in this article and the next, looking at each of the six in turn, asking, as has become our standard practice in this series, what we can learn about the real history of government from the game and what we can learn about the game from the real history of government.

That said, there are — also as usual — complications. This is one of the few places where Civilization rather breaks down as a linear narrative of progress. You don’t need to progress through each of the governments one by one in order to be successful at the game. In fact, just the opposite; many players never feel the need to adopt more than a couple of the six governments over the course of their civilization’s millennia of steady progress on other fronts. Likewise, depending on which advances they choose to research when, players of Civilization may see the various governments become available for adoption in any number of different orders and combinations.

And then too, unlike just about every other element of the game, the effectiveness of the governments in Civilization can’t be ranked in ascending order of desirability by the date of their first appearance in actual human history. If that was the case, communism — or possibly, as we shall see, anarchism — would have to be the best government of all in the game, something that most definitely isn’t true. Government just doesn’t work like that, in history or in the game. Democracy, for example, a form of government inextricably associated with modern developed nations and the likes of Francis Fukuyama’s end of history, is actually a very ancient idea. Given this, I’ve hit upon what feels like a logical method of my own for ordering Civilization‘s systems of government here, in a descending ranking based on the power and status each system in its most theoretical or idealized form vests in its leader or leaders. If that sounds confusing right now, please trust that my logic will become clearer as we move through them.

I do need to emphasize that this overview isn’t written even from 30,000 feet, but rather from something more akin to a low orbit over a massively complicated topic. Do remember as you read on that these strokes I’m laying down are — to mix some metaphors — very broad. I’m well aware that our world has contained and does contain countless debatable edge cases and mangy hybrids. That acknowledged, I do believe that setting aside the details of day-to-day politics and returning to first principles of government, as it were, might just be worthwhile from time to time.

Despotism is the most blunt of all political philosophies, one otherwise known as the law-of-the-jungle or the Lord of the Flies approach to governance. It states simply that he who is strong and crafty enough to gain power over all his rivals shall rule exactly as long as he remains strong and crafty enough to maintain it. For whatever that period may be, the despot and the state he rules are effectively one and the same. Regardless of what the despot might say to justify his rule, in the end despotism is might-makes-right distilled to its purest essence.

“Every state begins in compulsion,” writes Will Durant. In the formative stages of any civilization, despotism truly is an inevitability. In a society with no writing, no philosophy, no concept of human rights or social justice, no other form of government could possibly take hold. “Without autocratic rule,” writes the philosopher and sociologist Herbert Spencer, “the evolution of society could not have commenced.”

So, it’s perfectly natural that the game of Civilization as well always begins in despotism. In the early stages of the game especially, absolute power has undeniable advantages. Aristotle considered all of the citizens under a despotic government to be nothing more nor less than slaves, and that understanding is reflected in the way the game lets you form them into military units and march them off to war without having to pay them for the privilege or having to worry about the morale of the folks left behind on the home front. The military advantages despotism offers are thus quite considerable indeed. If your goal in the game is to conquer the world rather than climb the narrative of progress all the way to Alpha Centauri, you can easily win while remaining a despot throughout.

Yet the game does also levy a toll on despotism, one which, depending on your strategy, can become more and more difficult to bear as it continues. Cities under despotism grow slowly if at all, and are terrible at exploiting the resources available to them. If you do want to make it to Alpha Centauri, you’re thus best advised to leave despotism behind just as quickly as you can.

All of which rings true to history. The economy of a society that lives in fear — a society where ideas are things hated and feared by the ruling junta — will almost always badly lag that of a nation with a more sophisticated form of government. In particular, despotism is a terrible system for managing an industrial or post-industrial economy geared toward anything but perpetual war. Political scientist Bernard Crick:

Most autocracies (and military governments) are in agrarian societies. Attempts to industrialize either lead to democratization as power is spread and criticism is needed, or to concentrations of power as if towards totalitarianism but usually resulting in chronic economic and political instability. The true totalitarian regimes were war economies, whether at war or not, rejecting “mere” economic criteria.

Although the economic drawbacks of despotism are modeled, the structure of the game of Civilization doesn’t give it a good way of reflecting perhaps the most crippling of all the disadvantages of despotism in the real world: its inherent instability. A government destined to last only as long as the cult of personality at its center continues to breathe is hardly ideal for a real civilization’s long-term health. But in the game, the notion of a ruler outside yourself exists not at all; you play a sort of vaguely defined immortal who controls your civilization through the course of thousands of years. Unlike a real-world despot, you don’t have to constantly watch your back for rivals, don’t have to abide by the rule of history that he who lives by the sword shall often die by the sword. And you don’t have to worry about the chaos that ensues when a real despot dies — by natural causes or otherwise — and his former underlings throw themselves into a bloody struggle to find out who will replace him.

For the ruthless power-grabbing despot in the real world, who’s in this only to gratify his own megalomaniacal urges, this supposed disadvantage is of limited importance at best. For those who live on after him, though, it’s far more problematic. Indeed, a good test for deciding whether a given country’s government is in fact a despotic regime is to ask yourself whether it’s clear what will happen when the current leader dies of natural causes, is killed, steps down, or is toppled from power. Ask yourself whether, to put it another way, you can imagine the country continuing to be qualitatively the same place after one of those things happens. If the answer to either of those questions is no, the answer to the question of whether the current leader is a despot is very likely yes.

It would be nice if, given the instability of despotism as well as all of the other ethical objections that accompany it, I could write of it as merely a necessary formative stage of government, a stepping stone to better things. But unfortunately, despotism, the oldest form of human government, has stubbornly remained with us down through the millennia. In the twentieth century, it flared up again even in the heart of developed Europe under the flashy new banner of fascism. Thankfully, its inherent weaknesses meant that neither Mussolini’s Italy, Hitler’s Germany, nor Franco’s Spain could survive beyond the deaths of the despots whose names will always be synonymous with them.

And yet despotism still lives on today, albeit often cloaked under a rhetoric of pseudo-legitimacy. Vladimir Putin of Russia, that foremost bogeyman of the modern liberal-democratic West, one of the most stereotypically despotic Bond villains on the current world stage, nevertheless feels the need to hold a sham election every six years. Peek beneath the cloak of democracy in Russia, however, and all the traits of despotism are laid bare. The economic performance of this, the biggest country in the world, is absolutely putrid, to the tune of about 7 percent of the gross national product of the United States, despite being blessed with comparable natural resources. And then there’s the question of what will happen in Russia once Putin’s gone. Tellingly, commentators have been asking that very question with increasing urgency in recent years, as “Putin’s Russia” threatens to become a descriptor of an historical nation unto itself not unlike Hitler’s Germany.

The inventors of monarchy — absolute rule by a single familial lineage rather than absolute rule by a single individual — appear to have been the ancient Egyptians. Its first appearance there, perhaps as early as 3500 BC, marks an early attempt to remedy the most obvious weakness of despotism, the lack of any provision for what happens after any given despot is no more. It’s not hard to grasp where the impulse that led to it came from. After all, it’s natural for any father to want to leave a legacy to his son. Why not the country he happens to rule?

At the same time, though, with monarchy we see the first stirrings of a concern that will become more and more prevalent as we continue with this tour of governments: a concern for the legitimacy of a government, with providing a justification for its existence beyond the age-old dictum of might-makes-right. Electing to go right to the source of all things in the view of most pre-modern societies, monarchs from the time of the Egyptian pharaohs claimed to enjoy the blessing of the gods or God, or in some cases to be gods themselves. In the centuries since, there has been no shortage of other justifications, such as claims to tradition, to a constitution, or just to good old superior genes. But more important than the justifications themselves for our purposes is the fact that they existed, a sign of emerging sophistication in political thought. Depending on how compelling those being ruled over found them to be, they insulated the rulers to a lesser or greater degree from the unmitigated law of the jungle.

In time, the continuity engendered by monarchy in many places allowed not just a ruling family but an extended ruling class to arise, who became known as the aristocracy. Ironically for a word so overwhelmingly associated today with inherited wealth and privilege, “aristocracy” when first coined in ancient Greece had nothing to do with family or inheritance. The aristocracy of a society was simply the best people in that society; the notion of aristocratic rule was thus akin to the modern concept of meritocracy. We can still see some of this etymology in “noble,” a synonym for “aristocrat” which has long been taken to mean, depending on context, either a high-born member of the ruling class or a brave, upright, trustworthy sort of person in general; think of Rousseau’s “noble savages.” (The word “aristocrat” hasn’t been quite so lucky in recent years, bearing with it today a certain connotation of snobbery and out-of-touchness.)

Aristocrats of the old school were one of the bedrocks of the idealized theory of government outlined by Aristotle in Politics; an “aristocracy” ruled by such true aristocrats was according to him one of the best forms of government, although it could easily decay into what he called “oligarchy.” Yet the fact was that ancient Greece and Rome were every bit as obsessed with bloodlines as would be the wider Europe in centuries to come, and it didn’t take long for the ruling classes to assert that the best way to ensure the best people had control of the government was to simply pass power from high-born father to high-born son. Whether under the ancients’ original definition of the word or its more modern usage, writes the historian of aristocracy William Doyle, “the essence of aristocracy is inequality. It rests on the presumption that some people are naturally better than others.” For thousands of years, this presumption was at the core of political and social orders throughout Europe and most of the world.

“Monarchy seems the best balanced government in the game,” note Johnny L. Wilson and Alan Emrich in Civilization: or Rome on 640K a Day. And, indeed, the game of Civilization takes monarchy as its sort of baseline standard government. Your economy is subject to no special advantages or disadvantages, reflecting the fact that historical monarchies have tended to be, generally speaking, somewhat less tyrannical than despotic governments, thanks not least to a diffusion of power among what can become quite a substantial number of aristocratic elites. This is a good thing in economic as well as ethical terms; a population that spends less time cowering in fear has more time to be productive. But it does mean that the player of the game needs to pay more to maintain a military under a monarchical government, a reflection of that same diffusion of power.
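
As a rough illustration of that trade-off, here’s a small sketch with invented numbers rather than the game’s actual support rules: under despotism a city fields a few units for free, while under a monarchy every unit eats into its production.

```python
# An illustrative sketch of military upkeep under different governments.
# The free-support threshold and costs are assumptions for demonstration,
# not the real rules of the game.

from dataclasses import dataclass

@dataclass
class City:
    shields: int            # production generated per turn
    units_supported: int     # military units homed to this city

def shields_after_upkeep(city: City, government: str) -> int:
    """Production left for building things once military upkeep is paid."""
    if government == "despotism":
        free_units = 3                                  # assume a few units cost nothing
        billed = max(0, city.units_supported - free_units)
    else:                                               # monarchy and beyond pay for every unit
        billed = city.units_supported
    return max(0, city.shields - billed)

city = City(shields=8, units_supported=5)
for gov in ("despotism", "monarchy"):
    print(f"{gov}: {shields_after_upkeep(city, gov)} shields left for building")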

Once again, though, Civilization‘s structure makes it unable to portray the most important drawbacks of monarchy from the standpoint of societal development. A country that employs such a system of nepotism taken to the ultimate extreme as its philosophy of governance is virtually guaranteed to wind up in the hands of a terrible leader within a few generations — for, despite the beliefs that informed aristocratic privilege down through all those centuries, there’s little actual reason to believe that such essential traits of leadership as wisdom, judgment, forbearance, and decisiveness are heritable. Indeed, the royal family of many a country may have wound up ironically less qualified for the job of running it than the average citizen, thanks to a habit of inbreeding in the name of maintaining those precious bloodlines, which could sometimes lead to an unusually high incidence of birth defects and mental disorders. The ancient Egyptians made even brother-sister marriages a commonplace affair among the pharaohs, and were rewarded with an unusual number of truly batshit crazy monarchs.

Thus even despotism has an advantage over monarchy in the quest to avoid really, really bad leaders. At least the despot who rises to the top of the heap through strength and wiles has said strength and wiles to rely on as ruler. The histories of monarchies tend to be a series of wild oscillations between boom and bust, all depending on who’s in charge at the time; if Junior is determined to smash up the car, mortgage the house, and invest the family fortune in racehorses after Daddy dies, there’s not much anyone can do about it. Consider the classic example of England during the Renaissance period. The reigns of Elizabeth I and James I yielded great feats of exploration and scientific discovery, major military victories, and the glories of Shakespearean theater. Then along came poor old Charles I, who within 25 years had managed to bankrupt the treasury, spark a civil war, get himself beheaded, and prompt the (short-lived) abolition of the very concept of a King of England. With leadership like that, a country doesn’t need external enemies to eat itself alive.

The need for competent leadership in an ever more complicated world has caused monarchy, even more so than despotism, to fall badly out of fashion in the last century; I tend to think the final straw was the abject failure of the European kings and queens, almost all of whom were related to one another in one way or another, to do anything to stop the slow, tragically pointless march up to World War I. Monarchies where the monarch still wields real power today are confined to special situations, such as the micro-states of Monaco and Liechtenstein, and special regions of the world, such as the Middle East.

In Europe, for all those centuries the heart of the monarchical tradition, a surprising number of countries have elected to retain their royal families as living, breathing national-heritage sites, but they’re kept so far removed from the levers of real political power that the merest hint of a political opinion from one of them can set off a national scandal. I confess that I personally don’t understand this desire to keep a phantom limb of the monarchical past alive, and think the royals can darn well turn in the keys to their taxpayer-funded palaces and go get real jobs like the rest of us. I find the fondness for kings and queens particularly baffling in the case of Scandinavia, a place where equality has otherwise become such a fundamental cultural value. But then, I grew up in a country with no monarchical tradition, and I am told that maintaining the tradition of royalty brings in more tourist dollars than it costs tax dollars in Britain at least. I suppose it’s harmless enough.

The notion of a certain group of people who are inherently better-suited to rule than any others is sadly less rare than non-national-heritage monarchies in the modern world. Still, because almost all remaining believers in such a notion believe the group in question is the one to which they themselves belong, such points of view have an inherent problem gaining traction in the court of public opinion writ large.

Of all the forms of government in Civilization, the republic is the most difficult to get a handle on. If we look the word up in the dictionary, we learn little more for sure about any given republic than that it’s a nation that’s not a monarchy. Beyond that, it can often seem that the republic is in the eye of the beholder. In truth, it’s doubtful whether the republic should be considered a concrete system of government at all in the sense of its companions in Civilization. It’s become one of those words everyone likes to use whose real definition no one seems to know. Astonishingly, more than three-quarters of the sovereign nations in the world today have the word “republic” somewhere in their full official name, a range encompassing everything from the most regressive religious dictatorships to the most progressive liberal democracies. Growing up in American schools, I remember it being drilled into me that “the United States is a republic, not a democracy!” because, rather than deciding on public policy via direct vote, citizens elect representatives to lead the nation. Yet such a distinction is not only historically idiosyncratic, it’s also practically pointless. By this standard, no more than one or two real democracies have ever existed in the history of the world, and none after ancient times. Such a thing would, needless to say, be utterly impossible to administer in this complicated modern world of ours. Anyway, if you really insist on getting pedantic about definitions, you can always use the term “representative democracy.”

I suspect that the game of Civilization‘s inclusion of the republic is most heavily informed by the ancient Roman Republic, which can be crudely described as a form of sharply limited democracy, where ordinary citizens of a certain stature were given a limited ability to check the powers of the aristocracy. Every legionnaire of the Republic had engraved upon his shield the motto “Senatus Populusque Romanus”: “The Senate and the People of Rome.” Aristocrats made up the vast majority of the former institution, with the plebeians, or common people, largely relegated to a subsidiary role in the so-called plebeian tribunes. The vestiges of such a philosophy of government survive to this day in places like the British Parliament, with its split between a House of Lords — which has become all but powerless in this modern age of democracy, little more than yet another living national-heritage site — and a House of Commons.

In that same historical spirit, the republic in the game functions as a sort of halfway point between monarchy and democracy, subject to a milder form of the same advantages and disadvantages as the latter — advantages and disadvantages which I think are best discussed when we get to democracy itself.

I don’t want to move on from the republic quite yet, however, because when we trace these roots back to ancient times we do find something else there of huge conceptual importance. Namely, we find the idea of the state as an ongoing entity unto itself, divorced from the person or family that happens to rule it. This may seem a subtle distinction, but it really is an important one.

The despot or the monarch for all intents and purposes is the state. Thus the royal we you may remember from Shakespeare; “Our sometime sister, now our queen,” says Claudius in Hamlet in reference to his wife, who is also the people’s queen. The ruler and the state he rules truly are one. But with the arrival of the republic on the scene, the individual ruler is separate from and — dare I say it? — of less overall significance than the institution of the state itself. He becomes a mere caretaker of a much greater national legacy. This is important not least because it opens the door to a system of government that prioritizes the good of the state over that of its leaders, that emphasizes the ultimate sovereignty of the state as a whole, in the form of all of the citizens that make it up. It opens the door to, as John Adams famously said after the drafting of the American Constitution, “a government of laws and not men.” In other words, it makes the philosophy of government safe for democracy.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, The End of History and the Last Man by Francis Fukuyama, The Rise and Fall of Ancient Egypt by Toby Wilkinson, The Republic by Plato, Politics by Aristotle, Plough, Sword, and Book: The Structure of Human History by Ernest Gellner, Aristocracy: A Very Short Introduction by William Doyle, Democracy: A Very Short Introduction by Bernard Crick, Plato: A Very Short Introduction by Julia Annas, Political Philosophy: A Very Short Introduction by David Miller, and Hamlet by William Shakespeare.)

 
 


The Game of Everything, Part 6: Civilization and Religion

Science without religion is lame, religion without science is blind.

— Albert Einstein

If you ever feel like listening to two people talking past one another, put a strident atheist and a committed theist in a room together and ask them to discuss The God Question. The strident atheist — who, as a colleague of the psychologist and philosopher of religion William James once put it, “believes in No-God and worships Him” — will trot out a long series of supremely obvious, supremely tedious Objective Truths. He’ll talk about evolution, about the “God of the gaps” theory of religion as a mere placeholder for all the things we don’t yet understand, about background radiation from the Big Bang, about the age-old dilemma of how a righteous God could allow all of the evil and suffering which plague our world. He’ll talk and talk and talk, all the while presuming that the theist couldn’t possibly be intelligent enough to have considered any of these things for herself, and that once she’s been exposed to them at last her God delusion will vanish in a puff of incontrovertible logic. The theist, for her part, is much less equipped to argue in this fashion, but she does her best, trying to explain using the crude tool of words her ineffable experiences that transcend language. But her atheist friend, alas, has no time, patience, or possibly capability for transcendence.

My own intention today certainly isn’t to convince you of the existence or non-existence of God. Being a happy agnostic —  one of what the Catholic historian Hugh Ross Williamson called “the wishy-washy boneless mediocrities who flap around in the middle” — I make a poor advocate for either side of the debate.  But I will say that, while I feel a little sorry for those people who have made themselves slaves to religious dogma and thereby all but lost the capacity to reason in many areas of their lives, I also feel for those who have lost or purged the capacity to get beyond logic and quantities and experience the transcendent.

“One must have musical ears to know the value of a symphony,” writes William James. “One must have been in love one’s self to understand a lover’s state of mind. Lacking the heart or ear, we cannot interpret the musician or the lover justly, and are even likely to consider him weak-minded or absurd.” Richard Dawkins, one of the more tedious of our present-day believers in No-God, spends the better part of a chapter in his book The God Delusion twisting himself into knots over the Einstein quote that opens this article, trying to logically square the belief of the most important scientist of the twentieth century in the universe’s ineffability with the same figure’s claim not to believe in a “personal God.” Like a cat chasing a laser pointer, Dawkins keeps running around trying to pin down that which refuses to be captured. He might be happier if he could learn just to let the mystery be.

In a sense, a game which hopes to capture the more transcendent aspects of life runs into the same barriers as the unadulteratedly rational person hoping to understand them. Many commenters, myself included, have criticized games over the years for a certain thematic niggardliness, a refusal to look beyond the physics of tanks and trains and trebuchets and engage with the concerns of higher art. We’ve tended to lay this failure at the feet of a design culture that too often celebrates immaturity, but that may not be entirely fair. Computers are at bottom calculating machines, meaning they’re well-suited to simulating easily quantifiable physical realities. But how do you quantify love, beauty, or religious experience? It can be dangerous even to try. At worst — and possibly at best as well — you can wind up demeaning the very ineffabilities you wished to celebrate.

Civilization as well falls victim to this possibly irreconcilable dilemma. In creating their game of everything, their earnest attempt to capture the entirety of the long drama of human civilization, Sid Meier and Bruce Shelley could hardly afford to leave out religion, one of said drama’s prime driving forces. Indeed, they endeavored to give it more than a token part, including the pivotal advances of Ceremonial Burial, Mysticism, and Religion — the last really a stand-in for Christianity, giving you the opportunity to build “cathedrals” — along with such religious Wonders of the World as the Oracle of Delphi, the Sistine Chapel, and the church music of Johann Sebastian Bach. Yet it seems that they didn’t know quite what to do with these things in mechanical, quantifiable, computable terms.

The Civilopedia thus tells us that religion was important in history only because “it brought peace of mind and the ability to get on with the work of life.” In that spirit, all of the advances and Wonders dealing with religion serve in one way or another to decrease the unhappiness level of your cities — a level which, if it gets too high, can throw a city and possibly even your entire civilization into revolt. “The role of religion in Sid Meier’s Civilization,” note Johnny L. Wilson and Alan Emrich in Civilization: or Rome on 640K a Day, “is basically the cynical role of pacifying the masses rather than serving as an agent for progress.” This didn’t sit terribly well with Wilson in particular, who happened to be an ordained Baptist minister. Nor could it have done so with Sid Meier, himself a lifelong believer. But, really, what else were they to do with religion in the context of a numbers-oriented strategy game?
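
Mechanically, what they did do is easy enough to sketch. Here’s a minimal illustration in Python of the shape the mechanic takes; the specific relief values and names are placeholders of my own rather than the game’s actual numbers, but the principle is the same: every religious improvement or Wonder simply subtracts from a city’s count of unhappy citizens.

```python
# Illustrative sketch only: the relief values below are placeholders, not the
# game's real numbers. The point is the shape of the mechanic, in which
# religion exists purely to subtract unhappiness.

UNHAPPINESS_RELIEF = {
    "Temple": 1,                  # strengthened further by Mysticism in the real game
    "Cathedral": 3,
    "Oracle": 2,                  # Wonder
    "J.S. Bach's Cathedral": 2,   # Wonder
}

def unhappy_citizens(base_unhappy, improvements):
    """Return how many citizens remain unhappy once religious buildings weigh in."""
    relief = sum(UNHAPPINESS_RELIEF.get(name, 0) for name in improvements)
    return max(0, base_unhappy - relief)

# A restive city with four unhappy citizens is fully pacified by a Temple
# plus a Cathedral; nothing else about its character changes.
print(unhappy_citizens(4, ["Temple", "Cathedral"]))  # 0
```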

I don’t have an answer to that last question, but I do feel compelled to make the argument the game fails to make, to offer a defense of religion — and particularly, what with Civilization being a Western game with a Western historical orientation, Christianity — as a true agent of progress rather than a mere palliative. In these times of ours, when science and religion seem to be at war and the latter is all too frequently read as the greatest impediment to our continued progress, such a defense is perhaps more needed than ever.

Richard Dawkins smugly pats himself on the back for his fair-mindedness when, asked if he really considers religion to be the root of all evil in the world, he replies that no, “religion is not the root of all evil, for no one thing is the root of everything.” And yet, he tells us:

Imagine, with John Lennon, a world with no religion. Imagine no suicide bombers, no 9/11, no 7/7, no Crusades, no witch hunts, no Gunpowder Plot, no Indian partition, no Israeli/Palestinian Wars, no Serb/Croat/Muslim massacres, no persecution of Jews as “Christ-killers,” no Northern Ireland “troubles,” no “honour killings,” no shiny-suited bouffant-haired televangelists fleecing gullible people of their money (“God wants you to give till it hurts”). Imagine no Taliban to blow up ancient statues, no public beheadings of blasphemers, no flogging of female skin for the crime of showing an inch of it.

Fair points all; the record of religious — and not least Christian — atrocities is well-established. In the interest of complete fairness, however, let’s also acknowledge that but for religion those ancient statues whose destruction at the hands of the Taliban Dawkins so rightfully decries, not to mention his Jews being persecuted by Christians, would never have existed in the first place. Scholar of early Christianity Bart D. Ehrman — who, in case it matters, is himself today a reluctant non-believer — describes a small subset of the other things the world would lack if Christianity alone had never come to be:

The ancient triumph of Christianity proved to be the single greatest cultural transformation our world has ever seen. Without it the entire history of Late Antiquity would not have happened as it did. We would never have had the Middle Ages, the Reformation, the Renaissance, or modernity as we know it. There could never have been a Matthew Arnold. Or any of the Victorian poets. Or any of the other authors of our canon: no Milton, no Shakespeare, no Chaucer. We would have had none of our revered artists: no Michelangelo, Leonardo da Vinci, or Rembrandt. And none of our brilliant composers: no Mozart, Handel, or Bach.

One could say that such an elaborate counterfactual sounds more impressive than it really is; the proverbial butterfly flapping its wings somewhere in antiquity could presumably also have deprived us of all those things. Yet I think Ehrman’s deeper point is that all of the things and people he mentions, along with the modern world order and even the narrative of progress that has done so much to shape it, are at heart deeply Christian, whether they express any beliefs about God or not. I realize that’s an audacious statement to make, so let me try to unpack it as carefully as possible.

In earlier articles, I’ve danced around the idea of the narrative of progress as a prescriptive ethical framework — a statement of the way things ought to be — rather than a descriptive explication of the way they actually are. Let me try to make that idea clearer now by turning to one of the most important documents to emerge from the Enlightenment, the era that spawned the narrative of progress: the American Declaration of Independence.

We don’t need to read any further than the beginning of the second paragraph to find what we’re looking for: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the pursuit of Happiness.” No other sentence I’ve ever read foregrounds the metaphysical aspect of progress quite like this one, the most famous sentence in the Declaration, possibly the most famous in all of American history. It’s a sentence that still gives me goosebumps every time I read it, thanks not least to that first clause: “We hold these truths to be self-evident.” This might just be the most sweeping logical hand-wave I’ve ever seen. Nowhere in 1776 were any of these “truths” about human equality “self-evident.” The very notion that a functioning society could ever be founded on the principle of equality among people was no more than a century old. Over the course of that century, philosophers such as John Locke and Jean-Jacques Rousseau had expended thousands of pages in justifying what Thomas Jefferson was now dismissing as a given not worthy of discussion. With all due acknowledgment of the scourge of slavery and the oppression of women and all the other imperfections of the young United States to come, the example that a society could indeed be built around equal toleration and respect for everyone was one of the most inspiring the world has ever known — and one that had very little to do with strict rationality.

Even today, there is absolutely no scientific basis to a claim that all people are equal. Science clearly tells us just the opposite: that some people are smarter, stronger, and healthier than other people. Still, the modern progressive ideal, allegedly so rational, continues to take as one of its most important principles Jefferson’s leap of faith. Nor does Jefferson’s extra-rational idealism stand alone. Consider that one version of the narrative of progress, the one bound up with Georg Wilhelm Friedrich Hegel’s eschatology of an end to which all of history is leading, tells us that that end will be achieved when all people are allowed to work in their individual thymos-fulfilling roles, as they ought to be. But “ought to,” of course, has little relevance in logic or science.

If the progressive impulse cannot be ascribed to pure rationality, we have to ask ourselves where Jefferson’s noble hand-wave came from. In a move that will surprise none of you who’ve read this far, I’d like to propose that the seeds of progressivism lie in the earliest days of the upstart religion of Christianity.

“The past is a foreign country,” wrote L.P. Hartley in The Go-Between. “They do things differently there.” And few pasts are quite so foreign to us as the time before Jesus Christ was (actually or mythically) born. Bart D. Ehrman characterizes pre-Christian Mediterranean civilization as a culture of “dominance”:

In a culture of dominance, those with power are expected to assert their will over those who are weaker. Rulers are to dominate their subjects, patrons their clients, masters their slaves, men their women. This ideology was not merely a cynical grab for power or a conscious mode of oppression. It was the commonsense, millennia-old view that virtually everyone accepted and shared, including the weak and marginalized.

This ideology affected both social relations and government policy. It made slavery a virtually unquestioned institution promoting the good of society; it made the male head of the household a sovereign despot over all those under him; it made wars of conquest, and the slaughter they entailed, natural and sensible for the well-being of the valued part of the human race (that is, those invested with power).

With such an ideology one would not expect to find governmental welfare programs to assist weaker members of society: the poor, homeless, hungry, or oppressed. One would not expect to find hospitals to assist the sick, injured, or dying. One would not expect to find private institutions of charity designed to help those in need.

There’s a telling scene late in The Iliad which speaks volumes about the ancient system of ethics, and how different it was from our own. Achilles is about to inflict the killing blow on a young Trojan warrior, who begs desperately for his life. Achilles’s reply follows:

“Come, friend, you too must die. Why moan about it so?
Even Patroclus died, a far, far better man than you.
And look, you see how handsome and powerful I am?
The son of a great man, the mother who gave me life
a deathless goddess. But even for me, I tell you,
death and the strong force of fate are waiting.
There will come a dawn or sunset or high noon
when a man will take my life in battle too —
flinging a spear perhaps
or whipping a deadly arrow off his bow.”

Life and death in Homer are matters of fate, not of morality. Mercy is neither given nor expected by his heroes.

Of course, the ancients had gods — plenty of them, in fact. A belief in a spiritual realm of the supernatural is far, far older than human civilization, dating back to the primitive animism of the earliest hunter-gatherers. By the time Homer first chanted the passage above, the pantheon of Greek gods familiar to every schoolchild today had been around for many centuries. Yet these gods, unsurprisingly, reflected the culture of dominance, so unspeakably brutal to our sensibilities, that we see in The Iliad, a poem explicitly chanted in homage to them.

The way these gods were worshiped was different enough from what we think of as religion today to raise the question of whether the word even applies to ancient sensibilities. Many ancient cultures seem to have had no concept or expectation of an afterlife (thus rather putting the lie to one argument frequently trotted out by atheists, that the entirety of the God Impulse can be explained by the very natural human dread of death). The ancient Romans carved the phrase “non fui; fui; non sum; non curo” on gravestones, which translates to “I was not; I was; I am not; I care not.” It’s a long way from “at rest with God.”

Another, even more important difference was the non-exclusivity of the ancient gods. Ancient “religion” was not so much a creed or even a collection of creeds as it was a buffet of gods from which one could mix and match as one would. When one civilization encountered another, it was common for each to assimilate the gods of the other, creating a sort of divine mash-up. Sumerian gods blended with the Babylonian, who blended with the Greek, who were given Latin names and assimilated by the Romans… there was truly a god for every taste and for every need. If you were unlucky in love, you might want to curry favor with Aphrodite; if the crops needed rain, perhaps you should sacrifice to Demeter; etc., etc. The notion of converting to a religion, much less that of being “born again” or the like, would have been greeted by the ancients with complete befuddlement.[1]

And then into this milieu came Jesus Christ, only to be promptly, as Douglas Adams once put it, “nailed to a tree for saying how great it would be to be nice to people for a change.” It’s very difficult to adequately convey just how revolutionary Christianity, a religion based on love and compassion rather than dominance, really was. I defer one more time to Bart D. Ehrman:

Leaders of the Christian church preached and urged an ethic of love and service. One person was not more important than another. All were on the same footing before God: the master was no more significant than the slave, the husband than the wife, the powerful than the weak, or the robust than the diseased.

The very idea that society should serve the poor, the sick, and the marginalized became a distinctively Christian concern. Without the conquest of Christianity, we may well never have had institutionalized welfare for the poor or organized healthcare for the sick.  Billions of people may never have embraced the idea that society should serve the marginalized or be concerned with the well-being of the needy, values that most of us in the West have simply assumed are “human” values.

Christianity carried within it as well a notion of freedom of choice that would be critical to the development of liberal democracy. Unlike the other belief systems of the ancient world, which painted people as hapless playthings of their gods, Christianity demanded that you choose whether to follow Christ’s teachings and thus be saved; you held the fate of your own soul in your own hands. If ordinary people have agency over their souls, why not agency over their governments?

But that was the distant future. For the people of the ancient world, Christianity’s tenet that they — regardless of who they were — were worthy of receiving the same love and compassion they were urged to bestow upon others had an immense, obvious appeal. Hegel, a philosopher of ambiguous personal spiritual beliefs who saw religions as intellectual memes arising out of the practical needs of the people who created them, would later describe Christianity as the perfect slave religion, providing the slaves who made up the bulk of its adherents during the early years with the tool of their own eventual liberation.

And so, over the course of almost 300 years, Christianity gradually bubbled up from the most wretched and scorned members of society to finally reach the Roman Emperor Constantine, the most powerful man in the world, in his luxurious palace. The raw numbers accompanying its growth are themselves amazing. At the time of Jesus Christ’s death, the entirety of the Christian religion consisted of his 20 or so immediate disciples. By the time Constantine converted in AD 312, there were about 3 million Christians in the world, despite persecution by the same monarch’s predecessors. In the wake of Constantine’s official sanction, Christianity grew to as many as 35 million disciples by AD 400. And today roughly one-third of the world’s population — almost 2.5 billion people — call themselves Christians of one kind or another. For meme theorists, Christianity provides perhaps the ultimate example of an idea that was so immensely appealing on its own merits that it became literally unstoppable. And for political historians, its takeover of the Roman Empire provides perhaps the first example in history of a class revolution, a demonstration of the power of the masses to shake the palaces of the elites.
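
It’s worth pausing over what those raw numbers actually imply. A quick back-of-the-envelope calculation (the date ranges are my own rough assumptions rather than Ehrman’s) shows that no mass stampede of conversions is required to explain them; a modest but relentless rate of compound growth, sustained over centuries, is enough.

```python
# A back-of-the-envelope check of the growth rates implied by the membership
# figures cited above. The exact date ranges are rough assumptions of my own.

def annual_growth_rate(start, end, years):
    """Constant yearly growth rate that carries `start` to `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# From roughly 20 disciples around AD 30 to about 3 million Christians by AD 312...
early = annual_growth_rate(20, 3_000_000, 282)
# ...and from 3 million to perhaps 35 million by AD 400, after Constantine.
late = annual_growth_rate(3_000_000, 35_000_000, 88)

print(f"c. AD 30-312: about {early:.1%} per year")    # about 4.3% per year
print(f"c. AD 312-400: about {late:.1%} per year")    # about 2.8% per year
```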

Which is not to say that everything about the Christian epoch would prove better than what had come before it. “There is no need of force and injury because religion cannot be forced,” wrote the Christian scholar Lactantius hopefully around AD 300. “It is a matter that must be managed by words rather than blows, so that it may be voluntary.” Plenty would conspicuously fail to take his words to heart, beginning just 35 years later with the appropriately named Firmicus, an advisor with the newly Christianized government of Rome, who told his liege that “your severity should be visited in every way on the crime of idolatry.” The annals of the history that followed are bursting at the seams with petty tyrants, from Medieval warlords using the cross on their shields to justify their blood lust to modern-day politicians of the Moral Majority railing against “those sorts of people,” who have adopted the iconography of Christianity whilst missing the real point entirely. This aspect of Christianity’s history cannot and should not be ignored.

That said, I don’t want to belabor too much more today Christianity’s long history as a force for both good and ill. I’ll just note that the Protestant Reformation of the sixteenth and seventeenth centuries, which led to one of the bloodiest wars in human history prior to World War I, also brought with it a new vitality to the religion, doing much to spark that extraordinary acceleration in the narrative of progress which began in the eighteenth century. I remember discussing the narrative of progress with a conservative Catholic acquaintance of mine who’s skeptical of the whole notion’s spiritual utility. “That’s a very Protestant idea,” he said about it, a little dismissively. “Yeah,” I had to agree after some thought. “I guess you’re right.” Protestantism is still linked in the popular imagination with practical progress; the phrase “Protestant work ethic” still crops up again and again, and studies continue to show that large-scale conversions to Protestantism are usually accompanied — for whatever reason — by increases in a society’s productivity and a decline in criminality.

One could even argue that the Declaration of Independence and the United States of America, that “shining city on a hill” inspiring the rest of the world, grew out of the marriage of the ethos of love, compassion, equality, and personal agency that had been lurking within Christianity from the beginning with this new Protestant spirit of practical, worldly achievement in the old ethos’s service. (The parallels between this worldly symbol of hope and the Christian Heaven are, I trust, so obvious as to not be worth going into here.) It took the world almost 2000 years to make the retrospectively obvious leap from the idea that all people are equal before God to the notion that all people are equal, period. Indeed, in many ways we still haven’t quite gotten there, even in our most “civilized” countries. Nevertheless, it’s hard to imagine the second leap being made absent the first; the seeds of the Declaration of Independence were planted in the New Testament of the Christian Bible.

Of course, counterfactuals will always have their appeal. If, as many a secular humanist has argued over the years, equality and mutual respect really are just a rationally better way to order a society, it’s certainly possible we would have gotten as far as we have today by some other route — possibly even have gotten farther, if we had been spared some of the less useful baggage which comes attached to Christianity. In the end, however, we have only one version of history which we can truly judge: the one that actually took place. So, credit where it’s due.

Said credit hasn’t always been forthcoming. In light of the less inspiring aspects of Christianity’s history, there’s a marked tendency in some circles to condemn its faults without acknowledging its historical virtues, often accompanied by a romanticizing of the pre-Christian era. By the time Constantine converted to Christianity in AD 312, thus transforming it at a stroke from an upstart populist movement to the status it still holds today as the dominant religion of the Western world, the Roman Empire was getting long in the tooth indeed, and the thousand years of regress and stagnation that would come to be called the Dark Ages were looming. Given the timing, it’s all too easy for historians of certain schools to blame Christianity for what followed. Meanwhile many a libertine professor of art or literature has talked of the ancients’ comfort with matters of the body and sexuality, contrasting it favorably with the longstanding Christian discomfort with same.

But our foremost eulogizer of the ancient ways remains the preeminent critic of the narrative of progress in general, Friedrich Nietzsche. His homage to might-makes-right as superior to Christian compassion carries with it an unpleasant whiff of the Nazi ideology that would burst into prominence thirty years after his death:

The sick are the greatest danger for the well. The weaker, not the stronger, are the strong’s undoing. It is not fear of our fellow man which we should wish to see diminished; for fear rouses those who are strong to become terrible in turn themselves, and preserves the hard-earned and successful type of humanity. What is to be dreaded by us more than any other doom is not fear but rather the great disgust, not fear but rather the great pity — disgust and pity for our human fellows.

The morbid are our greatest peril — not the “bad” men, not the predatory beings. Those born wrong, the miscarried, the broken — they it is, the weakest who are undermining the vitality of the race, poisoning our trust in life, and putting humanity in question. Every look of them is a sigh — “Would I were something other! I am sick and tired of what I am.” In this swamp soil of self-contempt, every poisonous weed flourishes, and all so small, so secret, so dishonest, and so sweetly rotten. Here swarm the worms of sensitiveness and resentment, here the air smells odious with secrecy, with what is not to be acknowledged; here is woven endlessly the net of the meanest conspiracies, the conspiracy of those who suffer against those who succeed and are victorious; here the very aspect of the victorious is hated — as if health, success, strength, pride, and the sense of power were in themselves things vicious, for which one ought eventually to make bitter expiation. Oh, how these people would themselves like to inflict expiation, how they thirst to be the hangmen! And all the while their duplicity never confesses their hatred to be hatred.

To be sure, there were good things about the ancient ways. When spiritual beliefs are a buffet, there’s little point in fighting holy wars; while the ancients fought frequently and violently over many things large and small, they generally didn’t go to war over their gods. Even governmental suppression of religious faith, which forms such an important part of the early legends of Christianity, was apparently suffered by few other groups of believers.[2] Still, it’s hard to believe that very many of our post-Christ romanticizers of the ancient ways would really choose to go back there if push came to shove — least of all among them Nietzsche, a sickly, physically weak man who suffered several major mental breakdowns over the course of his life. He wouldn’t have lasted a day among his beloved Bronze Age Greeks; ironically, it was only the fruits of the progress he so decried that allowed him to fulfill his own form of thymos.

At any rate, I hope I’ve made a reasonable case for Christianity as a set of ideas that have done the world much good, perhaps even enough to outweigh the atrocities committed in the religion’s name. At this juncture, I do want to emphasize again that one’s opinion of Christian values need not have any connection with one’s belief in the veracity of the Christian God. For my part, I try my deeply imperfect best to live by those core values of love, compassion, and equality, but I have absolutely no sense of an anthropomorphic God looking down from somewhere above, much less a desire to pray to Him.

It even strikes me as reasonable to argue that the God part of Christianity has outlived His necessity; one might say that the political philosophy of secular humanism is little more than Christianity where faith in God is replaced with faith in human rationality. Certainly the world today is more secular than it’s ever been, even as it’s also more peaceful and prosperous than it’s ever been. A substantial portion of those 2.5 billion nominal Christians pay lip service but little else to the religion; I think about the people all across Europe who still let a small part of their taxes go to their country’s official church out of some vague sense of patriotic obligation, despite never actually darkening any physical church’s doors.

Our modern world’s peace and prosperity would seem to be a powerful argument for secularism. Yet a question is still frequently raised: does a society lose something important when it loses the God part of Christianity — or for that matter the God part of any other religion — even if it retains most of the core values? Some, such as our atheist friend Richard Dawkins, treat the very notion of religiosity as social capital with contempt, another version of the same old bread-and-circuses coddling of the masses, keeping them peaceful and malleable by telling them that another, better life lies in wait after they die, thus causing them to forgo opportunities for bettering their lots in this life. But, as happens with disconcerting regularity, Dawkins’s argument here is an oversimplification. As we’ve seen already, a belief in an afterlife isn’t a necessary component of spiritual belief (although, as the example of Christianity proves, it certainly can’t hurt a religion’s popularity). It’s more interesting to address the question not through the micro lens of what is good for an individual or even a collection of individuals in society, but rather through the macro lens of what is good for society as an entity unto itself.

And it turns out that there are plenty of people, many of them not believers themselves, who express concern over what else a country loses as it loses its religion. The most immediately obvious of the problematic outcomes is a declining birth rate. The well-known pension crisis in Europe, caused by the failure of populations there to replace themselves, correlates with the fact that Europe is by far the most secular place in the world. More abstractly but perhaps even more importantly, the decline in organized religion in Europe and in North America has contributed strongly to the erosion of the communal commons. There was a time not that long ago when the local church was the center of a community’s social life, not just a place of worship but one of marriages, funerals, potlucks, swap meets, dances, celebrations, and fairs, a place where people from all walks of life came together to flirt, to socialize, to hash out policy, to deal with crises, and to help those less fortunate. Our communities have grown more diffuse with the decline of religion, on both a local and a national scale.

Concern about the loss of religion as a binding social force, balanced against a competing and equally valid concern for the plight of those who choose not to participate in the majority religion, has provoked much commentary in recent decades. We live more and more isolated lives, goes the argument, cut off from our peers, existing in a bubble of multimedia fantasy and empty consumerism, working only to satisfy ourselves. Already in 1995, before the full effect of the World Wide Web and other new communications technologies had been felt, the political scientist Robert D. Putnam created a stir in the United States with his article “Bowling Alone: America’s Declining Social Capital,” which postulated that civic participation of the sort that had often been facilitated by churches was on a worrisome decline. For many critics of progress, the alleged isolating effect of technology has only made the decline more worrisome in more recent years. In Denmark, the country where I live now — and a country which is among the most secular even in secular Europe — newly arrived immigrants have sometimes commented to me about the isolating effect of even the comprehensive government-administered secular safety net: how elderly people who would once have been taken care of by their families get shunted off to publicly-funded nursing homes instead, how children can cut ties with their families as soon as they reach university age thanks to a generous program of student stipends.

The state of Christianity in many countries today, as more of a default or vestigial religion than a vital driving faith, is often contrasted unfavorably with that of Islam, its monotheistic younger brother which still trails it somewhat in absolute numbers of believers but seems to attract far more passion and devotion from those it has. Mixed with reluctant admiration of Islam’s vitality is fear of the intolerance it supposedly breeds and the acts of terrorism carried out in its name. I’ve had nothing to say about Islam thus far — in my defense, neither has the game of Civilization — and the end of this long article isn’t the best place to start analyzing it. I will note, however, that the history of Islam, like that of Christianity, has encompassed both inspirational achievements and horrible atrocities. Rather than Islam itself being incompatible with liberal democracy, there seems to be something about conditions in the notorious cauldron of conflict that is the Middle East — perhaps the distortions produced by immense wealth sitting there just underground in the form of oil and the persistent Western meddling that oil has attracted — which has repeatedly stunted those countries’ political and economic development. Majority Muslim nations in other parts of the world, such as Indonesia and Senegal, do manage to exist as reasonably free and stable democracies. Ultimately, the wave of radical Islamic terrorism that has provoked such worldwide panic since September 11, 2001, may have at least as much to do with disenfranchisement and hopelessness as it does with religion. If and when the lives of the young Muslim men who are currently most likely to become terrorists improve, their zeal to be religious martyrs will likely fade — as quite likely will, for better or for worse, the zeal of many of them for their religion in general. After all, we’ve already seen this movie play out with Christianity in the starring role.

As for Christianity, the jury is still out on the effects of its decline in a world which has to a large extent embraced its values but may not feel as much of a need for its God and for its trappings of worship. One highly optimistic techno-progressivist view — one to which I’m admittedly very sympathetic — holds that the ties that bind us together haven’t really been weakened so very much at all, that the tyranny of geography over our circles of empathy is merely being replaced, thanks to new technologies of communication and travel, by true communities of interest, where physical location need be of only limited relevance. Even the demographic crisis provoked by declining birth rates might be solved by future technologies which produce more wealth for everyone with less manpower. And the fact remains that, taken in the abstract, fewer people on this fragile planet of ours is really a very good thing. We shall see.

I realize I’ve had little to say directly about the game of Civilization in this article, but I’m not quite willing to apologize for that. As I stated at the outset, the game’s handling of religion isn’t terribly deep; there just isn’t a lot of “there” there when it comes to religion and Civilization. Yet religion has been so profoundly important to the development of real-world civilization that this series of articles would have felt incomplete if I didn’t try to remedy the game’s lack by addressing the topic in some depth. And in another way, of course, the game of Civilization would never have existed without the religion of Christianity in particular, simply because so much of the animating force of the narrative of progress, which in turn is the animating force of Civilization, is rooted in Christian values. In that sense, then, this article has been all about the game of Civilization — as it has been all about the values underpinning so much of the global order we live in today.

(Sources: the books Civilization: or Rome on 640K a Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, The End of History and the Last Man by Francis Fukuyama, The Iliad by Homer, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, The Varieties of Religious Experience by William James, The Genealogy of Morals by Friedrich Nietzsche, The Communist Manifesto by Karl Marx and Friedrich Engels, The Human Use of Human Beings by Norbert Wiener, The Hitchhiker’s Guide to the Galaxy by Douglas Adams, The Triumph of Christianity: How a Forbidden Religion Swept the World by Bart D. Ehrman, The Past is a Foreign Country by David Lowenthal, Bowling Alone: America’s Declining Social Capital by Robert D. Putnam, A Christmas Carol by Charles Dickens, and The God Delusion by Richard Dawkins.)

Footnotes

1 There were just three exceptions to the rule of non-exclusivity, all of them also rare pre-Christian examples of monotheism. The Egyptian pharaoh Akhenaten decreed around 1350 BC that his kingdom’s traditional pantheon of gods be replaced with the single sun god Aten. But his new religion was accepted only resentfully, with the old gods continuing to be worshiped in secret, and was gradually done away with after his death. Then there was Zoroastrianism, a religion with some eyebrow-raising similarities to the later religion of Christianity which sprung up in Iran in the sixth century BC. It still has active adherents today. And then of course there were the Jews, whose single God would brook no rivals in His people’s hearts and minds. But the heyday of an independent kingdom of Judah was brief indeed, and in the centuries that followed the Jews were regarded as a minor band of oddball outcasts, a football to be kicked back and forth by their more powerful neighbors.
2 The most well-documented incidence of same occurred in 186 BC, and targeted worshipers of Bacchus, the famously rowdy god of wine. These drunkards got in such a habit of going on raping-and-pillaging sprees through the countryside that the Roman Senate, down to its last nerve with the bro-dudes of the classical world, rounded up 7000 of them for execution and pulled down their temples all over Roman territory.

The Game of Everything, Part 5: Civilization and War

War appears to be as old as mankind, but peace is a modern invention.

— Henry Maine

As soon as they decided to bring rival civilizations into their game of Civilization to compete with the one being guided by the player, Sid Meier and Bruce Shelley knew they would also have to bring in ways of fighting wars. This understanding may be a depressing one on some level, but it squares with the realities of history. As far back as we can observe, humans have been killing one another. Even the possibility of long-term, lasting peace in the world is, as the Henry Maine quote above says, a very recent idea.

Tellingly, The Iliad, the oldest complete work in the Western literary canon, is a story of war. Likely written down for the first time in the eighth century BC, it hearkens back to the Trojan War of yet several centuries earlier, a conflict shrouded in myth and legend even at the time the supposed blind poet Homer first began to chant his tale. The epic does devote space to the ultimate pointlessness of being pawns in the sport of the gods that was the Bronze Age Greeks’ conception of war, as well as the suffering engendered by it. Yet that doesn’t prevent it from glorying in all the killing, thus illustrating that ultra-violent popular entertainments are anything but a modern phenomenon. The goriest videogame has nothing on Homer:

He hurled and Athena drove the shaft
and it split the archer’s nose between the eyes —
it cracked his glistening teeth, the tough bronze
cut off his tongue at the roots, smashed his jaw
and the point came ripping out beneath his chin.
He pitched from his car, armor clanged against him,
a glimmering blaze of metal dazzling round his back —
the purebreds reared aside, hoofs pounding the air
and his life and power slipped away on the wind.

Just as Homer looms large in the early Greek literary tradition, one Heraclitus does the same in early Greek philosophy; legend tells us he wrote around 500 BC. Only fragments of his works remain to us today, mostly in the form of quotations lifted from them by later philosophers. Those fragments and the things those later commentators wrote about him identify Heraclitus as a philosopher of flux and change; “No man ever steps in the same river twice,” goes his most famous aphorism. He was apparently the first to identify the tension between physis, the reality of being in all its chaotic, ever-changing splendor, and logos, meaning literally “word” or “speech” in Greek — all of the rules of logic and ethics which humans apply in the hopeless task of trying to understand and master physis. A disciple of Heraclitus would call the narrative of progress a pathetic attempt to bridle the physis of history by forcing a comforting logophilic structure upon it.

As a philosopher of unbridled physis, Heraclitus was also a philosopher of war, of conflict in all its forms. “We must know that strife is common to all and strife is justice,” he wrote, “and that all things come into being through strife necessarily.” Neglected for a long time in favor of the cooler metaphysics of Socrates, Plato, and Aristotle, Heraclitus burst back into prominence at last in the early nineteenth century AD, when he was rediscovered by the German school of idealist philosophers. Later in the same century, Friedrich Nietzsche, who loathed the rationality of the Enlightenment and the narrative of progress it inspired, saw in Homer and Heraclitus alike a purer, more essential reflection of the reality of existence.

But we need not agree with Nietzsche that the Greeks of the Bronze Age had everything right and that it’s been all downhill from there to find something of value in Heraclitus. Consider again this assertion that “all things come into being through strife.” There is, it seems to me, some truth there, perhaps more truth than we’d like to admit. As Nietzsche’s contemporary Charles Darwin taught us, this is how biological evolution works. Strife is, in other words, what made us, the human race, what we are as a species. And it would certainly appear that our earliest civilizations too came into being through strife.

During the Enlightenment era, two dueling points of view about the nature of primitive peoples dominated. The Swiss/French philosopher Jean-Jacques Rousseau coined the cliché of the “gentle savage,” who lived in a state of nature with his companions in an Eden of peace and tranquility, untouched by the profanities of modern life; progress in all its guises, Rousseau asserted, had only led humanity to “decrepitness.” Rousseau saw the narrative of progress as a narrative of regress, of civilization and all its trappings serving only to divorce humanity more and more from the idyllic state of nature. But Thomas Hobbes, whom we already encountered in my previous article, took the polar opposite view, seeing the lives of primitive peoples as “nasty, brutish, and short,” and seeing the civilizing forces of progress as the best things ever to befall his species. He believed, in other words, that humanity’s distancing itself more and more from the primitive state of nature was an unalloyed good thing.

This duality has remained with us to the present day. You can see much of Rousseau in the Woodstock Generation’s claim that “we are stardust, we are golden, and we’ve got to get ourselves back to the Garden,” as you can in some aspects of the modern environmental movement and our societies’ general fetish for all things “natural.” Meanwhile Hobbes’s ideology of progress is, of course, the core, driving idea behind the game of Civilization, among other signs of the times.

So, we have to ask ourselves, who’s right — or, at the very least, who’s more nearly right? There are few if any human communities left in the world today who live in so complete a state of nature as to conclusively prove or disprove the theory of the gentle savage. We can, however, turn to the evidence of archaeology to arrive at what feels like a fairly satisfying answer.

Almost all of the most famous finds of Stone Age corpses show evidence at the least of having suffered violent trauma, more commonly of having died from it. Indeed, the fact that it can seem almost impossible to turn up any human remains that don’t show evidence of violence has become something of a running joke among archaeologists. Ötzi the Iceman, as a 5000-year-old body discovered in the Ötztal Alps along the Austrian-Italian border in 1991 became known, turned out to have been shot with a bow and arrow and dumped into the crevasse where he was found. Kennewick Man, a 9500-year-old body discovered in Washington State in 1996, had been shot in the pelvis with some sort of stone projectile that remained embedded there. Lindow Man, a 2000-year-old body discovered in rural England in 1984, had been bonked on the head with a blunt object, had his neck broken with a twisted cord, and then, just to make sure, had his throat cut. Another 2000-year-old body found more recently in England had been beheaded, probably in a form of ritual sacrifice. Yet another recent discovery, a 4600-year-old family consisting of a man, a woman, and their two children, showed evidence of having been killed in a raid on their encampment. The Garden of Eden theory of early human history, it would appear, is right out.

Rather than being their antagonist, violence — or, often, the threat of violence — was a prime driver of early civilizations. Sentiment may have sufficed for primitive humans to keep their family and perhaps their friends close, but it was the logic of survival that pushed them to begin to enlarge their circles of concern, to band together into the larger communities that could form the basis for civilization. Long before humans had any inkling of a narrative of progress, the most important, tangible benefit of civilization was protection from the depredations of hostile neighbors. The Enlightenment philosopher Immanuel Kant called this development, which paradoxically arose out of the impulse toward conflict rather than cooperation, “asocial sociability.”

In a sense, this reality that civilizations are born in violence is baked into the game of Civilization. From the mid-game on, it’s possible to make a very good go of it as a peaceful democracy, to be a good global citizen not declaring war unless war is declared on you, striving to trade and research your way to Alpha Centauri. Before you reach that stage, however, you have to be a despotic state no better than any of the others. As every Civilization veteran knows, it’s absolutely vital to establish sovereignty over your starting continent during this early stage in order to have enough cities and resources to be competitive later on. Thus you can’t afford to play the gentle savage, even if you believe such a person ever existed. If, as is likely, there are rival civilizations on your starting continent, you have to conquer them before you can think about peaceful coexistence with anyone else.

But the debt which the narrative of progress owes to war and the threat of war extends far beyond a civilization’s early stages, both in real history and in the game. In fact, the modern world order, built around fairly large nation-states with strong centralized governments, is, along with all of the progress it has spawned on so many fronts, a direct outgrowth of the need to project military power effectively.

Early twentieth-century writers, reacting to the horrible wars of their times, concocted the legend of war as a more honorable affair during earlier ages, one in which civilians were spared and soldiers comported themselves as civilized men. One has only to read The Iliad to know what a load of bunk that is; war has always been the nastiest, most brutal business there is, and codes of behavior have seldom survived an army’s first contact with the enemy. And if one was unfortunate enough to be a civilian caught between two armies… well, raping and pillaging were as popular among soldiers of earlier centuries as they were among those of the twentieth, as the stories of same in The Iliad once again cogently illustrate.

Still, there were important differences between the wars that were fought prior to the eighteenth century and those that came later. It’s easy today to overlook how differently societies were organized prior to the Enlightenment era. Such modern countries as Germany and Italy were still collections of small independent states, cooperating at best under a framework of uneasy alliances. Even where there existed a centralized government, the monarch’s power was sharply limited under the feudal systems that held sway. If he wished to go to war, he was often reduced to begging his nobles for the money and manpower necessary to do so. In addition, economies in general had very limited resources to set aside from the basic task of feeding themselves in favor of waging war. It all added up to make wars into hugely inefficient businesses, where months or even years could go by between significant battles. In many ways, of course, that was good for the people of the countries fighting them.

It was the unification of England and Scotland as Great Britain in 1707 that marked the beginning of the modern nation-state. Thirty years before said unification, the entire English army consisted of no more than 15,000 soldiers, a number that could be packed into a typical modern sports arena and leave plenty of seats to spare. The historian John Brewer describes what followed:

The late seventeenth and eighteenth centuries saw an astonishing transformation in British government, one which put muscles on the bones of the British body politic, increasing its endurance, strength, and reach. Britain was able to shoulder an ever more ponderous burden of military commitments thanks to a radical increase in taxation, the development of public-debt finance (a national debt) on an unprecedented scale, and the growth of a sizable public administration devoted to organizing the fiscal and military activities of the state.

This radical remaking was driven by two principal factors. One was advances in technology and engineering that freed up more and more people to work at tasks other than food production; this was, after all, the period of the Enlightenment, when the narrative of progress went into overdrive. The other was the need to efficiently project military power to ever more far-flung locales in the world — the need of a burgeoning British Empire.

Alongside a centralized government bureaucracy and standing military grew that necessary evil for funding it: taxes. The effective average British tax rate rose from 3.5 percent in 1675 to 23 percent a century later, to no less than 35 percent at the height of the Napoleonic Wars of the early nineteenth century. And, what with the people from these earlier centuries not being all that different from us at bottom, lots and lots of them didn’t like it one bit. One William Pulteney spoke for them:

Let any gentleman but look into the Statute Books lying upon our Table, he will see there to what a vast Bulk, to what a Number of Volumes, our Statutes relating to Taxes have swelled. It is monstrous, it is even frightful to look into the Indexes, where for several Columns together we see nothing but Taxes, Taxes, Taxes.

The modern developed nation-state — bureaucratic, orderly, highly centralized, and absurdly highly taxed in comparison to any other era of human history — had been born, largely to meet the needs of the military.

With one country having remade itself in this more militarily efficient image, the other countries of the world felt they had no choice but to follow in order to remain competitive. In Europe, France and Spain concentrated more power than ever before in the hands of a central government, while the various small kingdoms that had traditionally made up Italy and Germany finally felt compelled to unify as centralized nations in their own right. The Ottoman Empire too remade itself after suffering a series of humiliating defeats at the hands of the ultra-modern nation-state of Napoleonic France, as did faraway Japan after the American Commodore Matthew Perry waltzed into Edo Bay with a small fleet of modern warships and held the shogunate hostage at gunpoint; “rich country, strong army” became the slogan for Japan’s economic, bureaucratic, social, and of course military modernization.

The game of Civilization does a rather remarkably clever job of depicting most of the factors I’ve just described. The deeper into the game you play, the more your maintenance costs go up, meaning that you have to tax your people more and more to maintain your civilization. And the game also captures the spur which the threat of war constantly gave to the progressive impulse; if you let your civilization fall too far behind its rivals in military terms, they will pounce. This alone provides a strong motivation to keep researching the latest advances, exactly as it did historically. The narrative of progress, in the game and in history, owes much to war.
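
To see that fiscal squeeze in miniature, here is a toy model of my own devising; it isn’t the game’s actual formula, but it captures why a sprawling late-game civilization, stuffed with improvements that each demand upkeep every turn, pushes its ruler toward ever-higher tax rates.

```python
# A toy model, not the game's actual formula: income scales with trade and the
# tax rate, while every improvement adds a fixed per-turn maintenance cost.

def gold_per_turn(cities, tax_rate):
    income = sum(city["trade"] * tax_rate for city in cities)
    upkeep = sum(city["maintenance"] for city in cities)
    return income - upkeep

# Early game: a handful of cities with almost nothing to maintain.
early_empire = [{"trade": 8, "maintenance": 1}] * 4
# Late game: a dozen cities, each full of improvements demanding upkeep.
late_empire = [{"trade": 20, "maintenance": 9}] * 12

print(gold_per_turn(early_empire, 0.3))  # comfortably positive at a 30% tax rate
print(gold_per_turn(late_empire, 0.3))   # deeply negative: raise taxes or start selling buildings
```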

But when we come to the second half of the twentieth century of our own planet’s history, the notion of war and/or the threat of war as a prime driver of the narrative of progress becomes more fraught. It has long been commonplace for critics of progress to contrast the bloody twentieth century with the relatively peaceful nineteenth century, using a range of seemingly telling statistics about death and suffering to anchor their contention that the narrative of progress has really only made us better at killing one another. Yet their insistence on passing their statistical judgment on the twentieth century as a whole obscures something rather important: while the first half of the century was indeed inordinately, almost inconceivably bloody, the second half was vastly less so. The statistics for the century as a whole, in other words, are hopelessly skewed by what we can all agree to hope were the historical anomalies of the two biggest wars ever fought.

Since the end of World War II, the situation has been much different. While small wars have certainly continued to be fought, two proverbial “great powers” haven’t met one another directly on a battlefield since 1945: that’s 73 years as I write these words, a record for all of post-classical human history. As the political scientist Robert Jervis could write already in 1988, “the most striking characteristic of the postwar period is just that — it can be called ‘postwar’ because the major powers have not fought each other since 1945. Such a lengthy period of peace among the most powerful states is unprecedented.” The change is so marked that historians have come up with a name for the period stretching from 1945 to the present: “The Long Peace.” This is the aspect of the Cold War which was overlooked by a public justifiably worried about the threat of nuclear annihilation, which was obscured by the small-scale proxy wars and police actions fought by the Americans in places like Vietnam and by the Soviets in places like Afghanistan. And yet the Long Peace has now outlasted the Cold War with which it overlapped by more than a quarter of a century.

If we want to identify what changed in the nature of warfare itself at the end of World War II, the answer is blazingly obvious: the atomic bomb entered the picture. The idea of a weapon so terrible that it would bring an end to war wasn’t, it should be noted, a new one. In 1864, Victor Hugo, looking forward to a future replete with flying machines, proposed that their potential on the battlefield would be sufficient to make armies “vanish, and with them the whole business of war.” Even the logic of mutually-assured destruction wasn’t really new at the dawn of the Cold War. In 1891, Alfred Nobel, the inventor of dynamite, suggested to an Austrian countess that “perhaps my factories will put an end to war sooner than your congresses: on the day that two army corps can mutually annihilate each other in a second, all civilized nations will surely recoil with horror and disband their troops.”

Still, nuclear weapons, with their capacity to mutually destroy not just opposing armies but opposing civilizations — and, indeed, the entirety of the world that built them — were clearly something new under the sun. It’s thus not hugely surprising to find that the game of Civilization doesn’t seem quite sure what to do with them when they finally appear so late in the day. After doing a credible job in the broad strokes, all things considered, of portraying the global balance of military power through World War II, the edges really begin to fray at the advent of the nuclear age. The game makes no space for the total destruction of an all-out nuclear exchange. Nuclear strikes come at a considerable cost to the environment, but it is possible in the game to win a nuclear war, sending what some critics regard as a regrettable message. To be fair to Sid Meier and Bruce Shelley, it would be very difficult indeed to implement nuclear weapons in a way that feels both true to history and satisfying in the game. Thus Civilization fell victim here to Meier’s old maxim of “fun trumps history.” That said, the designers did make an obvious attempt to simulate what a Pandora’s Box nuclear weapons really are in at least one way. When one civilization builds the Manhattan Project Wonder, all civilizations in the game who have researched the Rocketry advance get instant access to nuclear weapons.
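
The rule as described reduces to a simple conjunction, sketched below; the data structures and names are my own illustration, not the game’s internals.

```python
# A minimal sketch of the unlock rule described above. The classes and names
# here are illustrative assumptions, not the game's actual internals.

from dataclasses import dataclass, field

@dataclass
class Civ:
    name: str
    advances: set = field(default_factory=set)

def can_build_nuclear_unit(civ: Civ, manhattan_project_built: bool) -> bool:
    # The Manhattan Project is a Wonder: once ANY civilization completes it,
    # the door opens for everyone...
    if not manhattan_project_built:
        return False
    # ...but each civilization still needs to have researched Rocketry itself.
    return "Rocketry" in civ.advances

romans = Civ("Romans", {"Rocketry", "Nuclear Fission"})
zulus = Civ("Zulus", {"Gunpowder"})
print(can_build_nuclear_unit(romans, manhattan_project_built=True))  # True
print(can_build_nuclear_unit(zulus, manhattan_project_built=True))   # False
```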

This side of the game serves as a fine illustration of an aspect of strategy-game design that’s very easy to overlook. Many players believe that the ideal artificial intelligence plays just like a human would, but this isn’t always the case at all. If the more militaristic civilizations in the game were to start wildly nuking the player, ruining the civilization she’d spent so long building, she wouldn’t feel pleased that the artificial intelligence was so smart. Not at all; she’d feel like she was being punished for no good reason. Fun, it seems, also trumps perfect artificial intelligence. Your opponents in Civilization are notably reluctant to employ nuclear weapons in light of this maxim, only doing it to you if you start doing it to them.

In this sense, then, there is a version of the Long Peace to be found in the game, but it exists more to deliver a satisfying experience for the player than it arises out of any particular interpretation of history. For their part, historians’ points of view on the subject can be broadly sorted into two opposing camps, which I’ll call the realpolitik view and the globalist-idealist view. Both camps give due deference to the importance of nuclear weapons in any discussion of postwar geopolitics, but they diverge from there.

Those who fancy themselves the sober realists of the realpolitik school believe that the fundamentals of war and peace haven’t really changed at all, only the stakes in terms of potential destruction. From the first time that a primitive tribe armed itself with spears to make a warlike neighboring tribe think twice before attacking its camp, weapons of war have been as useful for preventing wars as for fighting them. Nuclear weapons, the realpolitik camp claims, represent only a change in degree, not in kind. From this point of view arose the rhetoric of peace through strength, deployed liberally by steely Cold Warriors on both sides of the Iron Curtain. Safety, went the logic, lay in being so militarily strong that no one would ever dare mess with you. The Long Peace was a credit to the fact that the United States and the Soviet Union — and, after the Cold War, the United States alone — were so thoroughly prepared to kick any other country’s ass, with or without employing nuclear weapons.

The globalist-idealist view doesn’t ignore the awesome power of nuclear weapons by any means, but sees it through a more nuanced lens. Many people at the dawn of the nuclear age — not least many scientists of the Manhattan Project who had helped to build the bomb — hoped that its power would lead to a philosophical or even spiritual awakening, prompting humanity to finally put an end to war. Some went so far as to advocate for sharing the technology behind the bomb with all the countries of the world, thus placing the whole world on a level playing field and ending the dominion of strong over weak countries everywhere. Such a thing wasn’t done, but there may be reason to believe that the idealistic impulse which led to proposals like this one found another outlet, one that has done the world an immense amount of good.

Looking back to the actual horrors of the previous few decades and forward to the potential horrors of nuclear war, countries across the world after World War II instituted an international system of order that would have sounded like a utopian dream five years before. Its centerpiece was the United Nations, a forum unlike any that had existed before in human history, a place to which disputes between countries could be brought, to be hashed out with the help of neutral peers before they turned into shooting wars. Meanwhile an International Court of Justice would, again for the first time in history, institute a globalized system of law binding on all of the United Nations’s signatories, big or small.

These are the major, obvious institutions of the globalized postwar order, but the spirit that spawned them has led to countless other international organs of communication and dispute resolution. Perhaps the most amazing of these — and an institution whose amazingness is too often overlooked — is the European Union. Known throughout most of history as the world’s preeminent powder keg of war, Europe, with its dozens of culturally, ethnically, and linguistically diverse countries packed together more closely than anywhere else in the world, has at last managed to set aside ancient rivalries and the many wars to which they historically led in favor of a grand continent-spanning cooperative project that’s made the idea of another general European war all but unimaginable. Even the recent decision by Britain to withdraw from the European Union hasn’t, as so many doomsayers breathlessly predicted, led to the dissolution of the project. Instead the latest polling shows substantially increased support for the European Union among the citizens of its remaining member states, as if the blow that was Brexit caused many to wake up to just how precious it really is.

To the extent that it takes a position, the game of Civilization winds up pitching its tent with the realpolitik school, although one does sense that this is done almost by default. Its mechanics are suited to depicting a global order based on the military balance of power, but, while the United Nations does make a token appearance, the game has no real mechanism for transcending nationalism and the wars that tend to accompany it. Only limited cooperation between rival civilizations is possible, and, especially at the higher difficulty levels, it’s a careful player indeed who manages to avoid wars in the climactic stages of the game. All of this is perhaps unfortunate, but forgivable given the long arc of history the game has to cover.

In the real world, however, your humble writer here does see reason to believe that we may be edging into a new, post-national, postwar-in-the-universal-sense era. Of course, we need to be very careful when we begin to assert that we’re privileged to live in a unique time in history. Many an earlier era has been foolish enough to regard itself as unique, only to learn, sometimes painfully, that the old rules still apply. Yet recent decades really do seem to have altered our attitudes toward war. The acquisition of territory by military force, once considered a matter of course, is now looked upon so unfavorably by the world at large that even as established a bad actor on the world stage as Vladimir Putin’s Russia felt compelled to justify its annexation of the Crimea in 2014 with a sham referendum. The United States, widely regarded with some justification as the last remaining warmonger among the well-developed Western nations, nevertheless goes to lengths that would have been inconceivable in earlier eras to avoid civilian casualties in its various military adventures. The same reluctance to accept war for the ugly business it is does much to explain why, despite having the most powerful military the world has ever known, the same country manages a clear-cut victory in so very few of the wars it starts.

Changing attitudes toward war in the West can also be charted through our war memorials. London’s Trafalgar Square, a celebration of a major naval victory over Napoleon, is almost a caricature of extravagant triumphalism, with an outsized Admiral Horatio Nelson looking proudly down on the scene from the top of a 170-foot column. The Vietnam War Memorial in Washington, D.C., on the other hand, engages with its subject — one of those recent wars the United States failed to win out of an unwillingness to behave as brutally as was necessary — not as a triumph but as a tragedy, being a somber roll call of the ordinary soldiers who lost their lives there. But perhaps nowhere is the transformation in attitudes more marked than in Germany, which, after instigating the most terrible war in history well under a century ago, is now arguably the most fundamentally pacifistic nation in the West, going so far as to anger free-speech advocates by banning blood in videogames and banning right-wing political parties that venture anywhere close to the ideological territory once occupied by the Nazi party.

This notion that we are on the cusp of a new era of peaceful international cooperation, that soon the brutality of war might be as unthinkable to the modern mind as that of slavery or institutionalized torture, was a key component of Francis Fukuyama’s assertion that humanity might be reaching the end of its history. A quarter-century on from that audacious thesis, the international order has been shaken at times, particularly by events in recent years, but the edifices built in the aftermath of World War II still stand. Even if we can only partially agree with the statement that humanity has finally found an orderly alternative to war through those edifices — reserving the other half of the Long Peace equation to the old realpolitik of might makes right, in the person of the peace-guaranteeing power of the United States and that ultimate deterrent of nuclear weapons — we might be slowly leaving behind the era of nationalism that began with the emergence of strong, centralized nation-states in the eighteenth and nineteenth centuries and led eventually to so much bloodshed in the first half of the twentieth. Even more optimistically, we might soon be able to say farewell to war as humanity has always known it. “Last night I had the strangest dream,” goes a lovely old folk song. “I dreamed the world had all agreed to put an end to war.” Today we are, by any objective measurement, closer to achieving that strange dream than we’ve ever been before. War has defined our past to a disconcerting degree, but perhaps it need not do the same for our future.

What would and should a postwar world really be like? Many have looked askance at the idea of a world free of war, seeing it as a world free as well of the noble virtues of honor, sacrifice, and courage, a world where people live only to selfishly gratify their personal needs. Unsurprisingly, Nietzsche is counted among the critics, painting a picture of a world full of “men without chests” who are no better than slaves to their creature comforts. More surprisingly, Georg Wilhelm Friedrich Hegel, that original architect of a narrative of progress climaxing in a peaceful and prosperous end of history, shared many of the same concerns, going so far as to state that nations at the end of history would need to continue to require military service of their citizens and continue to fight the occasional war in order to keep the noble virtues alive. Modern critics of the lifestyle of developed Western nations, speaking from both inside and outside those nations’ umbrellas, decry their servile “softness,” decry the way that the vicissitudes of fashion and consumerism take the place of the great feats that once stirred men’s souls. Peace and prosperity, goes another, related argument, are ultimately boring; some theories about the outbreak of World War I have long held that its biggest motivator was that countries simply got tired of getting along and wanted a little mayhem to break up the monotony. Certainly our fictions — not least our videogame fictions — would be a lot duller without wars to relive.

I can understand such concerns on one level, but feel like they reflect a profound lack of imagination on another. I can’t, alas, count myself among the younger generation or generations who must put the finishing touches on a post-national, postwar world order, if it should ever come to be. Yet I can say that our current younger generation’s greater tolerance toward diversity and marked disinclination toward violence don’t strike me as being accompanied by any deficit of idealism or passion. And there is much that can replace war in their souls that is even more stirring. They could finally get serious about cleaning up this poor planet which their elders have spent so many centuries trashing. They could do something for the poorest regions of the world, to bring the benefits of the prosperous postwar international order to all. They could follow the example of humanity’s grandest adventure to date — the Apollo Moon landing, which truly was shared by the entire world thanks to the progressive technology of television — and look outward, first to Mars, perhaps eventually all the way to Alpha Centauri. For that matter, my own generation could make a solid start on many of these things right now. With all due respect to Hegel and Fukuyama, the end of war need not mean the end of history. It could mean that our real history is just getting started.

(Sources: the books Civilization, or Rome on 640K a Day by Johnny L. Wilson and Alan Emrich, The Story of Civilization Volume I: Our Oriental Heritage by Will Durant, The Past is a Foreign Country by David Lowenthal, The Sinews of Power by John Brewer, The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker, The End of History and the Last Man by Francis Fukuyama, The Iliad by Homer, Fragments by Heraclitus, The Social Contract by Jean-Jacques Rousseau, Leviathan by Thomas Hobbes, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, Thus Spoke Zarathustra by Friedrich Nietzsche, A Treatise of Human Nature by David Hume, Basic Writings of Kant by Immanuel Kant, and Nationalism: A Very Short Introduction by Steven Grosby; the article “Strategic Digital Defense: Video Games and Reagan’s ‘Star Wars’ Program, 1980-1987” by William M. Knoblauch, found in the book Playing with the Past: Digital Games and the Simulation of History.)
