
Magic and Loss, Part 2: Magic on the Screen

It seems poetically apt that Peter Adkison first met Richard Garfield through Usenet. For Magic: The Gathering, the card game that resulted from that meeting, went on to usher in a whole new era of tabletop gaming, during which it became much more tightly coupled with digital spaces. The card game’s rise did, after all, coincide with the rise of the World Wide Web; Magic sites were among the first popular destinations there. The game could never have exploded so quickly if it had been forced to depend on the old-media likes of Dragon magazine to spread the word, what with print publishing’s built-in lag time of weeks or months.

But ironically, computers could all too easily also be seen as dangerous to the immensely profitable business Wizards of the Coast so speedily became. So much of the allure of Magic was that of scarcity. A rare card like, say, a Lord of the Pit was an awesome thing to own not because it was an automatic game-winner — it wasn’t that at all, being very expensive in terms of mana and having a nasty tendency to turn around and bite you instead of your opponent — but because it was so gosh-darned hard to get your hands on. Yet computers by their very nature made everything that was put into them abundant; here a Lord of the Pit was nothing but another collection of ones and zeroes, as effortlessly copyable as any other collection of same. Would Magic be as compelling there? Or, stated more practically if also more cynically, what profit was to be found for Wizards of the Coast in putting Magic on computers? If they made a killer Magic implementation for the computer, complete with Lords of the Pit for everyone, would anyone still want to play the physical card game? In the worst-case scenario, it would be sacrificing an ongoing revenue stream to die for in return for the one-time sales of a single boxed computer game.

Had it been ten years later, Wizards of the Coast might have been thinking about setting up an official virtual community for Magic, with online duels, tournaments, leaderboards, forums, perhaps even a card marketplace. As it was, though, it was still the very early days of Web 1.0, when most sites consisted solely of static HTML. Online play in general was in its infancy, with most computer games that offered it being designed to run over local-area networks rather than a slow and laggy dial-up Internet connection. In this technological milieu, then, a Magic computer game necessarily meant a boxed product that you could buy, bring home, install on a computer that may or may not even be connected to the Internet, and play all by yourself.

That last part of the recipe introduced a whole host of questions and challenges beyond the strictly commercial. Think again about the nature of Magic: a fairly simple game in itself, but one that could be altered in an infinity of ways by the instructions printed on the cards themselves. Making hundreds and hundreds of separate cards play properly on the computer would be difficult enough. And yet that wasn’t even the worst of it: the really hard part would be teaching the computer to use its millions of possible combinations of cards effectively against the player, in an era before machine learning and the like were more than a glint in a few artificial-intelligence theorists’ eyes.

But to their credit, Wizards of the Coast didn’t dismiss the idea of a Magic computer game out of hand on any of these grounds. When MicroProse Software came calling, promising they could make it happen, Wizards listened and agreed to let them take a stab at it.

It so happened that Magic had caught the attention of MicroProse’s star designer, Sid Meier of Pirates!, Railroad Tycoon, and Civilization fame. This was unsurprising in itself; Meier was a grizzled veteran of many a tabletop war, who still kept a finger on the pulse of that space. Although he was never a dedicated player of the card game, he was attracted to Magic precisely because it seemed so dauntingly difficult to implement on a computer. Meier was, you see, a programmer as well as a designer, one with a strong interest in artificial intelligence, who had in fact just spent a year or more trying to teach a 3DO console to create music in the mold of his favorite classical composer, Johann Sebastian Bach. In his memoir, he frames his interest in a Magic computer game as a way of placating the managers in the corner offices at MicroProse who were constantly pushing him and his colleagues in the trenches toward licensed properties. With Magic, he could have his cake and eat it too, pleasing the suits whilst still doing something he could get personally excited about. “It seemed prudent,” he writes dryly, “for us to choose the kind of license we liked before they assigned one to us.”

We cannot accuse MicroProse of thinking small when it came to Magic on the computer; they wound up creating not so much a game as a sort of all-purpose digital Magic toolkit. You could put together your dream deck in the “Deck Builder,” choosing from 392 different cards in all. Then you could take the deck you built into the “Duel” program, where you could participate in a single match or in a full-on tournament against computer opponents. If all of this left you confused, you could work your way through a tutorial featuring filmed actors. Or, last but by no means least, you could dive into Shandalar, which embedded the card game into a simple CRPG format, in which Magic duels with the monsters that roamed the world took the place of a more conventional combat engine and improving your deck took the place of improving your character’s statistics. Suffice to say that MicroProse’s Magic did not lack for ambition.

Like the cheesy advisors in the otherwise serious-minded Civilization II, the tutorial that uses clips of real actors dates the MicroProse Magic indelibly to the mid-1990s. The actress on the left is Rhea Seehorn, whose long journeyman’s career blossomed suddenly into fame and Emmy awards in 2015, when she began playing Kim Wexler in the acclaimed television series Better Call Saul.

Doubtless for this reason, it took an inordinately long time to make. The first magazine previews of the computer game, describing most of the features that would make it into the finished product, appeared in the spring of 1995, just as the craze for the card game was nearing its peak. Yet the finished product wasn’t released until March of 1997, by which point the frenzy was already beginning to cool off, as Magic slowly transformed into what it still is today: “just” an extremely popular card game. “This is the end of a long journey,” wrote Richard Garfield in his foreword to the computer game’s manual, a missive that exudes relief and exhaustion in equal measure.

In fact, by the time MicroProse and Garfield completed the journey, a whole different digital Magic game had been started and completed by a different studio. Acclaim Entertainment’s Magic: The Gathering — Battlemage was Wizards of the Coast’s attempt to hedge their bets when the MicroProse project kept stretching out longer and longer. At the surface level, Battlemage played much like Shandalar: you wandered a fantasy world collecting cards and dueling with enemies. But its duels were far less ambitious; rather than trying to implement the real card game in nitty-gritty detail, it moved only its broadest strokes into a gimmicky real-time framework, with a non-adjustable clock that just so happened to run way too fast. “By the time [you] manage to summon one creature,” wrote Computer Gaming World in its review, “the enemy has five or six on the attack.” This, the very first Magic computer game to actually ship, is justifiably forgotten today.

Then, too, by the time MicroProse’s Magic appeared Sid Meier had been gone from that company for nine months already, having left with his colleagues Jeff Briggs and Brian Reynolds to form a new studio, Firaxis Games. In his memoir, he speaks to a constant tension between MicroProse, who just wanted to deliver the funnest possible digital implementation of Magic, and Wizards of the Coast, who were worried about destroying their cash cow’s mystique. “I was frustrated,” he concludes. “Magic was a good computer game, but not as good as it could be.”

I concur. The MicroProse Magic is a good game — in fact, a well-nigh miraculous achievement when one considers the technological times in which it was created. Yet Shandalar in particular is a frustrating case: a good game that, one senses, just barely missed being spectacular.

The heart of the matter, the Duel screen.

But without a doubt, the most impressive thing about this Magic is that it works at all. The interface is a breeze to use once you grasp its vagaries, the cards all function just as they should in all of their countless nuances, and the computer actually does make a pretty credible opponent most of the time, capable of combining its cards in ingenious ways that may never have occurred to you until you get blasted into oblivion by them. Really, I can’t say enough about what an incredible programming achievement this is. Yes, familiarity may breed some contempt in the course of time; you will eventually notice patterns in some of your opponents’ play that you can exploit, and the computer players will do something flat-out stupid every once in a while. (Then again, isn’t that true of a human player as well?) Early reviewers tended to understate the quality of the artificial intelligence because it trades smarts for speed on slower computers, not looking as far ahead in its calculations. These days, when some of our toasters probably have more processing power than the typical 1997 gaming computer, that isn’t a consideration.

The MicroProse game even manages to implement cards like Magic Hack, which lets you alter the text(!) found on other cards.

Wow. Just… wow.

Meanwhile Shandalar is a characteristic stroke of genius from Sid Meier, who was crazily good at translating lived experiences of all sorts into playable game mechanics. As we saw at length in the last article, it was the meta-game of collecting cards and honing decks that turned the card game into a way of life for so many of its players. Shandalar transplants this experience into a procedurally-generated fantasy landscape, capturing in the process the real heart of its analog predecessor’s appeal in a way that the dueling system on its own never could have, no matter how beautifully implemented. You start out as a callow beginner with a deck full of random junk, just like someone who has just returned from a trip to her friendly local game store with her first Magic Starter Pack. Your objective must now be to improve your deck into something you can win with on a regular basis, whilst learning how to use the cards you’ve collected most effectively and slowly building a reputation for yourself. Again, just like in real life.

The framing story has it that you are trying to protect the world of Shandalar from five evil wizards — one for each of the Magic colors — who are vying with one another and with you to take it over. You travel between the many cities and towns, buying and selling cards in their marketplaces and doing simple quests for their inhabitants that can, among other things, add to your dueling life-point total, which is just ten when starting out. Enemies in the employ of the wizards wander the same paths you do with decks of their own. Defeat them, and you can win one of their cards for yourself; get defeated by them, and you lose one of your own cards. (Shandalar is the last Magic product to use the misbegotten ante rule that the Wizards of the Coast of today prefers not to mention.)

After you’ve been at it a while, the other wizards’ lieutenants will begin attacking the towns directly. If any one enemy wizard manages to take over just three towns, he wins the game and you lose. (Unfortunately, the same lax victory conditions don’t apply to you…) Therefore it’s important not to let matters get out of hand on this front. You can rush to a town that’s being attacked and defend it by defeating the attacker in a duel, or you can even attack an already occupied town yourself in the hope of freeing it again, although this tends to be an even harder duel to win. When not thus occupied, you can explore the dungeons that are scattered about the map, stocked with tough enemies and tempting rewards in the form of gold, cards, and magical gems that confer special powers. Your ultimate goal, once you think you have the perfect deck, is to attack and defeat each wizard in his own stronghold; his strength in this final battle is determined by how many enemies of his color you’ve defeated elsewhere, so it pays to take your time. Don’t dawdle too long, though, because the other wizards get more and more aggressive about attacking towns as time goes by, which can leave you racing around willy-nilly trying to put out fire after fire, with scant time to take the offensive.


The MicroProse Magic was the first Sid Meier-designed game to appear in many years without the “Sid Meier’s…” prefix. His name was actually scrubbed from the credits completely, what with him having left the company before its completion. It was probably just as well: as he notes in his memoir, if MicroProse had tried to abide by its usual practice the game would presumably have needed to be called Sid Meier’s Wizards of the Coast’s Magic: The Gathering, which doesn’t exactly trip off the tongue.

Wandering the world of Shandalar.

You can accept quests for cards and other treasures.

When you bump into an enemy, you can either duel him for an ante or give him some money to go away.

You can reclaim towns that have been occupied by one of the enemy wizards, but it’s a risky battle, for which you must ante three cards to your opponent’s one.

Exploring a dungeon.



All told, it’s a heck of a lot of fun, the perfect way to enjoy Magic if you don’t want to spend a fortune on cards and/or aren’t overly enamored with the culture of nerdy aggression that surrounds the real-life game to some extent even today. I spent way more time with Shandalar than I could really afford to as “research” for this article, restarting again and again to explore the possibilities of many different colors and decks and the variations in the different difficulty levels. Shandalar is great just as it is; I highly recommend it, and happily add it to my personal Hall of Fame.

And yet the fact is that the balance of the whole is a little off — not enough so as to ruin the experience, but just enough to frustrate when you consider what Shandalar might have been with a little more tweaking. My biggest beef is with the dungeons. They ought to be one of the best things about the game, being randomly generated labyrinths stocked with unusual opponents and highly desirable cards. Your life total carries over from battle to battle within a dungeon and you aren’t allowed to save there, giving almost a roguelike quality to your underground expeditions. It seems to be a case of high stakes and high rewards, potentially the most exciting part of the game.

It makes no sense to risk the dungeons when you can randomly stumble upon places on the world map that let you have your choice of any card in the entire game. Happy as you are when you find them, these places are devastating to game balance.

But it isn’t, for the simple reason that the rewards aren’t commensurate with the risks in the final analysis. Most of the time, the cards you find in a dungeon prove not to be all that great after all; in fact, you can acquire every single one of them above-ground in one way or another, leaving you with little reason to even enter a dungeon beyond sheer, bloody-minded derring-do. A whole dimension of the game falls away into near-pointlessness. Yes, you can attempt to compensate for this by, say, pledging not to buy any of the most powerful cards at the above-ground marketplaces, but why should you have to? It shouldn’t be up to you to balance someone else’s game for them.

Even looking beyond this issue, Shandalar just leaves me wanting a little more — a bigger variety of special encounters on the world map, more depth to the economy, more and more varied quests. This is not because what we have is bad, mind you, but because it’s so good. My problem is that I just can’t stop seeing how it could be even better, can’t help wondering how it might have turned out had Sid Meier stayed at MicroProse through the end of the project. Which isn’t to say that you shouldn’t try this game if you already enjoy the card game or are even slightly curious about it. The MicroProse Magic retains a cult following to this day, many of whom will tell you that Shandalar in particular is still the most fun you can have with Magic on a computer.

In its own time, however, the most surprising thing about the MicroProse Magic is that it wasn’t more commercially successful. “I’ve found a wonderful place to play Magic: The Gathering,” wrote Computer Gaming World in its review. “I can play as much as I want whenever I want, and use legendary cards like Black Lotus and the Moxes without spending hundreds of dollars.” Nevertheless, the package didn’t set the world on fire. Perhaps the substandard Acclaim game, which was released just a month before the MicroProse version, muddied the waters too much. Or perhaps even more of the appeal of the card game than anyone had realized lay in the social element, which no digital version in 1997 could possibly duplicate.

Not that MicroProse didn’t try. “This game is exceedingly expandable,” wrote Richard Garfield in his foreword in the manual, strongly implying that the MicroProse Magic was just the beginning of a whole line of follow-on products that would keep it up to date with the ever-evolving card game. But that didn’t really happen. MicroProse did release Spells of the Ancients, a sort of digital Booster Pack with some new cards, followed by a standalone upgrade called Duels of the Planeswalkers, with yet more new cards and the one feature that was most obviously missing from the original game: the ability to duel with others over a network, albeit without any associated matchmaking service or the like that could have fostered a centralized online community of players. Not long after Duels of the Planeswalkers came out in January of 1998, the whole line fell out of print, having never quite lived up to MicroProse’s expectations for it. Wizards of the Coast, for their part, had always seemed a bit lukewarm about it, perchance not least because Shandalar relied so heavily on the ante system which they were by now trying hard to bury deep, deep down in the memory hole. Their next foray into digital Magic wouldn’t come until 2002, when they set up Magic: The Gathering Online, precisely the dynamic online playing space I described as infeasible earlier in this article in the context of the 1990s.

I’ll have more to say about the Magic phenomenon in future articles, given that it was the fuel for the most shocking deal in the history of tabletop gaming. The same year that the MicroProse Magic game came out, a swaggering, cash-flush Wizards of the Coast bought a teetering, cash-strapped TSR, who had seen the market for Dungeons & Dragons all but destroyed by Richard Garfield’s little card game. This event would have enormous repercussions on virtual as well as physical desktops, occurring as it did just after Interplay Entertainment had been awarded the license to make the next generation of Dungeons & Dragons computer games.

For today, though, let me warmly recommend the MicroProse Magic — if you can see your way to getting it running, that is. (See below for more on that subject.) Despite my quibbles about the ways in which it could have been even better, Shandalar remains almost as addictive for me today as the card game was for so many teenagers of the 1990s, only far less expensively so. When I pulled it up again to capture screenshots for this article, I blundered into a duel and just had to see it out. Ditto the next one, and then the one after that. Don’t say I didn’t warn you.


Where to Get It: The MicroProse Magic: The Gathering is unfortunately not an easy game to acquire or get running; the former difficulty is down to the complications of licensing, which have kept it out of digital-download stores like GOG.com, while the latter is down to its status as a very early Windows 95 game, from before DirectX was mature and before many standards for ensuring backward compatibility existed. Because I’d love for you to be able to play it, though, I’ll tell you how I got it working. Fair warning: it does take a bit of effort. But you don’t need to be a technical genius to make it happen. You just have to take it slow and careful.

  1. First of all, you’re going to need a virtual machine running Windows XP. This is not as onerous an undertaking as you might expect. I recommend a video tutorial from TheHowToGuy123, which walks you step by step through installing the operating system under Oracle VirtualBox in a very no-nonsense way.
  2. Next you need an image of the Magic CD. As of this writing, a search for “Magic The Gathering MicroProse” on archive.org will turn one up. Note that these procedures assume you are installing the original game, not Duels of the Planeswalkers. The patches you install will actually update it to that version.
  3. Boot up your virtual Windows XP machine and mount the Magic image from the VirtualBox “Devices” menu. Ignore the warning about not being on Windows 95 and choose “Install” from the window that pops up. Take the default options and let it do its thing. Do not install DirectX drivers and do not watch the tutorial; it won’t work anyway.
  4. Now you need to patch the game — twice, in fact. You can download the first patch from this very site. Mount the image containing the patch in VirtualBox and open the CD drive in Windows Explorer. You’ll see three executable files there, each starting with “MTGV125.” Drag all three to your desktop, then double-click them from there to run them one at a time. You want to “Unzip” each into the default directory.
  5. Restart your virtual Windows XP machine.
  6. Now you need the second patch, which you can also get right here. Mount this disk image on your virtual machine, create a folder on its desktop, and copy everything in the image into that folder. Double-click “Setup” from the desktop folder and wait a minute or two while it does its thing.
  7. Now copy everything from that same folder on your desktop into “C:\Magic\Program,” selecting “Yes to All” at the first warning prompt to overwrite any files that already exist there. If you see an error message about open file handles or the like, restart your virtual machine and try again.
  8. Here’s where it gets a little weird. The “Shandalar” entry on your Start menu is no longer pointing to the Shandalar game, but rather to the multiplayer engine. Go figure. To fix this, navigate into “C:\Magic\Program,” find “shandalar.exe,” and make a shortcut to it on your desktop. Double-click this to play the game. If it complains about a lack of swap space, just ignore it and go on.
  9. You’ll definitely want the manual as well.

Shandalar, the Deck Builder, and the single-player Duel app should all work now. The first does still have some glitches, such as labels that don’t always appear in town menus, but nothing too devastating (he says, having spent an inordinate amount of time… er, testing it thoroughly). I haven’t tested multiplayer, but it would surprise me if it still works. Alas, the cheesily charming tutorial is a complete bust with this setup; you can watch it on YouTube if you like.

Note that this is just one way to get Magic running on a modern computer, the one that worked out for me. Back in 2010, a group of fans made a custom version that ran seamlessly under Windows 7 without requiring a virtual machine, but it’s my understanding that that version doesn’t work under more recent versions of the operating system. Sigh… retro-gaming in the borderlands between the MS-DOS and Windows eras is a bit like playing Whack-a-Mole sometimes. If you have any other tips or tricks, by all means, share them in the comments.





Sources: the book Sid Meier’s Memoir!: A Life in Computer Games by Sid Meier with Jennifer Lee Noonan; Computer Gaming World of June 1995, August 1996, May 1997, June 1997, and May 1998; and Soren Johnson’s interview with Sid Meier on his Designer Notes podcast.

 

Posted by on September 22, 2023 in Digital Antiquaria, Interactive Fiction

 


Sequels in Strategy Gaming, Part 1: Civilization II

How do you make a sequel to a game that covers all of human history?

— Brian Reynolds

At the risk of making a niche website still more niche, allow me to wax philosophical for a moment on the subject of those Roman numerals that have been appearing just after the names of so many digital games almost from the very beginning. It seems to me that game sequels can be divided into two broad categories: the fiction-driven and the systems-driven.

Like so much else during gaming’s formative years, fiction-driven sequels were built off the example of Hollywood, which had already discovered that no happily ever after need ever be permanent if there was more money to be made by getting the old gang of heroes back together and confronting them with some new threat. Game sequels likewise promised their players a continuation of an existing story, or a new one that took place in a familiar setting with familiar characters. Some of the most iconic names in 1980s and early 1990s gaming operated in this mode: Zork, Ultima, Wizardry, King’s Quest, Carmen Sandiego, Leisure Suit Larry, Wing Commander. As anyone who has observed the progress of those series will readily attest, their technology did advance dramatically over the years. And yet this was only a part of the reason people stayed loyal to them. Gamers also wanted to get the next bit of story out of them, wanted to do something new in their comfortingly recognizable worlds. Unsurprisingly, the fiction-driven sequel was most dominant among games that foregrounded their fictions — namely the narrative-heavy genres of the adventure game and the CRPG.

But there was another type of sequel, which functioned less like a blockbuster Hollywood franchise and more like the version numbers found at the end of other types of computer software. It was the domain of games that were less interested in their fictions. These sequels rather promised to do and be essentially the same thing as their forerunner(s), only to do and be it even better, taking full advantage of the latest advances in hardware. Throughout the 1980s and well into the 1990s, the technology- or systems-driven sequel was largely confined to the field of vehicular simulations, a seemingly fussily specific pursuit that was actually the source in some years of no less than 25 percent of the industry’s total revenues. The poster child for the category is Microsoft’s Flight Simulator series, the most venerable in the entire history of computer gaming, being still alive and well as I write these words today, almost 43 years after it debuted on the 16K Radio Shack TRS-80 under the imprint of its original publisher subLogic. If you were to follow this franchise’s evolution through each and every installment, from that monochrome, character-graphic-based first specimen to today’s photo-realistic feast for the senses, you’d wind up with a pretty good appreciation of the extraordinary advances personal computing has undergone over the past four decades and change. Each new Flight Simulator didn’t so much promise a new experience as the same old one perfected, with better graphics, better sound, a better frame rate, better flight modeling, and so on. When you bought the latest Flight Simulator — or F-15 Strike Eagle, or Gunship, or Falcon — you did so hoping it would take you one or two steps closer to that Platonic ideal of flying the real thing. (The fact that each installment was so clearly merely a step down that road arguably explains why these types of games have tended to age more poorly than others, and why you don’t find nearly as many bloggers and YouTubers rhapsodizing about old simulations today as you do games in most other genres.)

For a long time, the conventional wisdom in the industry held that strategy games were a poor fit with both of these modes of sequel-making. After all, they didn’t foreground narrative in the same way as adventures and CRPGs, but neither were they so forthrightly tech-centric as simulations. As a result, strategy games — even the really successful ones — were almost always standalone affairs.

But all that changed in a big way in 1993, when Maxis Software released SimCity 2000, a sequel to its landmark city-builder of four years earlier. SimCity 2000 was a systems-driven sequel in the purest sense. It didn’t attempt to be anything other than what its predecessor had been; it just tried to be a better incarnation of that thing. Designer Will Wright had done his level best to incorporate every bit of feedback he had received from players of his original game, whilst also taking full advantage of the latest hardware to improve the graphics, sound, and interface. “Is SimCity 2000 a better program than the original SimCity?” asked Computer Gaming World magazine rhetorically. “It is without question a superior program. Is it more fun than the original SimCity? It is.” Wright was rewarded for his willingness to revisit his past with another huge hit, even bigger than his last one.

Other publishers greeted SimCity 2000’s success as something of a revelation. At a stroke, they realized that the would-be city planners and generals among their customers were as willing as the would-be pilots and submarine captains to buy a sequel that enhanced a game they had already bought before, by sprucing up the graphics, addressing exploits, incongruities, and other weaknesses, and giving them some additional complexity to sink their teeth into. For better or for worse, the industry’s mania for franchises and sequels thus came to encompass strategy games as well.

In the next few articles, I’d like to examine a few of the more interesting results of this revelation — not SimCity 2000, a game about which I have oddly little to say, but another trio that would probably never have come to be without it to serve as a commercial proof of concept. All of the games I’ll write about are widely regarded as strategy classics, but I must confess that I can find unreserved love in my heart for only one of them. As for which one that is, and the reasons for my slight skepticism about the others… well, you’ll just have to read on and see, won’t you?


Civilization, Sid Meier’s colossally ambitious and yet compulsively playable strategy game of everything, was first released by MicroProse Software just in time to miss the bulk of the Christmas 1991 buying season. That would have been the death knell of many a game, but not this one. Instead Civilization became the most celebrated computer game since SimCity in terms of mainstream-media coverage, even as it also became a great favorite with the hardcore gamers. Journalists writing for newspapers and glossy lifestyle magazines were intrigued by it for much the same reason they had been attracted to SimCity, because its sweeping, optimistic view of human Progress writ large down through the ages marked it in their eyes as something uniquely high-toned, inspiring, and even educational in a cultural ghetto whose abiding interest in dwarfs, elves, and magic spells left outsiders like them and their readers nonplussed. The gamers loved it, of course, simply because it could be so ridiculously fun to play. Never a chart-topping hit, Civilization became a much rarer and more precious treasure: a perennial strong seller over months and then years, until long after it had begun to look downright crude in comparison to all of the slick multimedia extravaganzas surrounding it on store shelves. It eventually sold 850,000 copies in this low-key way.

Yet neither MicroProse nor Sid Meier himself did anything to capitalize on its success for some years. The former turned to other games inside and outside of the grand-strategy tent, while the latter turned his attention to C.P.U. Bach, a quirky passion project in computer-generated music that wasn’t a game at all and didn’t even run on conventional computers. (Its home was the 3DO multimedia console.) The closest thing to a Civilization sequel or expansion in the three years after the original game’s release was Colonization, a MicroProse game from designer Brian Reynolds that borrowed some of Civilization‘s systems and applied them to the more historically grounded scenario of the European colonization of the New World. The Colonization box sported a blurb declaring that “the tradition of Civilization continues,” while Sid Meier’s name became a possessive prefix before the new game’s title. (Reynolds’s own name, by contrast, was nowhere to be found on the box.) Both of these were signs that MicroProse’s restless marketing department felt that the legacy of Civilization ought to be worth something, even if it wasn’t yet sure how best to make use of it.

Colonization hit the scene in 1994, one year after SimCity 2000 had been accorded such a positive reception, and proceeded to sell an impressive 300,000 copies. These two success stories together altered MicroProse’s perception of Civilization forever, transforming what had started as just an opportunistic bit of marketing on Colonization‘s box into an earnest attempt to build a franchise. Not one but two new Civilization games were quickly authorized. The one called CivNet was rather a stopgap project, which transplanted the original game from MS-DOS to Windows and added networked or hot-seat multiplayer capabilities to the equation. The other Civilization project was also to run under Windows, but was to be a far more extensive revamping of the original, making it bigger, prettier, and better balanced than before. Its working title of Civilization 2000 made clear its inspiration. Only at the last minute would MicroProse think better of making SimCity 2000‘s influence quite so explicit, and rename it simply Civilization II.

Unfortunately for MicroProse’s peace of mind, Sid Meier, a designer who always followed his own muse, said that he had no interest whatsoever in repeating himself just then. Thus the project devolved to Brian Reynolds as the logical second choice: he had acquitted himself well with Colonization, and Meier liked him a lot and would at least be willing to serve as his advisor, as he had for Reynolds’s first strategy game. “They pitched it to me as if [they thought] I was probably going to be really upset,” laughs Reynolds. “I guess they thought I had my heart set on inventing another weird idea like Colonization. ‘Okay, will he be too mad if we tell him that we want him to do Civilization 2000?’ Which of course to me was the ultimate dream job. You couldn’t have asked me to do something I wanted to do more than make a version of Civilization.”

Like his mentor Meier, Reynolds was an accomplished programmer as well as game designer. This allowed him to do the initial work of hammering out a prototype on his own — from, of all locations, Yorkshire, England, where he had moved to be with his wife, an academic who was there on a one-year Fulbright scholarship. While she went off to teach and be taught every day, he sat in their little flat putting together the game that would transform Civilization from a one-off success into the archetypal strategy franchise.

Brian Reynolds

As Reynolds would be the first to admit, Civilization II is more of a nuts-and-bolts iteration on what came before than any wild flight of fresh creativity. He approached his task as a sacred trust. Reynolds:

My core vision for Civ II was not to be the guy that broke Civilization. How can I make each thing a little bit better without breaking any of it? I wanted to make the AI better. I wanted to make it harder. I wanted to add detail. I wanted to pee in all the corners. I didn’t have the idea that we were going to change one thing and everything else would stay the same. I wanted to make everything a little bit better. So, I both totally respected [Civilization I] as an amazing game, and thought, I can totally do a better job at every part of this game. It was a strange combination of humility and arrogance.

Reynolds knew all too well that Civilization I could get pretty wonky pretty quickly when you drilled down into the details. He made it his mission to fix as many of these incongruities as possible — both the ones that could be actively exploited by clever players and the ones that were just kind of weird to think about.

At the top of his list was the game’s combat system, the source of much hilarity over the years, what with the way it made it possible — not exactly likely, mind you, but possible — for a militia of ancient spearmen to attack and wipe out a modern tank platoon. This was a result of the game’s simplistic “one hit and done” approach to combat. Let’s consider our case of a militia attacking tanks. A militia has an attack strength of one, a tank platoon a defense strength of five. The outcome of the confrontation is determined by adding these numbers together, then taking each individual unit’s strength as its chance of destroying the other unit rather than being destroyed itself. In this case, then, our doughty militia men have a one-in-six chance of annihilating the tanks rather than vice versa — not great odds, to be sure, but undoubtedly better than those they would enjoy in any real showdown.
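That one-in-six calculation is simple enough to sketch in a few lines of Python. This is only a reconstruction of the rule as described above, not MicroProse’s actual code, which was never published:

```python
import random

def civ1_attack_wins(attack, defense, rng=random.random):
    """One-round Civilization I combat, as described in the text:
    the attacker destroys the defender with probability
    attack / (attack + defense); otherwise the attacker is destroyed."""
    return rng() < attack / (attack + defense)

# A militia (attack strength 1) assaulting a tank platoon
# (defense strength 5) wins one time in six on average.
p = 1 / (1 + 5)
print(f"militia vs. tanks: {p:.3f}")  # → 0.167
```

Since the losing unit is removed from the board either way, an attacker facing those odds needs six tries on average before the spearmen finally get their tank.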

It was economic factors that made this state of affairs truly unbalancing. A very viable strategy for winning Civilization every single time was the “barbarian hordes” approach: forgo virtually all technological and social development, flood the map with small, primitive cities, then use those cities to pump out huge numbers of primitive units. A computer opponent diligently climbing the tech tree and developing its society over a broader front would in time be able to create vastly superior units like tanks, but would never come close to matching your armies in quantity. So, you could play the law of averages: you might have to attack a given tank platoon five times or more with different militias, but you knew that you would eventually destroy it, as you would the rest of your opponent’s fancy high-tech military with your staggering numbers of bottom feeders. The barbarian-horde strategy made for an unfun way to play once the joy of that initial eureka moment of discovering it faded, yet many players found the allure of near-certain victory on even the highest difficulty levels hard to resist. Part of a game designer’s job is to save players like this from themselves.

This was in fact the one area of Civilization II that Sid Meier himself dived into with some enthusiasm. He’d been playing a lot of Master of Magic, yet another MicroProse game that betrayed an undeniable Civilization influence, although unlike Colonization it was never marketed on the basis of those similarities. When two units met on the world map in Master of Magic, a separate tactical-battle screen opened up for you to manage the fight. Meier went so far as to prototype such a system for Civilization II, but gave up on it in the end as a poor fit with the game’s core identity. “Being king is the heart of Civilization,” he says. “Slumming as a lowly general puts the player in an entirely different story (not to mention violates the Covert Action rule). Win-or-lose battles are not the only interesting choice on the path to good game design, but they’re the only choice that leads to Civ.”

With his mentor having thus come up empty, Brian Reynolds addressed the problem via a more circumspect complication of the first game’s battle mechanics. He added a third and fourth statistic to each unit: firepower and hit points. Now, instead of being one-and-done, each successful “hit” would merely subtract the one unit’s firepower from the other’s total hit points, and then the battle would continue until one or the other reached zero hit points. The surviving unit would quite possibly exit the battle “wounded” and would need some time to recuperate, adding another dimension to military strategy. It was still just barely possible that a wildly inferior unit could defeat its better — especially if the latter came into a battle already at less than its maximum hit points — but such occurrences became the vanishingly rare miracles they ought to be. Consider: Civilization II‘s equivalent of a militia — renamed now to “warriors” — has ones across the board for all four statistics; a tank platoon, by contrast, has an attack strength of ten, a defense strength of five, a firepower of one, and three hit points when undamaged. This means that a group of ancient warriors needs to roll the same lucky number three times in a row on a simulated six-sided die in order to attack an undamaged tank platoon and win. A one-in-six chance has become one chance in 216 — odds that we can just about imagine applying in the real world, where freak happenstances really do occur from time to time.
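The new odds can be verified with a short, exact recursion over the possible battle states. Again, this is a back-of-the-envelope sketch of the rule as described, not the game’s own code:

```python
from fractions import Fraction

def civ2_win_prob(attack, defense, atk_fp, atk_hp, def_fp, def_hp):
    """Exact probability that the attacker wins a Civilization II battle
    under the mechanic described in the text: each round the attacker
    scores a hit with probability attack / (attack + defense); a hit
    subtracts the scorer's firepower from the other side's hit points,
    and the battle repeats until one side reaches zero."""
    if def_hp <= 0:
        return Fraction(1)
    if atk_hp <= 0:
        return Fraction(0)
    p_hit = Fraction(attack, attack + defense)
    win = civ2_win_prob(attack, defense, atk_fp, atk_hp, def_fp, def_hp - atk_fp)
    lose = civ2_win_prob(attack, defense, atk_fp, atk_hp - def_fp, def_fp, def_hp)
    return p_hit * win + (1 - p_hit) * lose

# Warriors (attack 1, firepower 1, 1 hit point) attacking an undamaged
# defending tank platoon (defense 5, firepower 1, 3 hit points):
print(civ2_win_prob(1, 5, 1, 1, 1, 3))  # → 1/216
```

The recursion confirms the figure in the text: with only one hit point of their own, the warriors must win three consecutive one-in-six rolls, for a combined chance of 1 in 216.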

This change was of a piece with those Reynolds introduced at every level of the game — pragmatic and judicious, evolutionary rather than revolutionary in spirit. I won’t enumerate them exhaustively here, but will just note that they were all very defensible if not always essential in this author’s opinion.

Civilization II was written for Windows 3, and uses that operating system’s standard interface elements.

The layers of the program that were not immediately visible to the player got an equally judicious sprucing up — especially diplomacy and artificial intelligence, areas where the original had been particularly lacking. The computer players became less erratic in their interactions with you and with one another; no longer would Mahatma Gandhi go to bed one night a peacenik and wake up a nuke-spewing madman. Combined with other systemic changes, such as a rule making it impossible for players to park their military units inside the city boundaries of their alleged allies, these improvements made it much less frustrating to pursue a peaceful, diplomatic path to victory — made it less likely, that is to say, that the other players would annoy you into opening a can of Gandhi-style whoop-ass on them just to get them out of your hair.

In addition to the complications that were introduced to address specific weaknesses of the first game, Civilization II got a whole lot more stuff for the sake of it: more nationalities to play and play against (21 instead of 14); more advances to research (89 instead of 71); more types of units to move around the map (51 instead of 28); a bewildering variety of new geological, biological, and ecological parameters to manipulate to ensure that the game built for you just the sort of random world that you desired to play in; even a new, ultra-hard “Deity” difficulty level to address Reynolds’s complaint that Meier’s Civilization was just too easy. There was also a new style of government added to the original five: “Fundamentalism” continued the tradition of mixing political, economic, and now religious ideologies indiscriminately, with all of them seen through a late-twentieth-century American triumphalist lens that might have been offensive if it weren’t so endearingly naïve in its conviction that the great debates down through history about how human society can be most justly organized had all been definitively resolved in favor of American-style democracy and capitalism. And then the game got seven new Wonders of the World to add to the existing 21. Like their returning stablemates, they were a peculiar mix of the abstract and the concrete, from Adam Smith’s Trading Company (there’s that triumphalism again!) in the realm of the former to the Eiffel Tower in that of the latter.

Reynolds’s most generous move of all was to crack open the black box of the game for its players, turning it into a toolkit that let them try their own hands at strategy-game design. Most of the text and vital statistics were stored in plain-text files that anyone could open up in an editor and tinker with. Names could be changed, graphics and sounds could be replaced, and almost every number in the game could be altered at will. MicroProse encouraged players to incorporate their most ambitious “mods” into set-piece scenarios, which replaced the usual randomized map and millennia-spanning timeline with a more focused premise. Scenarios dealing with Rome during the time of transition from Republic to Empire and World War II in Europe were included with the game to get the juices flowing. In shrinking the timeline so dramatically and focusing on smaller goals, scenarios did tend to bleed away some of Civilization‘s high-concept magic and turn it into more of a typical strategic war game, but that didn’t stop the hardcore fans from embracing them. They delivered scenarios of their own about everything from Egyptian, Greek, and Norse mythology to the recent Gulf War against Iraq, from a version of Conway’s Game of Life to a cut-throat competition among Santa’s elves to become the dominant toy makers.

The ultimate expression of Brian Reynolds’s toolkit approach can be seen right there on the menu every time you start a new game of Civilization II, under the heading of simply “Cheat.” You can use it to change anything you want any time you want, at the expense of not having your high score recorded, should you earn one. At a click of the mouse, you can banish an opposing player from the game, research any advance instantly, give yourself infinite money… you name it. More importantly in the long run, the Cheat menu lets you peek behind the curtain to find out exactly what is going on at any given moment, almost like a programmer sitting in front of a debugging console. Sid Meier was shocked the first time he saw it.

Cheating was an inherent part of the game now, right on the main screen? This was not good. Like all storytelling, gaming is about the journey, and if you’re actively finding ways to jump to the end, then we haven’t made the fantasy compelling enough. A gripping novel would never start with an insert labeled, “Here’s the Last Page, in Case You Want to Read It Now.” Players who feel so inclined will instinctively find their own ways to cheat, and we shouldn’t have to help them out. I could not be convinced this was a good idea.

But Reynolds stuck to his guns, and finally Meier let him have it his way. It was, he now acknowledges, the right decision. The Cheat menu let players rummage around under the hood of the game as it was running, until some of them came to understand it practically as well as Reynolds himself. This was a whole new grade of catnip for the types of mind that tend to be attracted by big, complex strategy games like this one. Meanwhile the loss of a high score to boast about was enough to ensure that gamers weren’t unduly tempted to use the Cheat menu when playing for keeps, as it were.

Of course, the finished Civilization II is not solely a creation of Brian Reynolds. After he returned from Britain with his prototype in hand, two other MicroProse designers named Doug Kaufman and Jeff Briggs joined him for the hard work of polishing, refining, and balancing. Ditto a team of artists and even a film crew.

Yes, a film crew: the aspect of Civilization II that most indelibly dates it to the mid-1990s — even more so than its Windows 3 interface — must surely be your “High Council,” who pop up from time to time to offer their wildly divergent input on the subject of what you should be doing next. They’re played by real actors, hamming it up gleefully in video clips, changing from togas to armor to military uniforms to business suits as the centuries go by. Most bizarre of all is the entertainment advisor, played by… an Elvis Presley impersonator. What can one say? This sort of thing was widely expected to be the future of gaming, and MicroProse didn’t want to be left completely in the cold when the much-mooted merger of Silicon Valley and Hollywood finally became a reality.


Civilization II was released in the spring of 1996 to glowing reviews. Computer Gaming World gave it five stars out of five, calling it “a spectacularly addictive and time-consuming sequel.” Everything I’ve said in this article and earlier ones about the appeal, success, and staying power of Civilization I applies threefold to Civilization II. It sold 3 million copies over the five years after its release, staying on store shelves right up to the time that the inevitable Civilization III arrived to replace it. Having now thoroughly internalized the lesson that strategy games could become franchises too, MicroProse sustained interest in the interim with two scenario packs, a “Multiplayer Gold Edition” that did for Civilization II what CivNet had done for Civilization I, and another reworking called Civilization II: Test of Time that extended the timeline of the game into the distant future. Civilization as a whole thus became one of gaming’s most inescapable franchises, the one name in the field of grand strategy that even most non-gamers know.

Given all of this, and given the obvious amount of care and even love that was lavished on Civilization II, I feel a bit guilty to admit that I struggled to get into it when I played it in preparation for this article. Some of my lack of enthusiasm may be down to purely proximate causes. I played a lot of Civilization I in preparation for the long series of articles I wrote about it and the Progress-focused, deeply American worldview it embodies, and the sequel is just more of the same from this perspective. If I’d come to Civilization II cold, as did the majority of those 3 million people who bought it, I might well have had a very different experience with it.

Still, I do think there’s a bit more to my sense of vague dissatisfaction than just a jaded player’s ennui. I miss one or two bold leaps in Civilization II to go along with all of the incrementalist tinkering. Its designers made no real effort to address the big issues that dog games of this ilk: the predictable tech tree that lends itself to rote strategies, the ever more crushing burden of micromanagement as your empire expands, and an anticlimactic endgame that can go on for hours after you already know you’re going to win. How funny to think that Master of Orion, another game published by MicroProse, had already done a very credible job of addressing all of these problems three years before Civilization II came to be!

Then, too, Civilization II may be less wonky than its predecessor, but I find that I actually miss the older game’s cock-eyed jeu d’esprit, of which those ancient militias beating up on tanks was part and parcel. Civilization II‘s presentation, using the stock Windows 3 menus and widgets, is crisper and cleaner, but only adds to the slight sense of sterility that dogs the whole production. Playing it can feel rather like working a spreadsheet at times — always a danger in these kinds of big, data-driven strategy games. Those cheesy High Council videos serve as a welcome relief from the austerity of it all; if you ask me, the game could have used some more of that sort of thing.

I do appreciate the effort that went into all the new nationalities, advances, units, and starting parameters. In the end, though, Civilization II only provides further proof for me — as if I needed it — that shoehorning more stuff into a game doesn’t always or even usually make it better, just slower and more ponderous. In this sense too, I prefer its faster playing, more lovably gonzo predecessor. It strikes me that Civilization II is more of a gamer’s game, emphasizing min-maxing and efficient play above all else, at the expense of the original’s desire to become a flight of the imagination, letting you literally write your own history of a world. Sid Meier liked to call his game first and foremost “an epic story.” I haven’t heard any similar choice of words from Brian Reynolds, and I’ve definitely never felt when playing Civilization I that it needed to be harder, as he did.

I hasten to emphasize, however, that mine is very much a minority opinion. Civilization II was taken up as a veritable way of life by huge numbers of strategy gamers, some of whom have refused to abandon it to this day, delivering verdicts on the later installments in the series every bit as mixed as my opinions about this one. Good for them, I say; there are no rights or wrongs in matters like these, only preferences.


Postscript: The Eternal War

In 2012, a fan with the online handle of Lycerius struck a chord with media outlets all over the world when he went public with a single game of Civilization II which he had been playing on and off for ten years of real time. His description of it is… well, chilling may not be too strong a word.

The world is a hellish nightmare of suffering and devastation. There are three remaining super nations in AD 3991, each competing for the scant resources left on the planet after dozens of nuclear wars have rendered vast swaths of the world uninhabitable wastelands.

The ice caps have melted over 20 times, due primarily to the many nuclear wars. As a result, every inch of land in the world that isn’t a mountain is inundated swampland, useless to farming. Most of which is irradiated anyway.

As a result, big cities are a thing of the distant past. Roughly 90 percent of the world’s population has died either from nuclear annihilation or famine caused by the global warming that has left absolutely zero arable land to farm. Engineers are busy continuously building roads so that new armies can reach the front lines. Roads that are destroyed the very next turn. So, there isn’t any time to clear swamps or clean up the nuclear fallout.

Only three massive nations are left: the Celts (me), the Vikings, and the Americans. Between the three of us, we have conquered all the other nations that have ever existed and assimilated them into our respective empires.

You’ve heard of the 100 Year War? Try the 1700 Year War. The three remaining nations have been locked in an eternal death struggle for almost 2000 years. Peace seems to be impossible. Every time a ceasefire is signed, the Vikings will surprise-attack me or the Americans the very next turn, often with nuclear weapons. So, I can only assume that peace will come only when they’re wiped out. It is this that perpetuates the war ad infinitum.

Because of SDI, ICBMs are usually only used against armies outside of cities. Instead, cities are constantly attacked by spies who plant nuclear devices which then detonate. Usually the downside to this is that every nation in the world declares war on you. But this is already the case, so it’s no longer a deterrent to anyone, myself included.

The only governments left are two theocracies and myself, a communist state. I wanted to stay a democracy, but the Senate would always overrule me when I wanted to declare war before the Vikings did. This would delay my attack and render my turn and often my plans useless. And of course the Vikings would then break the ceasefire like clockwork the very next turn. I was forced to do away with democracy roughly a thousand years ago because it was endangering my empire. But of course the people hate me now, and every few years since then, there are massive guerrilla uprisings in the heart of my empire that I have to deal with, which saps resources from the war effort.

The military stalemate is airtight, perfectly balanced because all remaining nations already have all the technologies, so there is no advantage. And there are so many units at once on the map that you could lose twenty tank units and not have your lines dented because you have a constant stream moving to the front. This also means that cities are not only tiny towns full of starving people, but that you can never improve the city. “So you want a granary so you can eat? Sorry! I have to build another tank instead. Maybe next time.”

My goal for the next few years is to try to end the war and use the engineers to clear swamps and fallout so that farming may resume. I want to rebuild the world. But I’m not sure how.

One can’t help but think about George Orwell’s Oceania, Eurasia, and Eastasia when reading of Lycerius’s three perpetually warring empires. Like Nineteen Eighty-Four, his after-action report has the uncanny feel of a dispatch from one of our own world’s disturbingly possible futures. Many people today would surely say that recent events have made his dystopia seem even more probable than ten years ago.

But never fear: legions of fans downloaded the saved game of the “Eternal War” which Lycerius posted and started looking for a way to end the post-apocalyptic paralysis. A practical soul who called himself “stumpster” soon figured out how to do so: “I opted for a page out of MacArthur’s book and performed my own Incheon landing.” In the game of Civilization, there is always a way. Let us hope the same holds true in reality.

(Sources: the book Sid Meier’s Memoir! by Sid Meier; Computer Gaming World of April/May 1985, November 1987, March 1993, June 1996, July 1996, and August 1996; Retro Gamer 86, 112, and 219. Online sources include Soren Johnson’s interviews with Sid Meier and Brian Reynolds, PC Gamer‘s “Complete History of Civilization,” and Huffington Post‘s coverage of Lycerius’s game of Civilization and stumpster’s resolution of the stalemate. The original text of Lycerius’s Reddit post is preserved on the Civilization II wiki.

Civilization II is not currently available for online purchase. You can, however, find it readily enough on any number of abandonware archives; some are dodgier than others, so be cautious. I recommend that you avoid the Multiplayer Gold Edition in favor of the original unless you really, really want to play with your mates. For, in a rather shocking oversight, MicroProse released the Gold Edition with bugged artificial intelligence that makes all of the computer-controlled players ridiculously aggressive and will keep you more or less constantly at war with everyone. If perpetual war is your thing, on the other hand, go for it…

Update: See Blake’s comment below for information on how to get the Multiplayer Gold Edition running with the original artificial intelligence, thereby getting the best of both worlds!

Once you’ve managed to acquire it, there’s a surprisingly easy way to run Civilization II on modern versions of Windows. You just need to install a little tool called WineVDM, and then the game should install and run transparently, right from the Windows desktop. It’s probably possible to get it running on Linux and MacOS using the standard Wine layer, but I haven’t tested this personally.)

In a feat of robust programming of which its makers deserve to be proud, Civilization II is capable of scaling to seemingly any size of screen. Here it is running on my Windows 10 desktop at a resolution of 3440 × 1440 — numbers that might as well have been a billion by a million back in 1996.

 
 


Ethics in Strategy Gaming, Part 2: Colonization

Just what do you do next after you’ve created an epic, career-defining masterpiece? That was the question facing Sid Meier after the release of Civilization in the waning days of 1991, after the gushing reviews and the impressive sales figures had begun pouring in to his employer MicroProse. How could he go back to making games that were merely about something when he had already made the game of everything? “Civilization was such a big game that it’s hard to find a topic that doesn’t feel as if you were going backwards,” he admitted in an interview in the summer of 1992. Anything he did next seemed destined to be an anticlimax.

Meier’s first decision about his future was an eminently sensible one: he would take a break. Asked what he was currently working on during that same interview, his reply was blunt: “Absolutely nothing! I’m going to take it easy for a while.” And truly, if anyone in the games industry deserved a timeout, it was him. Meier had maintained an insane pace for the last decade, acting as both lead designer and lead programmer on no fewer than 21 commercially released games, three of them — Pirates!, Railroad Tycoon, and of course Civilization — universally lauded icons whose influence has remained pervasive to this day. Indeed, those three games alone, released within five years of one another, constitute as extraordinary a creative outpouring as the field of gaming has ever known. But now, Meier was finally feeling burnt out, even as his marriage was ending — at least partially the result, no doubt, of all those years spent burning the candle at both ends. He desperately needed to catch his breath.

The Sid Meier who returned to the job months later had a new attitude toward his work. He wouldn’t try to somehow top Civilization in terms of scale and scope, but would rather use the fame and money it had brought him to work on whatever most interested him personally at any given time, whilst maintaining a much more sustainable work-life balance. Sometimes these projects would strike others — not least among them MicroProse’s management team — as almost perversely esoteric.

Never was this more the case than with his very first post-Civilization endeavor, as dramatic a departure from the expected as any game designer has ever dared to make. In fact, C.P.U. Bach wasn’t actually a game at all.


The music of Johann Sebastian Bach had long been enormously important to Meier, as he wrote in his recent memoir:

The sense I get when I listen to his work is that he’s not telling me his story, but humanity’s story. He’s sharing the joys and sorrows of his life in a more universal sense, a language that doesn’t require me to understand the specifics of his situation. I can read a book from eighteenth-century Germany, and find some amount of empathy with the historical figures inside, but there will always be a forced translation of culture, society, and a thousand other details that I can never truly understand. Bach isn’t bogged down in those things — he’s cutting straight to the heart of what we already have in common. He can reach across those three hundred years and make me, a man who manipulates electromagnetic circuits with my fingertips on a keyboard, feel just as profoundly as he made an impoverished farmer feel during a traditional rural celebration. He includes me in his story, just as I wanted to include my players in my games; we make the story together. Bach’s music is a perfect illustration of the idea that it’s not the artist that matters, but the connection between us.

Often described as the greatest single musical genius in the history of the world, Bach is as close to a universally beloved composer as one can find, as respected by jazz and rock musicians as he is in the classical concert halls. And mathematicians tend to find him almost equally alluring: the intricate patterns of his fugues illustrate the mathematical concepts that underlie all music, even as they take on a fragile beauty in their own right, outside the sound that they produce. The interior of Bach’s music is a virtual reality as compelling as any videogame, coming complete with an odd interactive quality. Meier:

He routinely used something called invertible counterpoint, in which the notes are designed to be reversible for an entirely new, but still enjoyable, sound. He also had a fondness for puzzle canons, in which he would write alternating lines of music and leave the others blank for his students — often his own children — to figure out what most logically belonged in between.

Bach even went so far as to hide codes in many of his works. Substituting place values for letters creates a numeric total of 14 for his last name, and this number is repeatedly embedded in the patterns of his pieces, as is its reverse 41, which happens to be the value of his last name plus his first two initials. His magnum opus, The Art of the Fugue, plays the letters of his name in the notes themselves (in German notation, the letter B refers to the note we call B-flat, and H is used for B-natural). At the top of one famous piece, The Well-Tempered Clavier, he drew a strange, looping flourish that scholars now believe is a coded set of instructions for how to tune the piano to play in every possible key, opening up new possibilities for variation and modulation.
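The letter arithmetic in the passage above is easy to check for yourself. The sketch below assumes the convention Bach scholars generally apply to these counts: the 24-letter Latin alphabet of Bach’s era, in which I and J share one slot and U and V another.

```python
# The 24-letter alphabet of Bach's era: no separate J or U.
ALPHABET = "ABCDEFGHIKLMNOPQRSTVWXYZ"

def letter_sum(word):
    """Sum each letter's place value, folding J into I and U into V."""
    total = 0
    for ch in word.upper():
        ch = {"J": "I", "U": "V"}.get(ch, ch)
        total += ALPHABET.index(ch) + 1
    return total

print(letter_sum("BACH"))    # → 14  (B=2, A=1, C=3, H=8)
print(letter_sum("JSBACH"))  # → 41  (J=9 and S=18 added to the 14)
```

Under this convention the two famous numbers fall out exactly as Meier describes: 14 for the surname, 41 once the initials J. S. are added.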

With C.P.U. Bach, Meier attempted to make a computer write and play “new” Bach compositions, working off of the known techniques of the master, taking advantage of the way that his musical patterns were, as Meier puts it, “both predictable and stunning.” Meier insists that he created the program with no intent to diminish his favorite composer, only to celebrate him. “Creating a computer [program] that creates art counts as a form of artistic expression itself,” he says.

To aid him in the endeavor, he enlisted one Jeff Briggs, a soundtrack composer at MicroProse. Together the two labored away for more than a year on the most defiantly artsy, uncommercial product in the history of MicroProse or of Sid Meier. They decided to publish it exclusively on the new 3DO multimedia console, another first for the company and the designer, because they couldn’t bear to hear their creation through the often low-fidelity computer sound cards of the time; by targeting the 3DO, they guaranteed that their program’s compositions would be heard by everyone in CD-quality fidelity.

Still, the end result is a bit underwhelming, managing only to provide an ironic proof of the uniquely human genius of Johann Sebastian Bach. C.P.U. Bach generates music that is pleasantly Bach-like, but it cannot recreate the ineffable transcendence of the master’s great works.

Pick a Baroque musical form, and C.P.U. Bach will compose a brand new example of same for you.

An esoteric product for a console that would itself prove a failure, C.P.U. Bach sold horribly upon its release in 1994. But Meier doesn’t apologize for having made this least likely of all possible follow-ups to Civilization: “My only regret is that [it] is essentially unplayable today, now that the physical console has become a lost relic.” Sometimes you just have to follow your muse, in game design as in music — or, in this case, in a bit of both.



While Sid Meier was first taking a breather and then pursuing his passion project, the public image of MicroProse was being transformed by Civilization. Having made their name in the 1980s as a publisher of vehicular military simulations, they suddenly became the premier publisher of strategy games in the eyes of many, taking over that crown from SSI, who had largely abandoned those roots to plunge deep into licensed Dungeons & Dragons CRPGs. MicroProse was soon inundated with submissions from outsiders who had played Civilization and wanted their strategy game to go out with the same label on the box as that one, thank you very much. By no means were all of the strategy games MicroProse came to publish as a result equally worthy, but the cream of the crop — titles like Master of Orion, Master of Magic, X-COM, and Transport Tycoon — were as creatively and commercially successful as the genre got during the first half of the 1990s.

The great irony about the MicroProse of this period is that these kinds of games, the ones with which the company was now most identified in the minds of gamers, were almost all sourced from outsiders while the company’s internal developers marched in a multitude of other directions. Much effort was still poured into making yet more hardcore flight simulators like the ones of old, a case of diminishing returns as the tension of the Cold War and the euphoria of the First Gulf War faded into the past. Other internal teams plunged into standup-arcade machines, casual “office games,” complicated CRPGs, and a line of multimedia-heavy adventure games that were meant to go toe-to-toe with the likes of Sierra and LucasArts.

These ventures ranged from modest successes to utter disasters in the marketplace, trending more toward the latter as time went on. The income from the outside-developed strategy games wasn’t enough to offset the losses; by 1993, the company was facing serious financial problems. In June of that year, Spectrum Holobyte, a company with a smaller product catalog but a large amount of venture capital, acquired MicroProse.

Many projects were cancelled in the wake of the acquisition, leaving many employees in limbo, waiting to find out whether their future held a new work assignment or a pink slip. One of this group was Brian Reynolds, a programmer and dedicated tabletop wargamer who had come to MicroProse to escape from his Berkeley graduate program in philosophy and been assigned to the now-cancelled adventure line. With nothing else to do, he started to tinker with a strategy game dealing with what he found to be one of the most fascinating subjects in all of human history: the colonization of the New World. Having never designed a grand-strategy game before, he used Civilization, his favorite example of the genre, as something of a crutch: he adapted most of its core systems to function within his more focused, time-limited scenario. (Although said scenario immediately brings to mind Dani Bunten Berry’s Seven Cities of Gold — a game which was ironically a huge influence on Meier’s Pirates!, Railroad Tycoon, and Civilization — Reynolds claims not to have had it much in mind when he started working on his own game. “I didn’t personally like it as a game,” he says. “It all felt like empty forests.”) Reynolds had little expectation that his efforts would amount to much of anything in the end. “I was just doing this until they laid me off,” he says. Although he was working in the same building as Meier, it never even occurred to him to ask for the Civilization source code. Instead he reverse-engineered it in the same way that any other hacker would have been forced to do.

Nevertheless, word of the prototype slowly spread around the office, finally reaching Meier. “Can you come talk to Sid about this?” Reynolds’s manager asked him one day. From that day forward, Colonization was an official MicroProse project.

The powers that were at the company would undoubtedly have preferred to give the reins of the project to Meier, placing Reynolds in some sort of junior design and/or programming role. But Meier was, as we’ve already seen, up to his eyebrows in Johann Sebastian Bach at the time, and was notoriously hard to corral under any circumstances. Further, his sense of fair play was finely developed. “This is your idea,” he said to Reynolds. “You deserve to have ownership of it.” He negotiated an arrangement with MicroProse’s management whereby he would serve as a design advisor, but the project as a whole would very much remain Brian Reynolds’s.


Having secured our charter…

… we set off for the New World.

The early game of exploration and settlement is in some ways the most satisfying, being free from the micromanagement that crops up later.

The map can get crowded indeed as time goes on.

Like so much in the game, the city-management screen draws heavily from Civilization, but the row of trade goods along the bottom of the screen reflects the more complex economic model.

We declare independence! Hopefully our armies are up for the war that will follow.


The finished Colonization lets you play as the British, the Spanish, the Dutch, or the French. You begin the game in that pivotal year of 1492, ready to explore and found your first colony in the Americas. In keeping with the historical theme, trade is extremely important — much more so than in the highly abstracted economic model employed by Civilization. Sugar, cotton, and tobacco — grown, processed, and shipped back to the Old World — are the key to your colonies’ prosperity. (Brian Reynolds has said only semi-facetiously that his intention with Colonization was to “combine together all the best things from Civilization and Railroad Tycoon — because that would make the game even better!”) Naturally, you have to deal with the Native Americans who already inhabit the lands into which you want to expand, as you do the other European powers who are jockeying for dominance. Your ultimate goal is to build a federation of colonies self-sufficient enough to declare independence from its mother country, an event which is always followed by a war. If you win said war, you’ve won the game. If, on the other hand, you lose the war, or fail to force an outcome to it by 1850, or fail to trigger it at all by 1800, you lose the game.
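To make the shape of those stakes concrete, the win/lose rules described above can be sketched as a simple decision function. This is purely illustrative Python of my own devising — the class and field names are hypothetical, not anything drawn from the actual game's code:

```python
from dataclasses import dataclass

# Hypothetical state snapshot; fields mirror the rules described in the text.
@dataclass
class GameState:
    year: int
    declared_independence: bool
    won_war_of_independence: bool

def outcome(state: GameState) -> str:
    """Evaluate Colonization's endgame conditions as described above."""
    if state.won_war_of_independence:
        return "victory"
    if not state.declared_independence and state.year >= 1800:
        return "defeat"  # never declared independence in time
    if state.declared_independence and state.year >= 1850:
        return "defeat"  # failed to force an outcome to the war in time
    return "in progress"

print(outcome(GameState(1799, False, False)))  # in progress
print(outcome(GameState(1800, False, False)))  # defeat
```

Note how unforgiving the structure is: there is no victory path that avoids the war entirely, which is precisely the rigidity Meier questions below.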

Even if we set aside for the moment some of the uncomfortable questions raised by its historical theme and the aspects thereof which it chooses to include and exclude, Colonization reveals itself to be a competent game but far from a great one. Sid Meier himself has confessed to some serious misgivings about the rigid path — independence by an arbitrary date or bust — down which Brian Reynolds elected to force its player:

It was a grandiose, win-or-lose proposition with the potential to invalidate hours of successful gameplay. Generally speaking, I would never risk alienating the player to that degree. It was historically accurate, however, and Brian saw it as a satisfying boss battle rather than a last-minute bait and switch, so I deferred to him. Good games don’t get made by committee.

Not only is the choice problematic from a purely gameplay perspective, but it carries unfortunate overtones of all-too-typically-American historical chauvinism in forcing the Spanish, Dutch, and French colonies to clone the experience of the British colonies that turned into the United States in order to win the game — the implication being that those colonies’ very different real histories mark them as having somehow done things wrong in contrast to the can-do Yankees.

But Colonization has plenty of other, more practical flaws. Micromanagement, that ever-present bane of so many grand-strategy games, is a serious issue here, thanks not least to the nitty-gritty complexities of the economic model; by the time you’re getting close to the point of considering independence, you’ll be so bogged down with the busywork of handing out granular work assignments to your colonists and overseeing every freight shipment back home that you’ll be in danger of losing all sense of any bigger picture. In contrast to the seamless wholeness of Civilization, Colonization remains always a game of disparate parts that don’t quite mesh. For example, the military units you can raise always seem bizarrely expensive in proportion to their potency. It takes an eternity of micro-managing tedium to build even a halfway decent military, and even when you finally get to send it out into the field you still have to spend the vast majority of your time worrying about more, shall we say, down-to-earth matters than fighting battles — like, say, whether you’ve trained enough carpenters in your cities and whether their tools are in good repair. The funnest parts of Colonization are the parts you spend the least amount of time doing.

In the end, then, Colonization never manages to answer the question of just why you ought to be playing this game instead of the more generous, open-ended, historically expansive Civilization. Computer Gaming World magazine, the industry’s journal of record at the time of the game’s release in late 1994, published a sharply negative review, saying that there was “more tedium and less care” in Colonization than in Civilization.

One might expect such a review from such an influential publication to be a game’s death knell. Surprisingly, though, Colonization did quite well for itself in the marketplace. Brian Reynolds estimates today that it sold around 300,000 copies. Although that figure strikes me as perhaps a little on the high side, there’s no question that the game was a solid success. For proof, one need only look to what Reynolds got to do next: he was given the coveted role of lead designer on Civilization II after Sid Meier, ever the iconoclast, refused it.

But here’s the odd thing: Meier’s name would appear in bold letters on the box of Civilization II, as it had on the box of Colonization before it, while that of Brian Reynolds was nowhere to be found on either. MicroProse’s marketing department had first hit upon the idea of using Meier’s name prominently back in 1987, when they’d pondered how to sell Pirates!, a game that was not only radically different from anything MicroProse had released before but was impossible to comfortably classify into any existing gaming genre. It seemed to work; Sid Meier’s Pirates! became a big hit. Since then, the official titles of most of Meier’s games had come with the same prefix. Sid Meier’s Colonization, however, was something new, marking the first time that MicroProse’s marketers assigned Meier ownership of a game he hadn’t truly designed at all. “Yes, I made suggestions along the way,” he says today, “but it had been up to Brian whether to accept them. Colonization was not Sid Meier’s game.”

And yet the name emblazoned at the top of the box stated just the opposite. Meier rationalizes this fact by claiming that “‘Sid Meier’s’ now meant ‘Sid Meier mentored and approved’ instead of ‘Sid Meier personally coded.'” But even this statement is hard to reconcile with the text on the back of the box, which speaks of “Colonization, the newest strategy game from Sid Meier [that] continues the great tradition of Civilization.” Clearly MicroProse’s marketing department, if not Meier himself, was completely eager to make the public believe that Sid Meier had designed Colonization, full stop — and, indeed, the game was received on exactly these terms by the press and public. Brian Reynolds, for his part, was happy to give his mentor all of the public credit for his work, as long as it helped the game to sell better and gave him a chance to design more games in the future. The soft-spoken, thoughtful Sid Meier, already the most unlikely of celebrities, had now achieved the ultimate in celebrity status: he had become a brand unto himself. I trust that I don’t need to dwell on the irony of this in light of his statement that “it’s not the artist that matters.”



But MicroProse’s decision to publicly credit Colonization to someone other than the person who had actually designed it is hardly the most fraught of the ethical dilemmas raised by the game. As I’ve already noted, the narrative about the colonization of the New World which it forces its player to enact is in fact the semi-mythical origin story of the United States. It’s a story that’s deeply rooted in the minds of white Americans like myself, having been planted there by the grade-school history lessons we all remember: Pilgrims eating their Thanksgiving dinner with the Indians, Bostonians dumping British tea into the ocean to protest taxation without representation, Paul Revere making his midnight ride, George Washington leading the new country to victory in war and then showing it how it ought to conduct itself in peace.

In presenting all this grade-school history as, if not quite inevitable, at least the one satisfactory course of events — it is, after all, a matter of recreating the American founding myth or losing the game — Colonization happily jettisons any and all moral complexity. One obvious example is its handling of the Native American peoples who were already living in the New World when Europeans decided to claim those lands for themselves. In the game, the Native Americans you encounter early on are an amiable if primitive and slightly dim bunch who are happy enough to acknowledge your hegemony and work for you as long as you give them cigars to smoke and stylish winter coats to wear. Later on, when they start to get uppity, they’re easy enough to put back in line using the stick instead of the carrot.

And then there’s the game’s handling of slavery — or rather its lack of same. It’s no exaggeration to say that all of the modern-day countries of North and South America were built by the sweat of slaves’ brows. Certainly the extent to which the United States in particular was shaped by what John C. Calhoun dubbed The Peculiar Institution can hardly be overstated; the country’s original sin still remains with us today in the form of an Electoral College and Senate that embody the peculiarly undemocratic practice of valuing the votes of some citizens more than those of others, not to mention the fault lines of racial animus that still fracture American politics and society. Yet the game of Colonization neatly sidesteps all of this; in its world, slavery simply doesn’t exist. Is this okay, or is it dangerous to so blithely dismiss the sins and suffering of our ancestors in a game that otherwise purports to faithfully recreate history?



Johnny L. Wilson, the editor-in-chief of Computer Gaming World, stood virtually alone among his peers in expressing concern about the thin slice of life’s rich pageant that games of the 1990s were willing and able to encompass. He alone spoke of “the preponderance of violent solutions as opposed to creative exploration and experimentation, the increasingly narrow scope of subject matter perceived as marketable, the limited nature of non-player characters and our assumptions about game players.” Unsurprisingly, then, he was the first and as it turned out the only gaming journalist of his era to address Colonization not just as a good, bad, or indifferent game in the abstract, but as a rhetorical statement about the era which it attempted to recreate, whether it wished to be such a thing or not. (As the school of Deconstructionism constantly reminds us, it’s often the works that aren’t actually trying to say anything at all about a subject which end up having the most to tell us about their makers’ attitudes toward it…) Wilson raised his concerns before Colonization was even released, when it existed only in a beta version sent to magazines like his.

Two upcoming games on the colonial era will excise slavery from the reality they are simulating: Sid Meier’s Colonization from MicroProse and Impressions’ High Seas Trader. Both design teams find the idea of slavery, much less the institution of slavery, to be repugnant, and both teams resist the idea of “rewarding” the gamer for behavior which is and was abominable.

This reminds me of the film at Mount Vernon where the narration explains that Washington abhorred slavery, so he left wording in his will so that, upon his and Martha’s deaths, his slaves would be freed. To me, that’s tantamount to saying, “I’ll correct this immoral practice as soon as it doesn’t cost me anything anymore!”

It is obvious that George didn’t find it economically viable to be moral in that circumstance. So, if slavery was such an important facet of the colonial economy that even the “father of our country” couldn’t figure out how to build a successful business without it, how do we expect to understand the period in which he lived without having the same simulated tools at our disposal? Maybe we would have some belated appreciation for those early slaves if we didn’t try to ignore the fact of their existence.

Of course, we know what the answer is going to be. The game designers will say that they “only put in the cool parts” of history. We hear that. Yet, while there is nothing wrong with emphasizing the most entertaining parts of a historical situation, there is a danger in misrepresenting that historical situation. Maybe it doesn’t add credibility to the revisionist argument that Auschwitz never happened when we remove the Waffen SS from a computer game, but what happens when someone removes Auschwitz from the map? What happens when it is removed from the history books?

Removing the horrors of history from computer games may not be a grand conspiracy to whitewash history, but it may well be a dangerous first step.

Wilson’s editorial prompted an exchange in the reader-letters section of a subsequent issue. I’d like to reprint it in an only slightly edited form here because the points raised still pop up regularly today in similar discussions. We begin with a letter from one Ken Fishkin, who takes exception to Wilson’s position.

Johnny Wilson seems to have forgotten that the primary purpose of a game is to entertain. Computer games routinely engage in drastic alterations, simplifications, and omissions of history. Railroad Tycoon omitted Chinese labor and union strife. In SimCity, the mayor is an absolute dictator who can blithely bulldoze residential neighborhoods and churches with a mere click of the mouse, and build the Golden Gate Bridge in weeks instead of decades. In Sid Meier’s Civilization, Abraham Lincoln is immortal, phalanxes can sink battleships, and religious strife, arguably the single most important factor in the history of international relations, is totally omitted. And yet Computer Gaming World gave these games its highest praise, placing all of them in its Hall of Fame!

It is hypocritical of Computer Gaming World to criticize Sid Meier’s Colonization in the same issue in which it effusively praises Sid Meier’s Civilization. Computer Gaming World used to know that computer games shouldn’t be held to the same standards of historical accuracy as a textbook.

The magazine’s editorial staff — or really, one has to suspect, Wilson himself — replied thusly:

Is it hypocritical? The same Johnny Wilson that wrote the column had an entire chapter in The SimCity Planning Commission Handbook which talked about the realities that were not simulated (along with some elaborate workarounds that would enable gamers to see how much had been abstracted) and he also questioned certain historical abstractions in [his Civilization strategy guide] Rome on 640K a Day. Do these citations seem hypocritical? Different games have different levels of perspective and different levels of abstraction. Their success or failure will always depend on the merit of their gameplay, but that doesn’t mean we shouldn’t consider their historical/factual underpinning as well.

Even if certain historical/real aspects have to be abstracted for the sake of gameplay, the designers have a responsibility to acknowledge, tip their hat to, or clarify those conditions which they have abstracted. When it comes to orders of battle and dominant practices, they should be addressed in some way and not ignored because they are inconvenient. We agree that a game should be balanced enough to play well, but the lessons of history should not be totally glossed over. We fear that there is a tendency of late to do just that.

Finally, we have a letter from Gilbert L. Brahms, writing in support of Wilson’s position.

Your theses are very well-taken. Computer games become nothing [more] than schlock entertainment if they strip realism from historical recreations. There is no point in presenting any [game] referring to World War II Germany without presenting Nazism in all its symbology, nay, without including the imagery which ensorcelled those desperate and gullible Germans of the time into surrendering themselves “mit ganzen Willen” to Hitler’s blandishments.

The sins of the past are not eradicated by repression; in fact, they become all the more fascinating for having become forbidden fruit. Only critical confrontation can clarify such atrocities as occurred in the 1940s and can tutor us to resist such temptations again, in ourselves as well as in others.

If, therefore, a computer game should truly aspire to become a work of art, it must fulfill both the recreative and the didactive functions inherent in all serious aesthetic productions: it must present horrible conflicts with all of their nasty details.

I’ll return to the arguments presented above in due course. Before I do that, though, I’d like to take a brief leap forward in time.

In 2008, Firaxis Games — a company founded by Sid Meier, Brian Reynolds, and Jeff Briggs — announced a new version of Colonization, which once again chose to present Native Americans as dim-witted primitives and to completely ignore the historical reality of slavery. Even before its release, Ben Fritz, a gaming blogger for Variety, loudly attacked it for having committed the vaguely defined, all-purpose crime of being “offensive.” Fritz’s blog post is neither well-argued nor well-written — “I literally exclaimed ‘holy sh*t’ out loud when I was reading an email this morning,” goes its unpromising beginning — so I won’t bother to quote more from it here. But it was a harbinger of the controversy to come, which came to dominate the critical discussion around the new Colonization to the point that its qualities as a mere game were all but ignored. Firaxis published the following terse missive in a fruitless attempt to defuse the situation:

For seventeen years the Civilization series has given people the opportunity to create their own history of the world. Colonization deals with a specific time in global history, and treats the events of that time with respect and care. As with all previous versions of Civilization, the game does not endorse any particular position or strategy – players can and should make their own moral judgments. Firaxis keeps the player at the center of the game by providing them with interesting choices and decisions to make, which has proven to be a fun experience for millions of people around the world.

Whatever its merits or lack thereof, this argument was largely ignored. The cat was now well and truly out of the bag, and many academics in particular rushed to criticize the “gamification of imperialism” that was supposedly at the core of even the original game of Civilization. In his recent memoir, Sid Meier describes their critiques with bemusement and more than a touch of condescension.

This philosophical analysis quickly spread to my older titles — or as one paper described them, my “Althusserian unconscious manifestations of cultural claims” with “hidden pedagogical aspirations.” Pirates! wasn’t about swashbuckling, it turned out, but rather “asymmetrical and illegal activities [that] seem to undermine the hierarchical status quo while ultimately underlining it.” Even C.P.U. Bach was accused of revealing “a darker side to the ideological sources at work behind ludic techniques.”

All I can say is that our motives were sincere, and maybe these guys have a little too much time on their hands.

For all that I’m usually happy to make fun of the impenetrable writing which too many academics use to disguise banal ideas, I won’t waste space shooting those fish in a barrel here. It’s more interesting to consider the differing cultural moments exemplified by the wildly divergent receptions of the two versions of Colonization — from a nearly complete silence on the subject of the potentially problematic aspects of its theme and implementation thereof to red-faced shouting matches all over the Internet on the same subjects. Through this lens, we can see how much more seriously people came to take games over a span of fourteen years, as well as how much more diverse the people playing and writing about them became. And we can also see, of course, how the broader dialog around history changed.

Those changes have only continued and, if anything, accelerated in the time since 2008; I write these words at the close of a year in which the debates surrounding our various historical legacies have become more charged than ever. One side accuses the other of ignoring all of the positive aspects of the past and trying to “cancel” any historical figure who doesn’t live up to its fashionable modern ideals of “wokeness.” Meanwhile the opposing side accuses its antagonists of being far too eager to all too literally whitewash the past and make excuses for the reprehensible conduct of its would-be heroes. Mostly, though, the two sides prefer just to call one another nasty names.



So, rather than wading further into that morass, let’s return to the arguments I reprinted without much commentary above, applying them now not only to Colonization but also to Panzer General, the subject of my first article in this two-part series. It strikes me that the best way to unpack a subtle and difficult subject might be to consider in turn each line of argument supporting the claim that Colonization — and by implication Panzer General — are fine just as they are. We’ll begin with the last of them: Firaxis’s corporate response to the controversy surrounding the second Colonization.

Said response can be summed up as the “it’s not the game, it’s the player!” argument. It’s long been trotted out in defense of a huge swath of games with objectionable or potentially objectionable content; Peter Molyneux was using it to defend the ultra-violence in Syndicate as early as 1993, and there are doubtless examples that predate even that one. The core assertion here is that the game doesn’t force the player’s hand at all — that in a game like, say, Grand Theft Auto it’s the player who chooses to indulge in vehicular mayhem instead of driving politely from place to place like a law-abiding citizen.

Of course, this argument can’t be used as an equally efficacious escape hatch for all games. While Panzer General will allow you to command the Allied forces if you play a single scenario, the grand campaign which is the heart of that game’s appeal only allows you to play a Nazi general, and certainly gives you no option to turn against the Nazi cause at some point, as Erwin Rommel may or may not have done, beyond the obvious remedy of shutting off the computer. But Colonization does appear to do a little better on this front, at least at first glance. As many defenders of the game are at pains to point out, you can choose to treat the Native Americans you encounter relatively gently in comparison to the European colonizers of recorded history (admittedly, not really a high bar to clear). Still, the fact does remain that you will be forced to subjugate them to one degree or another in order to win the game, simply because you need the land and resources which they control if you hope to win the final war for independence.

Here, then, we come to the fatal flaw that undermines almost all applications of this argument. Its proponents would seemingly have you believe that the games of which they speak are rhetorically neutral sandboxes, exact mirror images of some tangible objective reality. But this they are not. Even if they purport to “simulate” real events to one degree or another, they can hope to capture only a tiny sliver of their lived experience, shot through with the conscious and subconscious interests and biases of the people who make them. These last are often most clearly revealed through a game’s victory conditions, as they are in the case of Colonization. To play Colonization the “right” way — to play it as the designers intended it to be played — requires you to exploit and subjugate the people who were already in the New World millennia before your country arrived to claim it. Again, then, we’re forced to confront the fact that every example of a creative expression is a statement about its creators’ worldview, whether those creators consciously wish it to be such a thing or not. Labeling it a simulation does nothing to change this.

The handling — or rather non-handling — of slavery by Colonization is an even more telling case in point. By excising slavery entirely, Colonization loses all claim to being a simulation of real history to any recognizable degree whatsoever, given how deeply intertwined the Peculiar Institution was with everything the game does deign to depict. Just as importantly, the absence of slavery invalidates at a stroke the claim that the game is merely a neutral sandbox of a bygone historical reality for the player’s id, ego, and superego to prance through. For this yawning absence is something over which the player has no control. She isn’t given the chance to take the moral high road by refusing to participate in the slave trade; the designers have made that choice for her, as they have so many others.

I require less space to dispense with Ken Fishkin’s equating of Railroad Tycoon‘s decision not to include exploited Chinese laborers and SimCity‘s casting you in the role of an autocratic mayor with the ethical perils represented by Colonization‘s decision not to include slavery and Panzer General‘s casting you in the role of a Nazi invader. Although Fishkin expresses the position about as well as can reasonably be expected, these sorts of pedantic, context-less gotcha arguments are seldom very convincing to anyone other than the overly rigid thinkers who trot them out. I freely acknowledge that all games which purport to depict the real world do indeed simplify it enormously and choose a very specific domain to focus upon. So, yes, Railroad Tycoon as well does whitewash the history it presents to some extent. Yet the exploitation of Chinese labor in the Old West, appalling though it was, cannot compare to the pervasive legacy of American slavery and the European Holocaust in today’s world. Debaters who claim otherwise quickly start to sound disingenuous. In any discussion of this nature, space has to be allowed for degree as well as kind.

And so we arrive at Fishkin’s other argument from principle, the very place where these sorts of discussions always tend to wind up sooner or later. “The primary purpose of a game is to entertain,” he tells us. Compare that statement with these assertions of Gilbert L. Brahms: “Computer games become nothing [more] than schlock entertainment if they strip realism from historical recreations. If a computer game should truly aspire to become a work of art, it must fulfill both the recreative and the didactive functions inherent in all serious aesthetic productions: it must present horrible conflicts with all of their nasty details.” Oh, my. It seems that we’ve landed smack dab in the middle of the “are games art?” debate. What on earth do we do with this?

Many of us have been conditioned since childhood to believe that games are supposed to be fun — no more, no less. Therefore when a game crosses our path that aspires to be more than just fun — or, even more strangely, doesn’t aspire to be “fun” in the typical sense of the word at all — we can find it deeply confusing. And, people being people, our first reaction is often outrage. Three years before the second version of Colonization was released, one Danny Ledonne made Super Columbine Massacre RPG!, an earnest if rather gawkily adolescent attempt to explore the backgrounds and motivations of the perpetrators of the high-school massacre in question. A book on the same theme would have been accepted and reviewed on its merits, but the game received widespread condemnation simply for existing. Since games by definition can aspire only to being fun, Ledonne must consider it fun to reenact the Columbine massacre, right? The “games as art” and “serious games” crews tried to explain that this edifice of reasoning was built upon a faulty set of assumptions, but the two sides mostly just talked past one another.

Although the “just a game” defense may seem a tempting get-out-of-jail-free card in the context of a Panzer General or a Colonization, one should think long and hard before one plays it. For to do so is to infantilize the entire medium — to place it into some other, fundamentally different category from books and movies and other forms of media that are allowed a place at the table where serious cultural dialog takes place.

The second version of Colonization found itself impaled on the horns of these two very different sets of assumptions about games. Its excision of slavery drew howls of protest calling it out for its shameful whitewashing of history. But just imagine the alternative! As Rebecca Mir and Trevor Owens pointed out in a journal article after the hubbub had died down, the controversy we got was nothing compared to the one we would have had if Colonization had given the naysayers what many of them claimed to want: had better captured historical reality by actually letting you own and trade slaves. The arguments against the one approach are predicated on the supposition that at least some types of games are more than idle entertainments, that a game which bills itself as a reasonably accurate reenactment of colonial history and yet excises slavery from its narrative deserves to be condemned in the same terms as a book or movie which does the same; the arguments against the other are rooted in the supposition that games are just fun, and how dare you propose that it’s fun to join the slave trade. Damned if you do, damned if you don’t. Perhaps the only practical solution to the dilemma is that of simply not making any more versions of Colonization. No, it’s not a terribly satisfying solution, placing limits as it does on what games are allowed to do and be. Nevertheless, it’s the one that Firaxis will almost certainly choose to employ in the future.

I do want to emphasize one more time here at the end of this pair of articles that neither Panzer General nor Colonization was created with any conscious bad intent. They stem from a time when computer gaming was much more culturally homogeneous than it has become, when computer gamers were to an almost overwhelming degree affluent, stereotypically “nerdy” white males between the ages of 10 and 35. People of privilege that they were, usually immersed in the hard sciences rather than the irritatingly amorphous but more empathetic humanities, they struggled to identify with those crosscurrents of society and history outside their own. Although the wargaming subculture that spawned Panzer General and Colonization still exists, and would still receive those exact games today in the same unquestioning way, it’s vastly smaller than it used to be in proportion to the overall mass of gamers. And, again, its blind spots then and now remain venial sins at worst in the grand scale of things.

That said, I for one am happy that the trajectory of gaming since 1994 has been ever outward, both in terms of the types of people who play games and the kinds of themes and experiences those games present. Indeed, it sometimes seems to me that their very scope of possibility is half the reason we can so easily confuse one another when we try to talk about games. Certainly one person’s idea of a satisfying game can be markedly different from another’s, such that even as brilliant a mind as that of Sid Meier can have trouble containing it all. His famous categorical claim that a good game is a “series of interesting decisions” is true enough in the case of the games he prefers to play and make, but fails to reckon with the more experiential aspects of interactivity which many players find at least equally appealing. It’s thus no surprise that he offhandedly dismisses adventure games and other interactive experiences that are more tightly plotted and less zero-sum.

I’ve often wondered whether this label of “game” is really all that useful at all, whether there’s really any more taxonomical kinship between a Colonization and a Super Columbine Massacre RPG! than there is between, say, books and movies. Digital games are the ultimate form of bastard media, appropriating elements from all of the others and then slathering on top of it all the special sauce of interactivity. Perhaps someday we’ll figure out how to talk about this amorphous stew of possibility that just keeps bubbling up out of the pot we want to use to contain it; perhaps someday we’ll divide it up into a collection of separate categories of media, using those things we call “gaming genres” now as their basis. In the meantime, we’ll just have to hang on for the ride, and try not to rush to judgment too quickly when our expectations of the medium don’t align with those of others.

(Sources: the books Sid Meier’s Memoir!: A Life in Computer Games by Sid Meier with Jennifer Lee Noonan and the article “Modeling Indigenous Peoples: Unpacking Ideology in Sid Meier’s Colonization” by Rebecca Mir and Trevor Owens, from the book Playing with the Past: Digital Games and the Simulation of History; PC Review of August 1992; Computer Gaming World of April 1994, September 1994, November 1994, and December 1994; online sources include “How Historical Games Integrate or Ignore Slavery” by Amanda Kerry on Rock Paper Shotgun; “Colonialism is Fun? Sid Meier’s Civilization and the Gamefication of Imperialism” by CIGH Exeter on the Imperial and Global Forum; Soren Johnson’s interview with Brian Reynolds; IGN‘s interview with Brian Reynolds; Ben Fritz’s blog on Variety.

Colonization is available for digital purchase on GOG.com. C.P.U. Bach, having been made only for a long-since-orphaned console, is sadly not.)

 
 


The Game of Everything, Part 10: Civilization and the Limits of Progress

To listen to what Sid Meier says about his most famous achievement today, my writing all of these articles on Civilization has been like doing a deep reading of an episode of The Big Bang Theory; there just isn’t a whole lot of there there. Meier claims that the game presents at best a children’s-book view of history, that the only real considerations that went into it were what would be fun and what wouldn’t. I don’t want to criticize him for that stance here, any more than I want to minimize the huge place that fun or the lack thereof really did fill in the decisions that he and his partner Bruce Shelley made about Civilization. I understand why he says what he says: he’s a commercial game designer, not a political pundit, and he has no desire to wade into controversy — and possibly shrink his customer base — by taking public positions on the sorts of fractious topics I’ve been addressing over the course of these articles. If he should need further encouragement to stay well away from those topics, he can find it in the many dogmatic academic critiques of Civilization which accuse it of being little more than triumphalist propaganda. He’d rather spend his time talking about game design, which strikes me as perfectly reasonable.

Having said all that, it’s also abundantly clear to me that Civilization reflects a much deeper and more earnest engagement with the processes of history than Meier is willing to admit these days. This is, after all, a game which cribs a fair amount of its online Civilopedia directly from Will Durant, author of the eleven-volume The Story of Civilization, the most ambitious attempt to tell the full story of human history to date. And it casually name-drops the great British historian Arnold J. Toynbee, author of the twelve-volume A Study of History, perhaps the most exhaustive — and certainly the most lengthy — attempt ever to construct a grand unified theory of history. These are not, needless to say, books which are widely read by children. There truly is a real theory of history to be found in Civilization as well, one which, if less thoroughly worked-out than what the likes of Toynbee have presented in book form, is nevertheless worth examining and questioning at some length.

The heart of Civilization‘s theory of history is of course the narrative of progress. In fact, the latter is so central to the game that it’s joined it as the second of our lodestars throughout this series of articles. And so, as we come to the end of the series, it seems appropriate to look at what the game and the narrative of progress have to say about one another one last time, this time in the context of a modern society like the ones in which we live today. Surprisingly given how optimistic the game’s take on history generally is, it doesn’t entirely ignore the costs that have all too clearly been shown to be associated with progress in this modern era of ours.

Meier and Shelley were already working on Civilization when the first international Earth Day was held on April 22, 1990, marking the most important single event in the history of the environmental movement since the publication of Rachel Carson’s Silent Spring back in 1962. Through concerts, radio and television programs, demonstrations, and shrewd publicity stunts like a Mount Everest “Peace Climb” including American, Soviet, and Chinese climbers roped together in symbolic co-dependence, Earth Day catapulted the subject of global warming, among other environmental concerns, into the mass media, in some cases for the first time.

Whether influenced by this landmark event or not, Civilization as well manifests a serious concern for the environment in the later, post-Industrial Revolution stages of the game. Coal- and oil-fired power plants increase the productivity of your factories dramatically, but also spew pollution into the air which you must struggle to clean up. Nuclear power plants, while the cheapest, cleanest, and most plentiful sources of energy most of the time, can occasionally melt down with devastating consequences to your civilization. Large cities generate pollution of their own even absent factories and power plants, presumably as a result of populations that have discovered the joy of automobiles. Too much pollution left uncleaned will eventually lead not only to sharply diminished productivity for your civilization but also to global warming, making Civilization one of the first works of popular entertainment to acknowledge the growing concern surrounding the phenomenon already among scientists of the early 1990s.

In fighting your rearguard action against these less desirable fellow travelers on the narrative of progress, you have various tools at your disposal. To clean up pollution that’s already occurred, you can build and deploy settler units to the affected areas. To prevent some pollution from occurring at all, you can invest in hydroelectric plants in general and/or the Wonder of the World that is the Hoover Dam. And/or you can build mass-transit systems to wean your people away from their precious cars, and/or build recycling centers to prevent some of their trash from winding up in landfills.

Interestingly, the original Civilization addresses the issues of environment and ecology that accompany the narrative of progress with far more earnestness than any of its sequels — another fact that rather gives the lie to Meier’s assertion that the game has little to do with the real world. Although even the first game’s implementation of pollution is far from unmanageable by the careful player, it’s something that most players just never found to be all that much fun, and this feedback caused the designers who worked on the sequels to gradually scale back its effects.

In the real world as well, pollution and the threat of global warming aren’t much fun to talk or think about — so much so that plenty of people, including an alarming number of those in positions of power, have chosen to stick their heads in the sand and pretend they don’t exist. None of us enjoy having our worldviews questioned in the uncomfortable ways that discussions of these and other potential limits of progress — progress as defined in terms of Francis Fukuyama’s explicit and Civilization‘s implicit ideals of liberal, capitalistic democracy — tend to engender.

As Adam Smith wrote in the pivotal year of 1776 and the subsequent centuries of history quite definitively proved, competitive free markets do some things extraordinarily well. The laws of supply and demand conspire to ensure that a society’s resources are allocated to those things its people actually need and want, while the profit motive drives innovation in a way no other economic system has ever come close to equaling. The developed West’s enormous material prosperity — a prosperity unparalleled in human history — is thanks to capitalism and its kissing cousin, democracy.

Yet unfettered capitalism, that Platonic ideal of libertarian economists, has a tendency to go off the rails if not monitored and periodically corrected by entities who are not enslaved by the profit motive. The first great crisis of American capitalism could be said to have taken place as early as the late 1800s, during the “robber baron” era of monopolists who discovered a way to cheat the law of supply and demand by cornering entire sectors of the market to themselves. Meanwhile the burgeoning era of mass production and international corporations, so dramatically different from Adam Smith’s world of shopkeepers and village craftsmen, led to the mass exploitation of labor. The response from government was an ever-widening net of regulations to keep corporations honest, while the response from workers was to unionize for the same purpose. Under these new, more restrictive conditions, capitalism continued to hum along, managing to endure another, still greater crisis of confidence in the form of the Great Depression, which led to the idea of a taxpayer-funded social safety net for the weak and the unlucky members of society.

The things that pure capitalism doesn’t do well, like providing for the aforementioned weak and unlucky who lack the means to pay for goods and services, tend to fall under the category that economists call “externalities”: benefits and harms that aren’t encompassed by Adam Smith’s supposedly all-encompassing law of supply and demand. In Smith’s world of shopkeepers, what was best for the individual supplier was almost always best for the public at large: if I sold you a fine plow horse for a reasonable price, I profited right then and there, and also knew that you were likely to tell your friends about it and to come back yourself next year when you needed another. If I sold you a lame horse, on the other hand, I’d soon be out of business. But if I’m running a multinational oil conglomerate in the modern world, that simple logic of capitalism begins to break down in the face of a much more complicated web of competing concerns. In this circumstance, the best thing for me to do in order to maximize my profits is to deny that global warming exists and do everything I can to fight the passage of laws that will hurt my business of selling people viscous black gunk to burn inside dirty engines. This, needless to say, is not in the public’s long-term interest; it’s an externality that could quite literally spell the end of human civilization. So, government must step in — hopefully! — to curb the burning of dirty fuels and address the effects of those fossil fuels that have already been burned.

But externalities are absolutely everywhere in our modern, interconnected, globalized world of free markets. Just as there’s no direct financial benefit in an unfettered free market for a doctor to provide years’ or decades’ worth of healthcare to a chronically sick person who lacks the means to pay for it, there’s no direct financial harm entailed in a factory dumping its toxic effluent into the nearest lake. There is, of course, harm in the abstract, but that harm is incurred by the people unlucky enough to live by the lake rather than by the owners of the factory. The trend throughout the capitalist era has therefore been for government to step in more and more; every successful capitalist economy in the world today is really a mixed economy, to a degree that would doubtless have horrified Adam Smith. As externalities continue to grow in size and scope, governments are forced to shoulder a bigger and bigger burden in addressing them. At what point does that burden become unbearable?

One other internal contradiction of modern capitalism, noticed by Karl Marx already in the nineteenth century, has come to feel more real and immediate than ever before in the years since the release of Civilization. The logic of modern finance demands yearly growth — ever greater production, ever greater profits. Just holding steady isn’t good enough; if you doubt my word, consider what your pension fund will look like come retirement time if the corporations in which you’ve invested it are content to merely hold steady. Up to this point, capitalism’s efficiency as an economic system has allowed it to deliver this growth on a decade-by-decade if not always year-by-year basis. But the earth’s resources are not unlimited. At some point, constant growth — the constant demand for more, more, more — must become unsustainable. What happens to capitalism then?

Exactly the future that believers in liberal democracy and capitalism claim to be the best one possible — that the less-developed world remakes itself in the mold of North America and Western Europe — would appear to be literally impossible in reality. The United States alone, home to 6 percent of the world’s population, consumes roughly 35 percent of its resources. One doesn’t need to be a statistician or an ecologist to understand that the rest of the world simply cannot become like the United States without ruining a global ecosystem that already threatens to collapse under the weight of 7.5 billion souls — twice the number of just fifty years ago. Humans are now the most common mammal on the planet, outnumbering even the ubiquitous mice and rats. Two-thirds of the world’s farmland is already rated as “somewhat” or “strongly” degraded by the Organisation for Economic Co-operation and Development. Three-quarters of the world’s biodiversity has been lost since 1900, and 50 percent of all remaining plant and animal species are expected to go extinct before 2100. And hovering over it all is the specter of climate change; the polar ice caps have melted more in the last 20 years than they did in the previous 12,000 years since the end of the last ice age.

There’s no doubt about it: these are indeed uncomfortable conversations to have. Well before the likes of Brexit and President Donald Trump, even before the events of September 11, 2001, Western society was losing the sense of triumphalism that had marked the time of the original Civilization, replacing it with a jittery sense that humanity was packed too closely together on an overcrowded and overheating little planet, that the narrative of progress was rushing out of control toward some natural limit point that was difficult to discern or describe. The first clear harbinger of the generalized skittishness to come was perhaps the worldwide angst that accompanied the turn of the millennium — better known as “Y2K,” a fashionable brand name for disaster that smacked of Hollywood, thereby capturing the strange mixture of gloom and mass-media banality that would come to characterize much of post-millennial life. The historian of public perception David Lowenthal, writing in 2015:

Events spawned media persistently catastrophic in theme and tone, warning of the end of history, the end of humanity, the end of nature, the end of everything. Millennial prospects in 2000 were lacklustre and downbeat; Y2K seemed a portent of worse to come. Not even post-Hiroshima omens of nuclear annihilation unleashed such a pervasive glum foreboding. Today’s angst reflects unexampled loss of faith in progress: fears that our children will be worse off than ourselves, doubts that neither government nor industry, science nor technology, can set things right.

The turn of the millennium had the feeling of an end time, yet none of history’s more cherished eschatologies seemed to be coming true: not Christianity’s Rapture, not Karl Marx’s communist world order, not Georg Wilhelm Friedrich Hegel or Francis Fukuyama’s liberal-democratic end of history, certainly not Sid Meier and Bruce Shelley’s trip to Alpha Centauri. Techno-progressives began to talk more and more of a new secular eschatology in the form of the so-called Singularity, the point where, depending on the teller, artificial intelligence would either merge with human intelligence to create a new super-species fundamentally different from the humans of prior ages, or our computers would simply take over the world, wiping out their erstwhile masters or relegating them to the status of pets. And that was one of the more positive endgames for humanity that came to be batted around. Others nursed apocalyptic visions of a world ruined by global warming and the rising sea levels associated with it — a secular version of the Biblical Flood — or completely overrun by Islamic Jihadists, those latest barbarians at the gates of civilization heralding the next Dark Ages. Our television shows and movies turned increasingly dystopic, with anti-heroes and planet-encompassing disasters coming to rule our prime-time entertainment.

The last few years in particular haven’t been terribly good ones for believers in the narrative of progress and the liberal-democratic world order it has done so much to foster. The Arab Spring, touted for a time as a backward region’s belated awakening to progress, collapsed without achieving much of anything at all. Britain is leaving the European Union; the United States elected Donald Trump; Russia is back to relishing the role of the Evil Empire, prime antagonist to the liberal-democratic West; China has gone a long way toward consummating a marriage once thought impossible: the merging of an autocratic, human-rights-violating government with an economy capable of competing with the best that democratic capitalism can muster. Our politicians issue mealy-mouthed homages to “realism” and “transactional diplomacy,” ignoring the better angels of our nature. Everywhere nativism and racism seem to be on the rise. Even in the country where I live now, the supposed progressive paradise of Denmark, the Danish People’s Party has won considerable power in the government by sloganeering that “Denmark is not a multicultural society,” by drawing lines between “real” Danes and those of other colors and other religions. In my native land of the United States, one side of the political discourse, finding itself unable to win a single good-faith argument on the merits, has elected to simply lie about the underlying facts, leading some to make the rather chilling assertion that we now live in a “post-truth” world. (How ironic that the American right, long the staunchest critic of postmodernism, should have been the ones to turn its lessons about the untenability of objective truth into an electoral strategy!)

And then there’s the incoming fire being taken by the most sacred of all of progress’s sacred cows, as The Economist‘s latest Democracy Index announces that it “continues its disturbing retreat.” In an event redolent with symbolism, the same index in 2016 changed the classification of the United States, that beacon of democracy throughout its history, from a “Full Democracy” to a “Flawed Democracy.” Functioning as both cause and symptom of this retreat is the old skepticism about whether democracy is just too chaotic to efficiently run a country, whether people who can so easily be duped by Facebook propaganda and email chain letters can really be trusted to decide their countries’ futures.

Looming over such discussions of democracy and its efficacy is the specter of China. When Mao Zedong’s Communist Party seized power there in 1949, the average Chinese citizen earned just $448 per year in inflation-adjusted terms, making it one of the poorest countries in the world. Mao’s quarter-century of orthodox communist totalitarianism, encompassing the horrors of the Great Leap Forward and the Cultural Revolution, managed to improve that figure only relatively slowly; average income had increased to $978 by 1978. But, following Mao’s death, his de facto successor Deng Xiaoping began to depart from communist orthodoxy, turning from a centrally-managed economy to the seemingly oxymoronic notion of “market-oriented communism” — effectively a combination of authoritarianism with capitalism. Many historians and economists — not least among them Francis Fukuyama — have always insisted that a non-democracy simply cannot compete with a democracy on economic terms over a long span of time. Yet the economy of post-Mao China has seemingly grown at a far more impressive rate than they allow to be possible, with average income reaching $6048 by 2006, then $16,624 by 2017. China today would seem to be a compelling rebuttal to all those theories about the magic conjunction of personal freedoms and free markets.

But is it really? We should be careful not to join some of our more excitable pundits in getting ahead of the real facts of the case. China’s economic transformation, remarkable as it’s been, has only elevated it to the 79th position among all the world’s nations in terms of GDP per capita. Its considerable economic clout in the contemporary world, in other words, has a huge amount to do with the fact that it’s the most populous country in the world. Further, the heart of its economy is manufacturing, as is proved by all of those “Made in China” tags on hard goods of every description that are sold all over the world. China is still a long, long way from joining the vanguard of post-industrial knowledge economies. To a large extent, economic innovation still comes from those knowledge economies; China then does the grunt work of manufacturing the products that the innovators design.

Of course, authoritarianism does have its advantages. China’s government, which doesn’t need to concern itself with elections every set number of years, can set large national projects in motion, such as a green energy grid spanning the entire country or even a manned trip to Mars, and see them methodically through over the course of decades if need be. But can China under its current system of government produce a truly transformative, never-seen-or-imagined-anything-like-it product like the Apple iPhone and iPad, the World Wide Web, or the Sony Walkman? It isn’t yet clear to me that it can graduate from being an implementor of brilliant ideas — thanks to all those cheap and efficient factories — to being an originator of same. So, personally, I’m not quite ready to declare the death of the notion that a country requires democracy to join the truly top-tier economies of the world. The next few decades should be very interesting in one way or another — whether because China does definitively disprove that notion, because its growth tops out, or, most desirably, because a rising standard of living there and the demands of a restive middle class bring an end at last to China’s authoritarian government.

Still, none of these answers to The China Puzzle will do anything to help us with the fundamental limit point of the capitalistic world order: the demand for infinite economic growth in a world of decidedly finite resources. Indeed, the Chinese outcome I just named as the most desirable — that of a democratic, dynamic China free of the yoke of its misnamed Communist Party — only causes our poor, suffering global ecosystem to strain that much more under the weight of capitalism. For this reason, economists today have begun to speak more and more of a “crisis of capitalism,” to question whether Adam Smith’s brilliant brainchild is now entering its declining years. For a short time, the “Great Recession” of 2007 and 2008, when some of the most traditionally rock-solid banks and corporations in the world teetered on the verge of collapse, seemed like it might be the worldwide shock that signaled the beginning of the end. Desperate interventions by governments all over the world managed to save the capitalists from themselves at the last, but even today, when the economies of most Western nations are apparently doing quite well, the sense of unease that was engendered by that near-apocalypse of a decade ago has never fully disappeared. The feeling remains widespread that something has to give sooner or later, and that that something might be capitalism as we know it today.

But what would a post-capitalist world look like? Aye, there’s the rub. Communism, capitalism’s only serious challenger over the course of the last century, would seem to have crashed and burned a long time ago as a practical way of ordering an economy. Nor, based on the horrid environmental record of the old Soviet bloc, is it at all clear that it would have proved any better a caretaker of our planet than capitalism even had it survived.

One vision for the future, favored by the anarchist activists whom we briefly met in an earlier article, entails a deliberate winding down of the narrative of progress before some catastrophe or series of catastrophes does it for us. It’s claimed that we need to abandon globalization and re-embrace localized, self-sustaining ways of life; it’s thus perhaps not so much a complete rejection of capitalism as a conscious return to Adam Smith’s era of shopkeepers and craftsmen. The prominent American anarchist Murray Bookchin dreams of a return to “community, decentralization, self-sufficiency, mutual aid, and face-to-face democracy” — “a serious challenge to [globalized] society with its vast, hierarchical, sexist, class-ruled state apparatus and militaristic history.” Globalization, he and other anarchists note, often isn’t nearly as efficient as its proselytizers claim. In fact, the extended international supply chains it fosters for even the most basic foodstuffs are often absurdly wasteful in terms of energy and other resources, and brittle to boot, vulnerable to the slightest shock to the globalized system. Why should potatoes which can be grown in almost any back garden in the world need to be shipped in via huge, fuel-guzzling jet airplanes and forty-ton semis? Locally grown agriculture, anarchists point out, can provide eight units of food energy for every one unit of fossil-fuel energy needed to bring it to market, while in many cases exactly the opposite ratio holds true for internationally harvested produce.

But there’s much more going on here philosophically than a concern with the foodstuff supply chain. Modern anarchist thought reflects a deep discomfort with consumer culture, a strand of philosophy we’ve met before in the person of Jean-Jacques Rousseau and his “noble savage.” In truth, Rousseau noted, the only things a person really, absolutely needs to survive are food and shelter. All else is, to paraphrase the Bible, vanity, and all too often brings only dissatisfaction. Back in the eighteenth century, Rousseau could already describe the collector who is never satisfied by the collection he’s assembled, only dissatisfied by its gaps.

What would he make of our times? Today’s world is one of constant beeping enticements — cars, televisions, stereos, computers, phones, game consoles — that bring only the most ephemeral bursts of happiness before we start craving the upgraded model. The anarchist activist Peter Harper:

People aspire to greater convenience and comfort, more personal space, easy mobility, a sense of expanding possibilities. This is the modern consumerist project: what modern societies are all about. It is a central feature of mainstream politics and economics that consumerist aspirations are not seriously challenged. On the contrary, the implied official message is “Hang on in there: we will deliver.” The central slogan is brutally simple: MORE!

Harper claims that, as the rest of the world continues to try and fail to find happiness in the latest shiny objects, anarchists will win them over to their cause by example. For those who reject materialist culture “will quite visibly be having a good time: comfortable, with varied lives and less stress, healthy and fit, having rediscovered the elementary virtues of restraint and balance.”

Doubtless we could all use a measure of restraint and balance in our lives, but the full anarchist project for happiness and sustainability through a deliberate deconstruction of the fruits of progress is so radical — entailing as it does the complete dissolution of nation-states and a return to decentralized communal living — that it’s difficult to fully envision how it could happen absent the sort of monumental precipitating global catastrophe that no one can wish for. While human nature will always be tempted to cast a wistful eye back to an imagined simpler, more elemental past, another, perhaps nobler part of our nature will always look forward with an ambitious eye to a bolder, more exciting future. The oft-idealized life of a tradesman prior to the Industrial Revolution, writes Francis Fukuyama, “involved no glory, dynamism, innovation, or mastery; you just plied the same traditional markets or crafts as your father and grandfather.” For many or most people that may be a fine life, and more power to them. But what of those with bigger dreams, who would spur humanity on to bigger and better things? That is to say, what of the authors of the narrative of progress of the past, present, and future, who aren’t willing to write the whole thing off as fun while it lasted and return to the land? The builders among us will never be satisfied with a return to some agrarian idyll.

The world’s current crisis of faith in progress and in the liberal-democratic principles that are so inextricably bound up with it isn’t the first or the worst of its kind. Not that terribly long ago, Nazi Germany and Imperial Japan posed a far more immediate and tangible threat to liberal democracy all over the world than anything we face today; the pro-Nazi German American Bund was once strong enough to rent and fill Madison Square Garden, a fact which does much to put the recent disconcerting events in Charlottesville in perspective. And yet liberal democracy got through that era all right in the end.

Even in 1983, when the Soviet Union was already teetering on the verge of economic collapse, an unknowing Jean-François Revel could write that “democracy may, after all, turn out to have been an historical accident, a brief parenthesis that is closing before our eyes.” The liberal West’s periods of self-doubt have always seemed to outnumber and outlast its periods of triumphalism, and yet progress has continued its march. During the height of the fascist era, voting rights in many democratic countries were being expanded to include all of their citizens at long last; amidst the gloominess about the future that has marked so much of post-millennial life, longstanding prejudices toward gay and lesbian people have fallen away so fast in the developed West that it’s left even many of our ostensibly progressive politicians scrambling to keep up.

Of course, the fact still remains that our planet’s current wounds are real, and global warming may in the long run prove to be the most dangerous antagonist humanity has ever faced. If we’re unwilling to accept giving up the fruits of progress in the name of healing our planet, where do we go from here? One thing that is clear is that we will have to find different, more sustainable ways of ordering our economies if progress is to continue its march. Capitalism is often praised for its ability to sublimate what Francis Fukuyama, drawing on Nietzsche, calls the megalothymia of the most driven souls among us — the craving for success, achievement, recognition, victory — into the field of business rather than the field of battle. Would other megalothymia sublimators, such as sport, be sufficient in a post-capitalist world? What would a government/economy look like that respects people’s individual freedoms but avoids the environment-damaging, resource-draining externalities of capitalism? No one — certainly not I! — can offer entirely clear answers to these questions today. This is not so much a tribute to anything unique about our current times as it is a tribute to the nature of history itself. Who anticipated Christianity? Who anticipated that we would use the atomic bomb only twice? Who, for that matter, anticipated a President Donald Trump?

One possibility, at least in the short term, is to rejigger the rules of capitalism to bring its most problematic externalities back under the umbrella of the competitive marketplace. Experiments in cap-and-trade, which turn environment-ruining carbon emissions into a scarce commodity that corporations can exchange among themselves, have shown promising results.

But in the longer term, still more than just our economics will have to change. Because the problems of ecology and environment are global problems of a scope we’ve never faced before, we will need to think of ourselves more and more as a global society in order to solve them. In time, the nation-states in which we still invest so much patriotic fervor today may need to go the way of the scattered, self-sufficient settlements of a few dozens or hundreds that marked the earliest stages of the earliest civilizations. In time, the seeds that were planted with the United Nations in the aftermath of the bloodiest of all our stupid, pointless wars may flower into a single truly global civilization.

Really, though, I can’t possibly predict how humanity will progress its way out of its current set of predicaments. I can only have faith in the smarts and drive that have brought us this far. The best we can hope for is probably to muddle through by the skin of our teeth — but then, isn’t that what we’ve always been doing? The first civilizations began as improvised solutions to the problem of a changing climate, and we’ve been making it up as we go along ever since. So, maybe the first truly global civilization will also arise as, you guessed it, an improvised solution to the problem of a changing climate. Even if we’ve met our match with our latest nemesis of human-caused climate change, perhaps it really is better to burn out than to fade away. Perhaps it’s better to go down swinging than to survive at the cost of the grand dream of an eventual trip to Alpha Centauri.

The game which has the fulfillment of that dream as its most soul-stirring potential climax has been oft-chided for promoting a naive view of history — for being Western- and American-centric, for ignoring the plights of the vast majority of the people who have ever inhabited this planet of ours, for ignoring the dangers of the progress it celebrates. It is unquestionably guilty of all these things in whole or in part, and guilty of many more sins against history besides. But I haven’t chosen to emphasize overmuch its many problems in this series of articles because I find its guiding vision of a human race capable of improving itself down through the millennia so compelling and inspiring. Human civilization needs its critics, but it needs its optimists perhaps even more. So, may the optimistic outlook of the narrative of progress last as long as our species, and may we always have to go along with it the optimism of the game of Civilization — or of a Civilization VI, Civilization XVI, or Civilization CXVI — to exhort us to keep on keeping on.

(Sources: the books Civilization, or Rome on 640K a Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Democracy: A Very Short Introduction by Bernard Crick, Anarchism: A Very Short Introduction by Colin Ward, Environmental Economics: A Very Short Introduction by Stephen Smith, Globalization: A Very Short Introduction by Manfred B. Steger, Economics: A Very Short Introduction by Partha Dasgupta, Global Economic History: A Very Short Introduction by Robert C. Allen, Capital by Karl Marx, The Social Contract by Jean-Jacques Rousseau, The Genealogy of Morals by Friedrich Nietzsche, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, The Wealth of Nations by Adam Smith, How Democracies Perish by Jean-François Revel, and The Past Is a Foreign Country by David Lowenthal.)

 


The Game of Everything, Part 9: Civilization and Economics

If the tailor goes to war against the baker, he must henceforth bake his own bread.

— Ludwig von Mises

There’s always the danger that an analysis of a game spills into over-analysis. Some aspects of Civilization reflect conscious attempts by its designers to model the processes of history, while some reflect unconscious assumptions about history; some aspects represent concessions to the fact that it first and foremost needs to work as a playable and fun strategy game, while some represent sheer random accidents. It’s important to be able to pull these things apart, lest the would-be analyzer wander into untenable terrain.

Any time I’m tempted to dismiss that prospect, I need only turn to Johnny L. Wilson and Alan Emrich’s ostensible “strategy guide” Civilization: or Rome on 640K a Day, which is actually far more interesting as the sort of distant forefather of this series of articles — as the very first attempt ever to explore the positions and assumptions embedded in the game. Especially given that it is such an early attempt — the book was published just a few months after the game, being largely based on beta versions of same that MicroProse had shared with the authors — Wilson and Emrich do a very credible job overall. Yet they do sometimes fall into the trap of seeing what their political beliefs make them wish to see, rather than what actually existed in the minds of the designers. The book doesn’t explicitly credit which of the authors wrote what, but one quickly learns to distinguish their points of view. And it turns out that Emrich, whose arch-conservative worldview is on the whole more at odds with that of the game than Wilson’s liberal-progressive view, is particularly prone to projection. Among the most egregious and amusing examples of him using the game as a Rorschach test is his assertion that the economy-management layer of Civilization models a rather dubious collection of ideas that have more to do with the American political scene in 1991 than they do with any proven theories of economics.

We know we’re in trouble as soon as the buzzword “supply-side economics” turns up prominently in Emrich’s writing. It burst onto the stage in a big way in the United States in 1980 with the election of Ronald Reagan as president, and has remained to this day one of his Republican party’s main talking points on the subject of economics in general. Its central, counter-intuitive claim is that tax revenues can often be increased by cutting rather than raising tax rates. Lower taxes, goes the logic, provide such a stimulus to the economy as a whole that people wind up making a lot more money. And this in turn means that the government, even though it collects less tax per dollar, ends up bringing in more tax revenue in the aggregate.

In seeing what he wanted to see in Civilization, Alan Emrich decided that it hewed to contemporary Republican orthodoxy not only on supply-side economics but also on another subject that was constantly in the news during the 1980s and early 1990s: the national debt. The Republican position at the time was that government deficits were always bad; government should be run like a business in all circumstances, went their argument, with an orderly bottom line.

But in the real world, supply-side economics and a zero-tolerance policy on deficits tend to be, shall we say, incompatible with one another. Since the era of Ronald Reagan, Republicans have balanced these oil-and-water positions against one another by prioritizing tax cuts when in power and wringing their hands over the deficit — lamenting the other party’s supposedly out-of-control spending on priorities other than their own — when out of power. Emrich, however, sees in Civilization‘s model of an economy the grand unifying theory of his dreams.

Let’s quickly review the game’s extremely simplistic handling of the economic aspects of civilization-building before we turn to his specific arguments, such as they are. The overall economic potential of your cities is expressed as a quantity of “trade arrows.” As leader, you can devote the percentage of trade arrows you choose to taxes, which add money to your treasury for spending on things like the maintenance costs of your buildings and military units and tributes to other civilizations; research, which lets you acquire new advances; and, usually later in the game, luxuries, which help to keep your citizens content. There’s no concept of deficit spending in the game; if ever you don’t have enough money in the treasury to maintain all of your buildings and units at the end of a turn, some get automatically destroyed. This, then, leads Emrich to conclude that the game supports his philosophy on the subject of deficits in general.
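The per-turn bookkeeping just described is simple enough to sketch in a few lines of Python. To be clear, this is only an illustrative model of the mechanics as the text describes them; the function, names, and numbers are my own invention, not the game’s actual code:

```python
# A minimal, illustrative sketch of Civilization's economic model as
# described above -- NOT the actual MicroProse implementation. All names
# and numbers here are invented for demonstration.

def run_turn(trade_arrows, tax_rate, lux_rate, treasury, maintenance_costs):
    """Allocate a turn's trade arrows, then pay maintenance.

    trade_arrows: the total economic output of your cities
    tax_rate, lux_rate: fractions devoted to taxes and luxuries
                        (whatever remains goes to research)
    maintenance_costs: dict of item name -> per-turn upkeep cost
    Returns (treasury, research, luxuries, destroyed), where `destroyed`
    lists the buildings or units lost because the treasury ran dry.
    """
    taxes = int(trade_arrows * tax_rate)
    luxuries = int(trade_arrows * lux_rate)
    research = trade_arrows - taxes - luxuries
    treasury += taxes

    destroyed = []
    for item, cost in maintenance_costs.items():
        if treasury >= cost:
            treasury -= cost
        else:
            destroyed.append(item)  # can't pay upkeep: the item is lost
    return treasury, research, luxuries, destroyed
```

Run it with an early-game configuration of a 10-percent tax rate and no maintenance costs, and nearly all of your trade arrows flow into research; add more buildings and units than the treasury can sustain, and things start disappearing at the end of the turn.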

But the more entertaining of Emrich’s arguments are the ones he deploys to justify supply-side economics. At the beginning of a game of Civilization, you have no infrastructure to support, and thus you have no maintenance costs at all — and, depending on which difficulty level you’ve chosen to play at, you may even start with a little bit of money already in the treasury. Thus it’s become standard practice among players to reduce taxes sharply from their default starting rate of 50 percent, devoting the bulk of their civilization’s economy early on to research on basic but vital advances like Pottery, Bronze Working, and The Wheel. With that in mind, let’s try to follow Emrich’s thought process:

To maximize a civilization’s potential for scientific and technological advancement, the authors recommend the following exercise in supply-side economics. Immediately after founding a civilization’s initial city, pull down the Game Menu and select “Tax Rate.” Reduce the tax rate from its default 50% to 10% (90% Science). This reduced rate will allow the civilization to continue to maintain its current rate of expenditure while increasing the rate at which scientific advancements occur. These advancements, in turn, will accelerate the wealth and well-being of the civilization as a whole.

In this way, the game mechanics mirror life. The theory behind tax reduction as a spur to economic growth is built on two principles: the multiplier and the accelerator. The multiplier effect is abstracted out of Sid Meier’s Civilization because it is a function of consumer spending.

The multiplier effect says that each tax dollar cut from a consumer’s tax burden and actually spent on consumer goods will net an additional 50 cents at a second stage of consumer spending, an additional 25 cents at a third stage, an additional 12.5 cents at a fourth stage, etc. Hence, economists claim that the full progression nets a total of two dollars for each extra consumer dollar spent as a result of a tax cut.

The multiplier effect cannot be observed in the game because it is only presented indirectly. Additional consumer spending causes a flash point where additional investment takes place to increase, streamline, and advance production capacity and inventory to meet the demands of the increased consumption. Production increases and advances, in turn, have an additional multiplier effect beyond the initial consumer spending. When the scientific advancements occur more rapidly in Sid Meier’s Civilization, they reflect that flash point of additional investment and allow civilizations to prosper at an ever accelerating rate.

Wow. As tends to happen a lot after I’ve just quoted Mr. Emrich, I’m not quite sure where to start. But let’s begin with his third paragraph, in particular with a phrase which is all too easy to overlook: that for this to work, the dollar cut must “actually be spent on consumer goods.” When tax rates for the wealthy are cut, the lucky beneficiaries don’t tend to go right out and spend their extra money on consumer goods. The most direct way to spur the economy through tax cuts thus isn’t to slash the top tax bracket, as Republicans have tended to do; it’s to cut the middle and lower tax brackets, which puts more money in the pockets of those who don’t already have all of the luxuries they could desire, and thus will be more inclined to go right out and spend their windfall.
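For what it’s worth, the arithmetic Emrich quotes is just the standard textbook multiplier: a geometric series in the marginal propensity to consume $c$, which his figures implicitly set to one-half:

```latex
\underbrace{1 + c + c^2 + c^3 + \cdots}_{\text{successive rounds of spending}} \;=\; \frac{1}{1-c},
\qquad c = \tfrac{1}{2} \;\Rightarrow\; 1 + 0.5 + 0.25 + 0.125 + \cdots = 2
```

Each dollar of tax cut thus yields two dollars of total spending only under that assumed propensity to consume; a beneficiary who banks the windfall instead has a $c$ near zero, and the multiplier collapses toward one.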

But, to give credit where it’s due, Emrich does at least include that little phrase about the importance of spending on consumer goods, even if he does rather bury the lede. His last paragraph is far less defensible. To appreciate its absurdity, we first have to remember that he’s talking about “consumer spending” in a Stone Age economy of 4000 BC. What are these consumers spending on? Particularly shiny pieces of quartz? And for that matter what are they spending, considering that your civilization hasn’t yet developed currency? And how on earth can any of this be said to justify supply-side economics over the long term? You can’t possibly maintain your tax rate of 10 percent forever; as you build up your cities and military strength, your maintenance costs steadily increase, forcing you back toward that starting default rate of 50 percent. To the extent that Civilization can be said to send any message at all on taxes, said message must be that a maturing civilization will need to steadily increase its tax rate as it advances toward modernity. And indeed, as we learned in an earlier article in this series, this is exactly what has happened over the long arc of real human history. Your economic situation at the beginning of a game of Civilization isn’t some elaborate testimony to supply-side economics; it just reflects the fact that one of the happier results of a lack of civilization is the lack of a need to tax anyone to maintain it.

In reality, then, the taxation model in the game is a fine example of something implemented without much regard for real-world economics, simply because it works in the context of a strategy game like this one. Even the idea of such a centralized system of rigid taxation for a civilization as a whole is a deeply anachronistic one in the context of most societies prior to the Enlightenment, for whose people local government was far more important than some far-off despot or monarch. Taxes, especially at the national level, tended to come and go prior to AD 1700, depending on the immediate needs of the government, and lands and goods were more commonly taxed than income, which in the era before professionalized accounting was hard for the taxpayer to calculate and even harder for the tax collector to verify. In fact, a fixed national income tax of the sort on which the game’s concept of a “tax rate” seems to be vaguely modeled didn’t come to the United States until 1913. Many ancient societies — including ones as advanced as Egypt during its Old Kingdom and Middle Kingdom epochs — never even developed currency at all. Even in the game Currency is an advance which you need to research; the cognitive dissonance inherent in earning coins for your treasury when your civilization lacks the concept of money is best just not thought about.

Let’s take a moment now to see if we can make a more worthwhile connection between real economic history and luxuries, that third category toward which you can devote your civilization’s economic resources. You’ll likely have to begin doing so only if and when your cities start to grow to truly enormous sizes, something that’s likely to happen only under the supercharged economy of a democracy. When all of the usual bread and circuses fail, putting resources into luxuries can maintain the delicate morale of your civilization, keeping your cities from lapsing into revolt. There’s an historical correspondence that actually does seem perceptive here; the economies of modern Western democracies, by far the most potent the world has ever known, are indeed driven almost entirely by a robust consumer market in houses and cars, computers and clothing. Yet it’s hard to know where to really go with Civilization’s approach to luxuries beyond that abstract statement. At most, you might put 20 or 30 percent of your resources into them, leaving the rest to taxes and research, whereas in a modern developed democracy like the United States those proportions tend to be reversed.

Ironically, the real-world economic system to which Civilization’s overall model hews closest is actually a centrally-planned communist economy, where all of a society’s resources are the property of the state — i.e., you — which decides how much to allocate to what. But Sid Meier and Bruce Shelley would presumably have run screaming from any such association — not to mention our friend Mr. Emrich, who would probably have had a conniption. It seems safe to say, then, that what we can learn from the Civilization economic model is indeed sharply limited, that most of it is there simply as a way of making a playable game.

Still, we might usefully ask whether there’s anything in the game that does seem like a clear-cut result of its designers’ attitudes toward real-world economics. We actually have seen some examples of that already in the economic effects that various systems of government have on your civilization, from the terrible performance of despotism to the supercharging effect of democracy. And there is one other area where Civilization stakes out some clear philosophical territory: in its attitude toward trade between civilizations, a subject that’s been much in the news in recent years in the West.

In the game, your civilization can reap tangible benefits from its contact with other civilizations in two ways. For one, you can use special units called caravans, which become available after you’ve researched the advance of Trade, to set up “trade routes” between your cities and those of other civilizations. Both then receive a direct boost to their economies, the magnitude of which depends on their distance from one another — farther is better — and their respective sizes. A single city can set up such mutually beneficial arrangements with up to five other cities, and see them continue as long as the cities in question remain in existence.

In addition to these arrangements, you can horse-trade advances directly with the leaders of other civilizations, giving your counterpart one of your advances in exchange for one you haven’t yet acquired. It’s also possible to take advances from other civilizations by conquering their cities or demanding tribute, but such hostile approaches have obvious limits to which a symbiotic trading relationship isn’t subject; fighting wars is expensive in terms of blood and treasure alike, and you’ll eventually run out of enemy cities to conquer. If, on the other hand, you can set up warm relationships with four or five other civilizations, you can positively rocket up the Advances Chart.
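As a toy illustration of how these incentives stack up, here is a deliberately invented model of caravan routes. The qualitative scaling (farther is better, bigger is better, five routes per city) follows the description above, but the formula itself is hypothetical, not the game’s real arithmetic:

```python
# Toy model of Civilization's caravan trade routes. The scaling mirrors the
# text (distance and city sizes help, five-route cap per city), but the
# exact formula is invented for illustration only.

MAX_ROUTES_PER_CITY = 5

def trade_route_bonus(distance, size_a, size_b):
    """Per-turn economic boost from one route between two cities."""
    return (distance + size_a + size_b) // 4

def city_trade_income(routes):
    """Total per-turn boost for one city's routes, honoring the cap.

    routes: list of (distance, size_a, size_b) tuples; only the five most
    lucrative routes count toward the total.
    """
    bonuses = sorted(trade_route_bonus(*route) for route in routes)
    return sum(bonuses[-MAX_ROUTES_PER_CITY:])
```

Under this sketch, a sixth route simply displaces the weakest of the existing five, which is why players keep sending caravans toward ever larger and more distant foreign cities.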

The game’s answer to the longstanding debate between free trade and protectionism — between, to put a broader framing on it, a welcoming versus an isolationist attitude toward the outside world — is thus clear: those civilizations which engage economically with the world around them benefit enormously and get to Alpha Centauri much faster. Such a position is very much in line with the liberal-democratic theories of history that were being espoused by thinkers like Francis Fukuyama at the time Meier and Shelley were making the game — thinkers whose point of view Civilization knowingly or unknowingly adopts.

As has become par for the course by now, I believe that the position Civilization and Fukuyama alike take on this issue is quite well-supported by the evidence of history. To see proof, one doesn’t have to do much more than look at where the most fruitful early civilizations in history were born: near oceans, seas, and rivers. Egypt was, as the ancient historian Herodotus so famously put it, “the gift of the Nile”; Athens was born on the shores of the Mediterranean; Rome on the east bank of the wide and deep Tiber river. In ancient times, when overland travel was slow and difficult, waterways were the superhighways of their era, facilitating the exchange of goods, services, and — just as importantly — ideas over long distances. It’s thus impossible to imagine these ancient civilizations reaching the heights they did without this access to the outside world. Even today port cities are often microcosms of the sort of dynamic cultural churn that spurs civilizations to new heights. Not for nothing does every player of the game of Civilization want to found her first city next to the ocean or a river — or, if possible, next to both.

To better understand how these things work in practice, let’s return one final time to the dawn of history for a narrative of progress involving one of the greatest of all civilizations in terms of sheer longevity.

Egypt was far from the first civilization to spring up in the Fertile Crescent, that so-called “cradle of civilization.” The changing climate that forced the hunter-gatherers of the Tigris and Euphrates river valleys to begin to settle down and farm as early as 10,000 BC may not have forced the peoples roaming the lands near the Nile to do the same until as late as 4000 BC. Yet Egyptian civilization, once it took root, grew at a crazy pace, going from primitive hunter-gatherers to a culture that eclipsed all of its rivals in grandeur and sophistication in less than 1500 years. How did Egypt manage to advance so quickly? Well, there’s strong evidence that it did so largely by borrowing from the older, initially wiser civilizations to its east.

Writing is among the most pivotal advances for any young civilization; it allows the tallying of taxes and levies, the inventorying of goods, the efficient dissemination of decrees, the beginning of contracts and laws and census-taking. It was if anything even more important in Egypt than in other places, for it facilitated a system of strong central government that was extremely unusual in the world prior to the Enlightenment of many millennia later. (Ancient Egypt at its height was, in other words, a marked exception to the rule about local government being more important than national prior to the modern age.) Yet there’s a funny thing about Egypt’s famous system of hieroglyphs.

In nearby Sumer, almost certainly the very first civilization to develop writing, archaeologists have traced the gradual evolution of cuneiform writing by fits and starts over a period of many centuries. But in Egypt, by contrast, writing just kind of appears in the archaeological record, fully-formed and out of the blue, around 3000 BC. Now, it’s true that Egypt didn’t simply take the Sumerian writing system; the two use completely different sets of symbols. Yet many archaeologists believe that Egypt did take the idea of writing from Sumer, with whom they were actively trading by 3000 BC. With the example of a fully-formed vocabulary and grammar, all translated into a set of symbols, the actual implementation of the idea in the context of the Egyptian language was, one might say, just details.

How long might it have taken Egypt to make the conceptual leap that led to writing without the Sumerian example? Not soon enough, one suspects, to have built the Pyramids of Giza by 2500 BC. Further, we see other diverse systems of writing spring up all over the Mediterranean and Middle East at roughly the same time. Writing was an idea whose time had come, thanks to trading contacts. Trade meant that every new civilization wasn’t forced to reinvent every wheel for itself. It’s since become an axiom of history that an outward-facing civilization is synonymous with youth and innovation and vigorous growth, an inward-turning civilization synonymous with age and decadence and decrepit decline. It happened in Egypt; it happened in Greece; it happened in Rome.

But, you might say, the world has changed a lot since the heyday of Rome. Can this reality that ancient civilizations benefited from contact and trade with one another really be applied to something like the modern debate over free trade and globalization? It’s a fair point. To address it, let’s look at the progress of global free trade in times closer to our own.

In the game of Civilization, you won’t be able to set up a truly long-distance, globalized trading network with other continents until you’ve acquired the advance of Navigation, which brings with it the first ships that are capable of transporting your caravan units across large tracts of ocean. In real history, the first civilizations to acquire such things were those of Europe, in the late fifteenth century AD. Economists have come to call this period “The First Globalization.”

And, tellingly, they also call this period “The Great Divergence.” Prior to the arrival of ships capable of spanning the Atlantic and Pacific Oceans, several regions of the world had been on a rough par with Europe in terms of wealth and economic development. In fact, at least one great non-European civilization — that of China — was actually ahead; roughly one-third of the entire world’s economic output came from China alone, outdistancing Europe by a considerable margin. But, once an outward-oriented Europe began to establish itself in the many less-developed regions of the world, all of that changed, as Europe surged forward to the leading role it would enjoy for the next several centuries.

How did the First Globalization lead to the Great Divergence? Consider: when the Portuguese explorer Vasco da Gama reached India in 1498, he found he could buy pepper there, where it was commonplace, for a song. He could then sell it back in Europe, where it was still something of a delicacy, for roughly 25 times what he had paid for it, all while still managing to undercut the domestic competition. Over the course of thousands of similar trading arrangements, much of the rest of the world came to supply Europe with the cheap raw materials which were eventually used to fuel the Industrial Revolution and to kick the narrative of progress into overdrive, making even tiny European nations like Portugal into deliriously rich and powerful entities on the world stage.

And what of the great competing civilization of China? As it happens, it might easily have been China instead of Europe that touched off the First Globalization and thereby separated itself from the pack of competing civilizations. By the early 1400s, Chinese shipbuilding had advanced enough that its ships were regularly crisscrossing the Indian Ocean between established trading outposts on the east coast of Africa. If the arts of Chinese shipbuilding and navigation had continued to advance apace, it couldn’t have been much longer until its ships crossed the Pacific to discover the Americas. How different would world history have been if they had? Unfortunately for China, its imperial leaders, wary of supposedly corrupting outside influences, made a decision around 1450 to adopt an isolationist posture. Existing trans-oceanic trade routes were abandoned, and China retreated behind its Great Wall, leaving Europe to reap the benefits of global trade. By 1913, China’s share of the world’s economy had dropped to 4 percent. The most populous country in the world had become a stagnant backwater in economic terms. So, we can say that Europe’s adoption of an outward-facing posture just as China did the opposite at this critical juncture became one of the great difference-makers in world history.

We can already see in the events of the late fifteenth century the seeds of the great debate over globalization that rages as hotly as ever today. While it’s clear that the developed countries of Europe got a lot out of their trading relationships, it’s far less clear that the less-developed regions of the world benefited to anything like the same extent — or, for that matter, that they benefited at all.

This first era of globalization was the era of colonialism, when developed Europe freely exploited the non-developed world by toppling or co-opting whatever forms of government already existed among its new trading “partners.” The period brought a resurgence of the unholy practice of slavery, along with forced religious conversions, massacres, and the theft of entire continents’ worth of territory. Much later, over the course of the twentieth century, Europe gradually gave up most of its colonies, allowing the peoples of its former overseas possessions their ostensible freedom to build their own nations. Yet the fundamental power imbalances that characterized the colonial period have never gone away. Today the developing world of poor nations trades with the developed world of rich nations under the guise of being equal sovereign entities, but the former still feeds raw materials to the industrial economies of the latter — or, increasingly, developing industrial economies feed finished goods to the post-industrial knowledge economies of the ultra-developed West. Proponents of economic globalization argue that all of this is good for everyone concerned, that it lets each country do what it does best, and that the resulting rising economic tide lifts all their boats. And they argue persuasively that the economic interconnections globalization has brought to the world have been a major contributing factor to the unprecedented so-called “Long Peace” of the last three quarters of a century, in which wars between developed nations have not occurred at all and war in general has become much less frequent.

But skeptics of economic globalism have considerable data of their own to point to. In 1820, the richest country in the world on a per-capita basis was the Netherlands, with an inflation-adjusted average yearly income of $1838, while the poorest region of the world was Africa, with an average income of $415. In 2017, the Netherlands had an average income of $53,582, while the poorest country in the world for which data exists was in, you guessed it, Africa: it was the Central African Republic, with an average income of $681. The richest countries, in other words, have seen exponential economic growth over the last two centuries, while some of the poorest have barely moved at all. This pattern is by no means entirely consistent; some countries of Asia in particular, such as Taiwan, South Korea, Singapore, and Japan, have done well enough for themselves to join the upper echelon of highly-developed post-industrial economies. Yet it does seem clear that the club of rich nations has grown to depend on at least a certain number of nations remaining poor in order to keep down the prices of the raw materials and manufactured goods they buy from them. If the rising tide lifted these nations’ boats to equality with those of the rich, the asymmetries on which the whole world economic order runs today wouldn’t exist anymore. The very stated benefits of globalization carry within them the logic for keeping the poor nations’ boats from rising too high: if everyone has a rich, post-industrial economy, who’s going to do the world’s grunt work? This debate first really came to the fore in the 1990s, slightly after the game of Civilization, as anti-globalization became a rallying cry of much of the political left in the developed world, who pointed out the seemingly inherent contradictions in the idea of economic globalization as a universal force for good.
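For readers who like to see the arithmetic, the gulf those figures describe can be expressed as compound annual growth rates. The sketch below uses only the numbers quoted above (treating the 1820 African regional average as a stand-in starting point for the Central African Republic, which is admittedly a rough approximation):

```python
def cagr(start, end, years):
    """Compound annual growth rate, as a percentage per year."""
    return ((end / start) ** (1 / years) - 1) * 100

years = 2017 - 1820  # 197 years between the two data points

# Netherlands: $1,838 in 1820 to $53,582 in 2017
netherlands = cagr(1838, 53582, years)

# Central African Republic: $415 (1820 African average) to $681 in 2017
car = cagr(415, 681, years)

print(f"Netherlands: {netherlands:.2f}% per year")  # roughly 1.7%
print(f"Central African Republic: {car:.2f}% per year")  # roughly 0.25%
```

Even a seemingly modest 1.7 percent per year, compounded over two centuries, multiplies income nearly thirty-fold, while a quarter of a percent leaves it barely changed. That is what “exponential growth” versus “barely moved at all” looks like in practice.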

Do note that I referred to “economic globalization” there. We should do what we can to separate it from the related concepts of political globalization and cultural globalization, even as the trio can often seem hopelessly entangled in the real world. Still, political globalization, in the form of international bodies like the United Nations and the International Court of Justice, is usually if not always supported by leftist critics of economic globalization.

But cultural globalization is decried to almost an equal degree, being sometimes described as the “McDonaldization” of the world. Once-vibrant local cultures all over the world, goes the claim, are being buried under the weight of a homogenized global culture of consumption being driven largely from the United States. Kids in Africa who have never seen a baseball game rush out to buy the Yankees caps worn by the American rap stars they worship, while gangsters kill one another over Nike sneakers in the streets of China. Developing countries, the anti-globalists say, first get exploited to produce all this crap, then get the privilege of having it sold back to them in ways that further eviscerate their cultural pride.

And yet, as always with globalization, there’s also a flip side. A counter-argument might point out that at the end of the day people have a right to like what they like (personally, I have no idea why anyone would eat a McDonald’s hamburger, but tastes evidently vary), and that cultures have blended with and assimilated one another from the days when ancient Egypt traded with ancient Sumer. Young people in particular in the world of today have become crazily adept at juggling multiple cultures: getting married in a traditional Hindu ceremony on Sunday and then going to work in a smart Western business suit on Monday, listening to Beyoncé on their phone as they bike their way to sitar lessons. Further, the emergence of new forms of global culture, assisted by the magic of the Internet, has already fostered the sorts of global dialogs and global understandings that can help prevent wars; it’s very hard to demonize a culture which has produced some of your friends, or even just creative expressions you admire. As the younger generations who have grown up as members of a sort of global Internet-enabled youth culture take over the levers of power, perhaps they will become the vanguard of a more peaceful, post-nationalist world.

The debate about economic globalization, meanwhile, has shifted in some surprising ways in recent years. Once a cause associated primarily with the academic left, cosseted in their ivory towers, the anti-globalization impulse has now become a populist movement that has spread across the political spectrum in many developed countries of the West. Even more surprisingly, the populist debate has come to center not on globalization’s effect on the poor nations on the wrong side of the power equation but on those rich nations who would seem to be its clear-cut beneficiaries. In just the last couple of years as of this writing, blue-collar workers who feel bewildered and displaced by the sheer pace of an ever-accelerating narrative of progress in an ever more multicultural world were a driving force behind the Brexit vote in Britain and the election of Donald Trump to the presidency of the United States. The understanding of globalization which drove both events was simplistic and confused — trade deficits are no more always a bad thing for any given country than is a national tax deficit — but the visceral anger behind them was powerful enough to shake the established Western world order more than any event since the World Trade Center attack of 2001. It should become more clear in the next decade or so whether, as I suspect, these movements represent a reactionary last gasp of the older generation before the next, more multicultural and internationalist younger generation takes over, or whether they really do herald a more fundamental shift in geopolitics.

As for the game of Civilization: to attempt to glean much more from its simple trading mechanisms than we already have would be to fall into the same trap that ensnared Alan Emrich. A skeptic of globalization might note that the game is written from the perspective of the developed world, and thus assumes that your civilization is among the privileged ranks for whom globalization on the whole has been — sorry, Brexiters and Trump voters! — a clear benefit. This is true even if the name of the civilization you happen to be playing is the Aztecs or the Zulus, peoples for whom globalization in the real world meant the literal end of their civilizations. As such examples prove, the real world is far more complicated than the game makes it appear. Perhaps the best lesson to take away — from the game as well as from the winners and arguable losers of globalization in our own history — is that it really does behoove a civilization to actively engage with the world. Because if it doesn’t, at some point the world will decide to engage with it.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Economics by Paul Samuelson, The Rise and Fall of Ancient Egypt by Toby Wilkinson, Enlightenment Now: The Case for Reason, Science, Humanism, and Progress by Steven Pinker, Global Economic History: A Very Short Introduction by Robert C. Allen, Globalization: A Very Short Introduction by Manfred B. Steger, Taxation: A Very Short Introduction by Stephen Smith, and Guns, Germs, and Steel: The Fates of Human Societies by Jared Diamond.)
