
X-COM

X-COM seemed to come out of nowhere. Its release was not preceded by an enormous marketing campaign with an enormous amount of hype. It had no video demo playing in the front window of Babbages, it wasn’t advertised twelve months in advance on glossy foldout magazine inserts, it had no flashing point-of-purchase kiosks. It didn’t come in a box designed by origamists from the school of abstract expressionism. It featured no full-motion video starring the best TV actors of the 80s; it had no voice-overs. It offered neither Super VGA graphics, nor General MIDI support. It wasn’t Doom-like, Myst-like, or otherwise like a hit game from the previous season; it didn’t steal the best features from several other successful games. It wasn’t even on a CD-ROM!

In short, if you plugged X-COM’s variables into the “success formula” currently in use by the majority of large game companies, you’d come up with a big, fat goose egg. According to the prevailing wisdom, there’s no way X-COM could survive in today’s gaming marketplace. And yet it sold and sold, and gamers played on and on.

— Chris Lombardi, writing in the April 1995 issue of Computer Gaming World

In the early days of game development, there existed little to no separation between the roles of game programmer and game designer. Those stalwart pioneers who programmed the games they themselves designed could be grouped into two broad categories, depending on the side from which they entered the field. There were the technologists, who were fascinated first and foremost with the inner workings of computers, and chose games as the most challenging, creatively satisfying type of software to which they could apply their talents. And then there were those who loved games themselves above all else, and learned to program computers strictly in order to make better, more exciting ones than could be implemented using only paper, cardboard, and the players’ imaginations. Julian Gollop, the mastermind behind the legendary original X-COM, fell most definitely into this latter category. He turned to the computer only when the games he wanted to make left him no other choice.

Growing up in the English county of Essex, Julian and his younger brother Nick lived surrounded by games, courtesy of their father. “Every Christmas, we didn’t watch TV, we’d play games endlessly,” Julian says. From Cluedo, they progressed to Escape from Colditz, then on to the likes of Sniper! and Squad Leader.

Julian turned fifteen in 1980, the year that the Sinclair ZX80 arrived to set off a microcomputer fever all across Britain, but he was initially immune to the affliction. Unimpressed by the simplistic games he saw being implemented on those early machines, which often had as little as 1 K of memory, he started making his own designs to be played the old-fashioned way, face-to-face around a tabletop. It was only when he hit a wall of complexity with one of them that he reassessed the potential of computers.

The game in question was called Time Lords; as the name would imply, it was based on the Doctor Who television serials. It asked two to five players to travel through time and space and alter the course of history to their advantage, but grew so complex that it came to require an additional person to serve in the less-than-rewarding capacity of referee.

By this point, it was 1982, and a friend of Julian’s named Andy Greene had acquired one of the first BBC Micros. Its relatively cavernous 32 K of memory opened up the possibility of using the computer as a referee instead of a bored human. Greene coded up the program in BASIC, staying faithful to Julian’s board game to the extent of demanding that players leave the room when it wasn’t their turn, so as not to see anything they weren’t supposed to of their opponents’ actions. The owner of the tabletop-games store where Julian shopped was so impressed with the result that he founded a new company, Red Shift Games, in order to publish it. They all traveled to computer fairs together, carrying copies of the computerized Time Lords packaged in Ziploc baggies. The game didn’t take the world by storm — Personal Computer News, one of the few publications to review it, pronounced it a “bored game” instead of a board game — but it was a start.

The two friends next made Islandia, another multiplayer strategy game of a similar stripe. In the meantime, Julian acquired a Sinclair Spectrum, the cheap and cheerful little machine destined to drive British computer gaming for the next half-decade. Having now a strong motivation to learn to program it, Julian did just that. His first self-coded game, and his first on the Spectrum, appeared in 1984 in the form of Nebula, a conquer-the-galaxy exercise that for the first time offered a computer opponent to play against.

The artificial intelligence disappeared again from his next game, but it mattered not at all. Rebelstar Raiders was the prototype for Julian Gollop’s most famous work. In contrast to the big-picture strategy of his earlier games, it homed in on individual soldiers in conflict with one another in a Starship Troopers-like science-fictional milieu. Still, it was very much based on the board games he loved; there was a lot of Sniper! and Squad Leader in its turn-based design. Despite being such a cerebral game, despite being one that you couldn’t even play without a mate to hand, it attracted considerable attention. Red Shift faded out of existence shortly thereafter as its owner lost interest in the endeavor, but Rebelstar Raiders had already made Julian’s reputation, such that other publishers were now knocking at his door.

Rebelstar Raiders, the first of Julian Gollop’s turn-based tactical-combat games. Ten years later, the approach would culminate in X-COM.

It must have been a thrill for Julian Gollop the board-game fanatic when Games Workshop, the leading British publisher of hobbyist tabletop games, signed him to make a computer game for their new — if ultimately brief-lived — digital division. Chaos, a spell-slinging fantasy free-for-all ironically based to some extent on a Games Workshop board game known as Warlock — not that Julian told them that! — didn’t sell as well as Rebelstar Raiders, although it has since become something of a cult classic.

So, understandably, Julian went where the market was. Between 1986 and 1988, he produced three more iterations on the Rebelstar Raiders concept, each boasting computer opponents as well as multiplayer options and each elaborating further upon the foundation of its predecessor. Game designers are a bit like authors in some ways. Some authors — like, say, Margaret Atwood — try their hands at a wide variety of genres and approaches, while others — like, say, John Cheever — compulsively sift through the same material in search of new nuggets of insight. Julian became, in the minds of the British public at least, an example of the Cheever type of designer. “It could be said by the cruelest among us that Julian has only ever written one game,” wrote the magazine New Computer Express in 1990, “but has released various substantially enhanced versions of it over the years.”

Of those enhanced versions, Julian published Rebelstar and Rebelstar 2: Alien Encounter through Firebird as a lone-wolf developer, then published Laser Squad through a small outfit known as Blaze Software. Before he made this last game, he founded a company called Target Games — soon to be renamed to the less generic Mythos Games — with his father as silent partner and his brother Nick in an active role; the latter had by now become an accomplished programmer in his own right, in fact surpassing Julian’s talents in that area. In 1990, the brothers made the Chaos sequel Lords of Chaos together in order to prove to the likes of New Computer Express that Julian was at least a two-trick pony. And then came the series of events that would lead to Julian Gollop, whose games were reasonably popular in Britain but virtually unknown elsewhere, becoming one of the acknowledged leading lights of strategy gaming all over the world.



The road to X-COM traveled through the terrain of happenstance rather than any master plan. Julian’s career-defining project started as Laser Squad 2 in spirit and even in name, the next manifestation of his ongoing obsession with small-scale, turn-based, single-unit tactics. The big leap forward this time was to be an isometric viewpoint, adding an element of depth to the battlefield. He and Nick coded a proof of concept on an Atari ST. While they were doing so, Blaze Software disappeared, yet another ephemeral entity in a volatile industry. Now, the brothers needed a new publisher for their latest game.

Both of them had been playing hours and hours of Railroad Tycoon, from the American publisher MicroProse. Knowing that MicroProse had a British branch, they decided to take their demo there first. It was a bold move in its way; as I’ve already noted, their games were popular in their sphere, but had mostly borne the imprints of smaller publishers and had mostly been sold at cheaper price points. MicroProse was a different animal entirely, carrying with it the cachet that still clung in Europe to American games, with their bigger budgets and higher production values. In their quiet English way, the Gollops were making a bid for the big leagues.

Luckily for them, MicroProse’s British office was far more than just a foreign adjunct to the American headquarters. It was a dynamic, creative place in its own right, which took advantage of the laissez-faire attitude of “Wild” Bill Stealey, MicroProse’s flamboyant fly-boy founder, to blaze its own trails. When the Gollops brought in the nascent Laser Squad 2, they were gratified to find that just about everyone at MicroProse UK already knew of them and their games. Peter Moreland, the head of development, was cautiously interested, but with plenty of caveats. For one thing, they would need to make the game on MS-DOS rather than the Atari ST in order to reach the American market. For another, a small-scale tactical-combat game alone wouldn’t be sufficient — wouldn’t be, he said, “MicroProse enough.” After making their name in the 1980s with Wild Bill’s beloved flight simulators, MicroProse was becoming at least as well known in this incipient new decade for grand-strategy games of or in the spirit of their star designer Sid Meier, like the aforementioned Railroad Tycoon and the soon-to-be-released Civilization. The emphasis here was on the “grand.” A Laser Squad 2 just wouldn’t be big enough for MicroProse.

Finally, Moreland wasn’t thrilled by all these far-future soldiers fighting battles in unknown regions of space for reasons that were abstract at best. Who could really relate to any of that? He wanted something more down to earth — literally. Maybe something to do with alien visitors in UFOs… that sort of thing. Julian nodded along, then went home to do some research and refine his proposal.

He quickly learned that he was living in the midst of a fecund period in the peculiar field of UFOlogy. In 1989, a sketchy character named Bob Lazar had given an interview for a Las Vegas television station in which he claimed to have been employed as a civilian contractor at the top-secret Nevada military base known only as Area 51. In that location, so he said, the American Air Force was actively engaged in testing fantastic technologies derived from extraterrestrial visitors. The interview would go down in history as the wellspring of a whole generation of starry-eyed conspiracy theorists, whose outlandish beliefs would soon enter the popular media zeitgeist via such vehicles as the television series The X-Files. When Julian first investigated the subject in 1991, however, UFOs and aliens were still a fairly underground obsession. Nevertheless, he took much from the early lore and legends of Area 51, such as a supposed new chemical element — called ununpentium by Lazar, elerium by the eventual game — which powered the aliens’ spaceships.

His other major source of inspiration was the 1970 British television series entitled simply UFO. In fact, his game would eventually be released as UFO: Enemy Unknown in Europe, capitalizing on the association with a show that a surprising number of people there still remembered. (I’ve chosen to use the American name of X-COM globally in this article because all subsequent games in the franchise would be known all over the world under that name; it has long since become the iconic one.) UFO the television series takes place in the then-near-future of 1980, when aliens are visiting the Earth in ever-increasing numbers, abducting humans and wreaking more and more havoc. An international organization known as SHADO (“Supreme Headquarters Alien Defence Organisation”) has been formed to combat the menace. The show follows the exploits of the SHADO operatives, complete with outlandish “futuristic” costumes and sets and gloriously cheesy special effects. Gollop lifted this basic scenario and moved it to his own near-future: to the year 1999, thus managing to nail not only his decade’s burgeoning obsession with aliens but also its unease about the looming millennium.

The game is divided into two distinct halves — so much so that each half is almost literally an entirely separate game: each unloads itself completely from memory to run a separate executable file at the point of transition, caching on the hard drive before doing so the relatively small amount of state data which its companion needs to access.
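The handoff just described, in which one executable caches a small slice of state to disk, exits, and leaves a companion executable to read that state back as its first act, can be sketched in miniature. Everything here is invented for illustration: the real game used its own binary format rather than JSON, and the field names below are hypothetical.

```python
import json
import os
import tempfile

# Hypothetical slice of shared state the strategic half might hand to the
# tactical half. Field names are invented for illustration.
def save_handoff_state(path, squad, ufo_type, difficulty):
    state = {"squad": squad, "ufo_type": ufo_type, "difficulty": difficulty}
    with open(path, "w") as f:
        json.dump(state, f)

def load_handoff_state(path):
    with open(path) as f:
        return json.load(f)

# In the real game, the strategic executable would terminate at this point
# and a separate tactical executable would launch, reading the cached state
# from the hard drive before presenting the battlefield.
path = os.path.join(tempfile.gettempdir(), "geoscape_handoff.json")
save_handoff_state(path, squad=["Dwight", "Mina"], ufo_type="Scout",
                   difficulty="beginner")
state = load_handoff_state(path)
print(state["ufo_type"])  # → Scout
```

The appeal of the scheme under MS-DOS is that neither half ever needs to fit in memory alongside the other; only the small cached state file bridges them.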

The first part that you see is the strategic level. As the general in charge of the “Extra-Terrestrial Combat Force,” or X-COM — the name was suggested by Stephen Hand and Mike Brunton, two in-house design consultants at MicroProse UK — you must hire soldiers and buy equipment for them; research new technologies, a process which comes more and more to entail reverse-engineering captured alien artifacts in order to use your enemy’s own technology against them; build new bases at strategic locations around the world, as well as improve your existing ones (you start with just one modest base); and send your aircraft out to intercept the alien craft that are swarming the Earth. In keeping with the timeless logic of computer games, the countries of the Earth have chosen to make X-COM, the planet’s one real hope for defeating the alien menace, into a resource-constrained semi-capitalist enterprise; you’ll often need to sell gadgets you’ve manufactured or stolen from the aliens in order to make ends meet, and if you fail to perform well your sponsoring countries will cut their funding.

The “Geoscape” view, where you place your bases and use them to intercept airborne alien attackers. You can find a wealth of discussion online about where best to position your first base — but naturally, most people prefer to put it in their home town. Like the ability to name your individual soldiers, the ability to start right in your own backyard forges a personal connection between the game and its player.

This half of the game was a dizzying leap into uncharted territory for the Gollop brothers. Thankfully, then, they were on very familiar ground when it came to the other half: the half that kicks in when your airborne interceptors force a UFO to land, or when you manage to catch the aliens in the act of terrorizing some poor city, or when the aliens themselves attack one of your bases. Here you find yourself in what amounts to Laser Squad 2 in form and spirit if not in name: an ultra-detailed turn-based single-unit combat simulator, the latest version of a game which Julian Gollop had already made four times before. (Or close enough to it, at any rate: X-COM, the culmination of what had begun with Rebelstar Raiders on the Spectrum, is ironically single-player only, whereas that first game had not just allowed but required two humans to play.) Although the strategic layer sounds far more complex than this tactical layer — and, indeed, it is in certain ways — it’s actually the tactical game where you spend the majority of your time, fighting battles which can consume an entire evening each.

The “Battlescape” view, where tactical combat takes place.

For all their differences, the two halves of the game do interlock in the end as two facets of a whole. Your research efforts, equipment purchases, and hiring practices in the strategic half determine the nature of the force you lead into the tactical man-against-alien battles. Less obviously but just as significantly, your primary reward for said battles proves to be the recovery of alien equipment, alien corpses, and even live alien specimens (all is fair in love and genocidal interplanetary war), which you cart back to your bases to place at the disposal of your research teams. And so the symbiotic relationship continues: your researchers use what you recover as grist for their mill, which lets you go into tougher battles with better equipment to hand, thereby to bring back still richer spoils.

The capsule description of the finished game which I’ve just provided mirrors almost perfectly the proposal which Julian Gollop delivered to MicroProse; the design would change surprisingly little in the process of development. MicroProse thought it sounded just fine as-is.



The contract which the Gollops signed with MicroProse specified that the former would be responsible for all of the design and coding, while the latter would provide the visual and audio assets. MicroProse UK did hold up their end of the bargain, but had an oddly casual attitude toward the project in general. Julian remembers their producer as “very laid back — he would come over once a month, we would go to the pub, talk about the game for a bit, and he would go home.” Otherwise, the Gollops worked largely alone after their first rush of consultations with the MicroProse mother ship had faded into the past. Time dragged on and on while they struggled with this massively complicated game, one half of which was unlike anything they had ever even contemplated before.

X-COM’s UFOpaedia is a direct equivalent to Civilization’s innovative Civilopedia, its most obvious single nod to Sid Meier’s equally influential but very, very different game.

As it did so, much happened in the broader world of MicroProse. On the positive side, Sid Meier’s Civilization was released at the end of 1991. But despite this and some other success stories, MicroProse’s financial foundation was growing ever more shaky, as their ambitions outran their core competencies. The company lost millions on an ill-judged attempt to enter the stand-up arcade market, then lost millions more on baroque CRPGs and flashy interactivity-lite adventure games. After an IPO that was supposed to bail them out went badly off the rails, Wild Bill Stealey sold out in June of 1993 to Spectrum Holobyte, another American publisher. The deal seemed to make sense: Spectrum Holobyte had a lot of money, thanks not least to generous venture capitalists, but a rather thin portfolio of games, while MicroProse had a lot of games both out and in the pipeline but had just about run out of money.

Spectrum Holobyte sifted carefully through their new possession’s projects in development, passing judgment on which were potential winners and which certain losers. According to Julian Gollop, Spectrum Holobyte told MicroProse UK in no uncertain terms to cancel X-COM. On the face of it, it wasn’t an unreasonable point of view to take. The Gollops had been working for almost two years by this point, and still had few concrete results to show for their efforts. It really did seem that they were hopelessly out of their depth. Luckily for them, however, Peter Moreland and others in the British office still believed in them. They nodded along with the order to bin X-COM, then quietly kept the project on the books. At this point, it didn’t cost them much of anything to do so; the art was already done, and now it was up to the Gollops to sink or swim with it.

X-COM bobbed up to the surface six months later, when the new, allegedly joint management team — Stealey would soon leave the company, feeling himself to have been thoroughly sidelined — started casting about for a game to feature in Europe in the first quarter of 1994, thereby to make the accountants happy. Peter Moreland piped up sheepishly: “You remember that UFO project you told us to cancel? Well, it’s actually still kicking around…” And so the Gollop brothers, who had been laboring under strangely little external pressure for the past 26 months or so, were now ordered to get their game done already. They managed it, just — UFO: Enemy Unknown shipped in Europe in March of 1994 — but some of the problems in the finished game definitely stem from the deadline that was so arbitrarily imposed from on high.

But if the game could have used a few more months in the oven, it nonetheless shipped in better condition than many other MicroProse games had during the recent stretch of financial difficulties. It garnered immediate rave reviews, while its sales also received a boost from another source. The first episode of The X-Files had aired the previous September in the United States, followed by airings across Europe. Just like that, a game about hostile alien visitors seemed a lot more relevant. Indeed, the game possessed much the same foreboding atmosphere as the show, from its muted color palette to MicroProse composer John Broomhall’s quietly malevolent soundtrack, which he had created in just two months in the final mad rush up to the release deadline. He couldn’t have done a better job if he’d had two years.

X-COM: UFO Defense shipped a few months later in North America, into a cultural zeitgeist that was if anything even more primed for it. Computer Gaming World, the American industry’s journal of record, gave it five stars out of five, and its sales soared well into the six digits. As the quote that opened this article attests, X-COM was in many ways the antithesis of what most publishers believed constituted a hit game in the context of 1994. Its graphics were little more than functional; it had no full-motion video, no real-time 3D rendering, no digitized voices; it fit perfectly well on a few floppy disks, thank you very much, with no need for any new-fangled CD-ROM drive. And yet it sold better than the vast majority of those other “cutting-edge” games. Many took its success as a welcome sign that gaming hadn’t yet lost its soul completely — that good old-fashioned gameplay could still trump production values from time to time.



The original X-COM’s reputation has only grown more hallowed in the years since its release. It’s become a perennial on best-games-of-all-time lists, even ones whose authors weren’t yet born at the time of its release. For this is a game, so we’re told, that transcends its archaic presentation, that absolutely any student of game design needs to play.

That’s rather ironic in that X-COM is a game that really shouldn’t work at all according to many of the conventional rules of design. For example, it’s one of the most famous of all violators of what’s become known as the Covert Action Rule, as formulated by Sid Meier and named after one of his own less successful designs. The rule states that pacing is as important in a strategy game as it is in any other genre, that “mini-games” which pull the player away from the overarching strategic view need to be short and to the point, as is the case in Meier’s classic Pirates!. If they drag on too long, Meier tells us, the player loses focus on the bigger picture, forgets what she’s been trying to accomplish there, gets pulled out of that elusive state of “flow.”

But, as I already noted, X-COM’s tactical battles can drag on for an hour or two at a time — and no one seems to be bothered by this at all. What gives?

By way of an answer to that question, I would first note that the Covert Action Rule is, like virtually all supposedly hard-and-fast rules of game design, riddled with caveats and exceptions. (Personally, I don’t even agree that violating the yet-to-be-formulated Covert Action Rule was the worst problem of Covert Action itself.) And I would also note that X-COM does at least a couple of things extraordinarily well as compensation, better than any strategy game that came before it. Indeed, one can argue that no earlier grand-strategy game even attempted to do these things — not, at least, to anything like the same extent. Interestingly, both inspired strokes are borrowed from other gaming genres.

The first is the intriguing mystery surrounding the aliens, which is peeled back layer by layer as you progress. As your scientists study the equipment and alien corpses brought back from the battle sites and interrogate the live aliens your soldiers have captured, you learn more and more about where your enemies come from and what motivates them to attack the Earth so relentlessly. It doesn’t take long to reach a point where you look forward to the next piece of this puzzle as excitedly as you do the next cool gun or piece of armor. By the time the whole experience culminates in a desperate attack on the aliens’ home base, you’re all in. Granted, a byproduct of this sense of unfolding discovery is that you may not feel like revisiting the game after you win; for many or most of us, this is a strategy game to play through once rather than over and over again. But on the other hand, considering the fifty hours or more it will take you to get through it once, it’s hard to complain overmuch about that fact. Needless to say, when you do play it for the first time you should meticulously avoid spoilers about What Is Really Going On Here.

Learning more about the alien invaders via an autopsy. The game was ahead of its time; the year after X-COM’s release, at the height of the X-Files-fueled UFO craze, the Fox television channel would broadcast Alien Autopsy: Fact or Fiction? in the United States. (For the record, it was most assuredly the latter.)

X-COM’s other, even more brilliant stroke is the sense of identification it builds between you and the soldiers you send into battle. Each soldier has unique strengths and weaknesses, forcing you to carefully consider the role she plays in combat: a burly, fearless character who can carry enough weaponry to outfit your average platoon but couldn’t hit the proverbial broad side of a barn must be handled in a very different way from a slender, nervous sharpshooter. As your soldiers (hopefully) survive missions, their skills improve, CRPG-style. Thus you have plenty of practical reasons to be more loath to lose a seasoned veteran than a greenhorn fresh out of basic training. And yet this purely zero-sum calculus doesn’t fully explain why each mission is so nail-bitingly tense, so full of agonizing decisions balancing risk against reward.
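The stat-driven roles described above amount to a simple record per soldier plus a rule that nudges the numbers upward after each survived mission. This sketch is purely illustrative: the attribute names loosely echo the stats the game displays, but the specific values and the improvement rule here are invented.

```python
from dataclasses import dataclass

# Hypothetical soldier record; fields loosely echo X-COM's visible stats,
# but the numbers and the growth rule are invented for illustration.
@dataclass
class Soldier:
    name: str
    time_units: int       # how much a soldier can do per turn
    strength: int         # governs how much gear she can carry
    firing_accuracy: int
    bravery: int
    missions: int = 0

    def debrief(self, survived: bool) -> None:
        # CRPG-style progression: veterans slowly get better, which is
        # exactly why losing one stings more than losing a rookie.
        if survived:
            self.missions += 1
            self.firing_accuracy += 1

heavy = Soldier("Ivan", time_units=50, strength=40, firing_accuracy=45, bravery=80)
sniper = Soldier("Mina", time_units=60, strength=25, firing_accuracy=75, bravery=40)
sniper.debrief(survived=True)
print(sniper.firing_accuracy)  # → 76
```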

One of X-COM’s most defining design choices is also one of its simplest: it lets you name each soldier for yourself. As you play, you form a picture of each of them in your imagination, even though the game itself never describes any of them to you as anything other than a list of numbers. Losing a soldier who’s been around for a while feels weirdly like losing a genuine acquaintance. For here too you can’t help but embellish the thin scaffolding of fact the game provides with your own story of what happened: the grizzled old-timer who went out one time too many, whose nerves just couldn’t handle another firefight; the foolhardy, testosterone-addled youth who threw himself into every battle like he was indestructible, until one day he wasn’t. X-COM provides the merest glimpse of what it must feel like to be an actual commander in war: the overwhelming stress of having the lives of others hanging on your decisions, the guilty second-guessing that inevitably goes on when you lose someone. It has something that games all too often lack: a sense of consequences for your actions. Theoretically at least, the best way to play it is in iron-man mode: no saving and restoring to fix bad outcomes, dead is dead, own your decisions as commander.

Beginning with just a name you choose for yourself and a handful of statistics which the game provides, your imagination will conjure a whole personality for each of your soldiers. Dwight here, for example, likes guitars, Cadillacs, and hillbilly music.

In one of those strange concordances that tend to crop up in many creative fields, X-COM wasn’t the only strategy game of 1994 to bring in CRPG elements to great effect. Ironically, these innovations occurred just as the CRPG genre itself was in its worst doldrums since Ultima I and Wizardry I had first brought it to prominence. Today, even as the CRPG has long since regained its mojo as a gaming genre, CRPG elements have become the special sauce ladled over a wide array of other types of games. X-COM was among the first to show how tasty the end result could be.

I have to say, however, that I find other elements of X-COM less appetizing, and that its strengths don’t quite overcome its weaknesses in my mind sufficiently to win it a place on my personal list of best games ever. My first stumbling block is the game’s learning curve, which is not just steep but unnecessarily so. I’d like to quote Garth Deangelis, who led the team that created XCOM: Enemy Unknown, the critically acclaimed franchise reboot that was released in 2012:

While [the original X-COM] may have been magnificent, it was also a unique beast when it came to beginning a new game. We often joked that the diehards who mastered the game independently belonged to an elite club because by today’s standards the learning curve was like climbing Mount Everest.

As soon as you fire up the original, you’re placed in a Geoscape with the Earth silently looming, and various options to explore within your base — including reading (unexplained) financial reports, approving manufacturing requests (without any context as to what those would mean later on), and examining a blueprint (which hinted at the possibility for base expansion), for example — the player is given no direction.

Even going on your first combat mission can be a bit of a mystery (and when you first step off the Skyranger, the game will kill off a few of your soldiers before you even see your first alien — welcome to X-COM!).

There’s certainly a place for complex games, and complexity will always come complete with a learning curve of some sort. But, again, X-COM’s curve is just unnecessarily steep. Consider: when you begin a new game, you have two interceptors already in your hangar for bringing down UFOs. Fair enough. Unfortunately, they come equipped with sub-optimal Stingray missiles and borderline-useless cannon. So, one of the first tasks of the experienced player becomes to requisition some more advanced Avalanche missiles, put them on her interceptors, and sell off the old junk. Why can the game not just start you off with a reasonable weapons load-out? A similar question applies to the equipment carried by your individual soldiers, as it does to the well-nigh indefensible layout of your starting base itself, which makes it guaranteed to fall to the first squad of marauding aliens who come calling. The new player is likely to assume, reasonably enough, that the decisions the game has already made for her are good ones. She finds out otherwise only by being kicked in the teeth as a result of them. This is not good game design. The impression created is of a game that is not tough but fair, but rather actively out to get her.

Your starting base layout. By no means should you assume that this is a defensible one. In fact, many players spend a lot of money at the very beginning ripping it up completely and starting all over again. Why should this be necessary?

You’ll never use a large swath of the subpar weapons and equipment included in X-COM, which rather begs the question of what they’re doing in there. The game could have profited greatly from an editor empowered to pare back all of this extraneous nonsense and home in on its core appeal. Likewise, the user interface in the strategic portion operates on the principle that, if one mouse click is good, ten must be that much better; everything is way more convoluted than it needs to be. Just buying and selling equipment is agonizing.

The tactical game’s interface is also dauntingly complex, but does have somewhat more method to its madness, being the beneficiary of all of Julian Gollop’s earlier experience with this sort of game. Still, even tactical combat, so widely and justly lauded as the beating heart of X-COM, is not without its frustrations. Certainly every X-COM player is all too familiar with the last-alien-on-the-map syndrome, where you sometimes have to spend fifteen or twenty minutes methodically hunting the one remaining enemy, who’s hunkered down in some obscure corner somewhere. The nature of the game is such that you can’t relax even in these situations; getting careless can still get one or more of your precious soldiers killed before you even realize what’s happening. But, although perhaps a realistic depiction of war, this part of the game just isn’t much fun. The problem is frustrating not least because it’s so easily soluble: just have the remaining aliens commit suicide to avoid capture — something entirely in keeping with their nature — when their numbers get too depleted.

All of these niggling problems mark X-COM as the kind of game I have to rant about here all too often: the kind that was never actually played before its release. For all its extended development time, it still needed a few more months filled with play-testing and polishing to reach its full potential. X-COM‘s most infamous bug serves as a reminder of just how little of either it got: its difficulty levels are broken. If you select something other than the “beginner” difficulty, it reverts back to the easiest level after the first combat mission. In one sense, this is a blessing: the beginner difficulty is more than difficult enough for the vast majority of players. In another sense, though… how the heck could something as basic as that be overlooked? There’s only one way that I can see: if you barely played the game at all before you put it in a box and shipped it out the door.

To his credit, Julian Gollop himself is well aware of these issues and freely acknowledges them — does so much more freely in fact than some of his game’s biggest fans. He notes the influence of vintage Avalon Hill and SPI board games, some of which were so demanding that just being able to play them at all — never mind playing them well — was an odd sort of badge of honor for the grognards of the 1970s and early 1980s. He would appear to agree with me that there’s a bit too much of their style of complexity-for-its-own-sake in X-COM:

I believe that a good game may have relatively simple rules, but have complex situations arise from them. Strategy games tend to do that very well, you know — even the simplest ones are very good at that. I think it’s possible to have an accessible game which doesn’t have amazingly complex rules, but still has a kind of emerging complexity within what happens — you know, what players do, what players explore. For me, that’s the Holy Grail of game design. So, I don’t think that I would probably go back to making games as complex as [X-COM].

Like poets, game designers often simplify their work as they age, the better to capture the real essence of what they’re trying to express.



But whatever their final evaluation of the first game, most players then and now would agree that few franchises have been as thoroughly botched by their trustees as X-COM was afterward. When the first X-COM became an out-of-left-field hit, MicroProse UK, who had great need of hits at the time to impress the Spectrum Holobyte brass, wanted the Gollops to provide a sequel within a year. Knowing that that amount of time would allow them to do little more than reskin the existing engine, they worked out a deal: they would give their publisher their source code and let them make a quickie sequel in-house, while they themselves developed a more ambitious sequel for later release.

The in-house MicroProse project became 1995’s X-COM: Terror from the Deep, which posited that, forty years after their defeat at the end of the first game, the aliens have returned to try again. The wrinkle this time is that they’ve set up bases under the Earth’s oceans, which you must attack and eradicate. Unfortunately, Terror from the Deep does little to correct the original’s problems; if anything, it makes them worse. Most notably, it’s an even more difficult game than its predecessor, a decision that’s hard to understand on any level. Was anyone really complaining that X-COM was too easy? All in all, Terror from the Deep is exactly the unimaginative quickie sequel which the Gollops weren’t excited about having to make.

Nevertheless, it’s arguably the best of the post-original, pre-reboot generation of X-COM games. X-COM: Apocalypse, the Gollops’ own sequel, was a project on a vastly greater scale than the first two X-COM games, a scale to which they themselves struggled to adapt. It was riven by bureaucratic snafus and constant conflict between developer and publisher, and the resulting process of design-by-fractious-committee turned it into a game that did a lot of different things — turn-based and real-time combat in the very same game! — but did none of them all that well, nor even looked all that good whilst doing them. Julian Gollop today calls it “the worst experience of my entire career” and “a nightmare.” He and Nick cut all ties with MicroProse after its 1997 release.

After that, MicroProse lost the plot entirely, stamping the X-COM label onto games that had virtually nothing in common with the first one. X-COM: Interceptor (1998) was a space simulator in the mode of Wing Commander; Em@il Games: X-COM (1999) was a casual multiplayer networked affair; X-COM: Enforcer (2001) was a mindless shoot-em-up. This last proved to be the final straw; the X-COM name disappeared for the next eleven years, until XCOM: Enemy Unknown, the reboot by Firaxis Games.

If you ask me, said reboot is in absolute terms a better game than the original, picking up on almost all of its considerable strengths while eliminating most of its weaknesses. But it cannot, of course, lay claim to the same importance in the history of gaming. Despite its flaws, the original X-COM taught designers to personalize strategy games, showed them how to raise the emotional stakes in a genre previously associated only with cool calculation. For that reason, it richly deserves its reputation as one of the most important games of its era.

(Sources: the book Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene; Amstrad Action of October 1989; Computer Gaming World of August 1994, September 1994, April 1995, and July 1995; Crash of Christmas 1988 and May 1989; Game Developer of April 2013; Retro Gamer 13, 68, 81, 104, 106, 112, and 124; Amiga Format of December 1989, June 1994, and November 1994; Computer and Video Games of December 1988; Games TM 46; New Computer Express of September 15 1990; Games Machine of July 1988; Your Sinclair of August 1990 and September 1990; Personal Computer News of July 21 1983. Online sources include Julian Gollop’s X-COM postmortem from the 2013 Game Developers Conference, “The Story of X-COM” at EuroGamer, and David Jenkins’s interview with Julian Gollop at Metro.

The original X-COM is available for digital purchase at GOG.com, as are most of the other X-COM games mentioned in this article.)

 

Posted by on September 18, 2020 in Digital Antiquaria, Interactive Fiction

 


Master of Orion

 

Given the shadow which the original Master of Orion still casts over the gaming landscape of today, one might be forgiven for assuming, as many younger gamers doubtless do, that it was the very first conquer-the-galaxy grand-strategy game ever made. The reality, however, is quite different. For all that its position of influence is well-earned for other very good reasons, it was already the heir to a long tradition of such games at the time of its release in 1993. In fact, the tradition dates back to well before computer games as we know them today even existed.

The roots of the strategic space opera can be traced back to the tabletop game known as Diplomacy, designed by Allan B. Calhamer and first published in 1959 by Avalon Hill. Taking place in the years just prior to World War I, it put seven players in the roles of leaders of the various “great powers” of Europe. Although it included a playing board, tokens, and most of the other accoutrements of a typical board game, the real action, at least if you were playing it properly, was entirely social, in the alliances that were forged and broken and the shady deals that were struck. In this respect, it presaged many of the ideas that would later go into Dungeons & Dragons and other role-playing games. It thus represents an instant in gaming history as seminal in its own way as the 1954 publication of Avalon Hill’s Tactics, the canonical first tabletop wargame and the one which touched off the hobby of experiential gaming in general. But just as importantly for our purposes, Diplomacy‘s shifting alliances and the back-stabbings they led to would become an essential part of countless strategic space operas, including Master of Orion 34 years later.

Because getting seven friends together in the same room for the all-day affair that was a complete game of Diplomacy was almost as hard in the 1960s as it is today, inventive gamers developed systems for playing it via post; the first example of this breed would seem to date from 1963. And once players had started modifying the rules of Diplomacy to make it work under this new paradigm, it was a relatively short leap to begin making entirely new play-by-post games with new themes which shared some commonalities of approach with Calhamer’s magnum opus.

Thus in December of 1966, Dan Brannon announced a play-by-post game called Xeno, whose concept sounds very familiar indeed in the broad strokes. Each player started with a cluster of five planets — a tiny toehold in a sprawling, unknown galaxy waiting to be colonized. “The vastness of the playing space, the secrecy of the identity of the other players, the secrecy of the locations of ships and planets, the total lack of information without efforts of investigation, all these factors are meant to create the real problems of a race trying to expand to other planets,” wrote Brannon. Although the new game would be like Diplomacy in that it would presumably still culminate in negotiations, betrayals, and the inevitable final war to determine the ultimate victor, these stages would now be preceded by those of exploration and colonization, until a galaxy that had seemed so unfathomably big at the start proved not to be big enough to accommodate all of its would-be space empires. Certainly all of this too will be familiar to any player of Master of Orion or one of its heirs. Brannon’s game even included a tech tree of sorts, with players able to acquire better engines, weapons, and shields for their ships every eight turns they managed to survive.

In practice, Xeno played out at a pace to which the word “glacial” hardly does justice. The game didn’t really get started until September of 1967, and by a year after that just three turns had been completed. I don’t know whether a single full game of it was ever finished. Nevertheless, it proved hugely influential within the small community of experiential-gaming fanzines and play-by-post enthusiasts. The first similar game, called Galaxy and run by H. David Montgomery, had already appeared before Xeno had processed its third turn.

But the idea was, literally and figuratively speaking, too big for the medium for which it had been devised; it was just too compelling to remain confined to those few stalwart souls with the patience for play-by-post gaming. It soon branched out into two new mediums, each of which offered a more immediate sort of satisfaction.

In 1975, following rejections from Avalon Hill and others, one Howard Thompson formed his own company to publish the face-to-face board game Stellar Conquest, the first strategic space opera to appear in an actual box on store shelves. When Stellar Conquest became a success, it spawned a string of similar board games with titles like Godsfire, Outreach, Second Empire, and Starfall during this, the heyday of experiential gaming on the tabletop. But the big problem with such games was their sheer scope and math-heavy nature, which were enough to test the limits of many a salty old grognard who usually reveled in complexity. They all took at least three or four hours to play in their simplest variants, and a single game of at least one of them — SPI’s Outreach — could absorb weeks of gaming Saturdays. Meanwhile they were all dependent on pages and pages of fiddly manual calculations, in the time before spreadsheet macros or even handheld calculators were commonplace. (One hates to contemplate the plight of the Outreach group who have just spent the last two months resolving who shall become master of the galaxy, only to discover that the victor made a mistake on her production worksheet back on the second turn which invalidated all of the numbers that followed…) These games were, in other words, crying out for computerization.

Luckily, then, that too had already started to happen by the end of the 1970s. One of the reasons that play-by-post games of this type tended to run so sluggishly — beyond, that is, the inherent sluggishness of the medium itself — came down to the same problem as that faced by their tabletop progeny: the burden their size and complexity placed on their administrators. Therefore in 1976, Rick Loomis, the founder of a little company called Flying Buffalo, started running the commercial play-by-post game Starweb on what gaming historian Shannon Appelcline has called “probably the first computer ever purchased exclusively to play games” (or, at least, to administrate them): a $14,000 Raytheon 704 minicomputer. He would continue to run Starweb for more than thirty years — albeit presumably not on the same computer throughout that time.

But the first full-fledged incarnation of the computerized strategic space opera — in the sense of a self-contained game meant to be played locally on a single computer — arrived only in 1983. Called Reach for the Stars, it was the first fruit of what would turn into a long-running and prolific partnership between the Aussies Roger Keating and Ian Trout, who in that rather grandiose fashion that was so typical of grognard culture had named themselves the Strategic Studies Group. Reach for the Stars was based so heavily upon Stellar Conquest that it’s been called an outright unlicensed clone. Nevertheless, it’s a remarkable achievement for the way that it manages to capture, on 8-bit Apple IIs and Commodore 64s with just 64 K of memory, that sense of size and scope that is such a huge part of these games’ appeal. Although the whole is necessarily rather bare-bones compared to what would come later, the computer players’ artificial intelligence, always a point of pride with Keating and Trout, is surprisingly effective; on the harder difficulty level, the computer can truly give you a run for your money, and seems to do so without relying solely on egregious cheating.

It doesn’t look like much, but the basic hallmarks of the strategic space opera are all there in Reach for the Stars.

Reach for the Stars did very well, prompting updated ports to more powerful machines like the Apple Macintosh and IIGS and the Commodore Amiga as the decade wore on. A modest trickle of other boxed computer games of a similar stripe also appeared, albeit none which did much to comprehensively improve on SSG’s effort: Imperium Galactum, Spaceward Ho!, Armada 2525, Pax Imperia. Meanwhile the commercial online service CompuServe offered up MegaWars III, in which up to 100 players vied for control of the galaxy; it played a bit like one of those years-long play-by-post campaigns of yore compressed into four to six weeks of constant — and expensive, given CompuServe’s hourly dial-up rates — action and intrigue. Even the shareware scene got in on the act, via titles like Anacreon: Reconstruction 4021 and the earliest versions of the cult classic VGA Planets, a game which is still actively maintained and played to this day. And then, finally, along came Master of Orion in 1993 to truly take this style of game to the next level.

Had things gone just a little bit differently, Master of Orion too might have been a shareware release. It was designed in the spare time of Steve Barcia, an electrical engineer living in Austin, Texas, and programmed by Steve himself, his wife Marcia Barcia, and their friend Ken Burd. Steve claims never to have played any of the computer games I’ve just mentioned, but, as an avid and longtime tabletop gamer, he was very familiar with Stellar Conquest and a number of its successors. (No surprise there: Howard Thompson and his game were in fact also products of Austin’s vibrant board-gaming scene.)

After working on their computer game, which they called Star Lords, on and off for years, the little band of hobbyist programmers submitted it to MicroProse, whose grand-strategy game of Civilization, a creation of their leading in-house designer Sid Meier, had just taken the world by storm. A MicroProse producer named Jeff Johannigman — himself another member of the Austin gaming fraternity, as it happened, one who had just left Origin Systems in Austin to join MicroProse up in Baltimore — took a shine to the unpolished gem and signed its creators to develop it further. Seeing their hobby about to become a real business, the trio quit their jobs, took the name of SimTex, and leased a cramped office above a gyro joint to finish their game under Johannigman’s remote supervision, with a little additional help from MicroProse’s art department.

A fellow named Alan Emrich was one of the most prominent voices in strategy-game criticism at the time; he was the foremost scribe on the subject at Computer Gaming World magazine, the industry’s accepted journal of record, and had just published a book-length strategy guide on Civilization in tandem with Johnny Wilson, the same magazine’s senior editor. Thanks to that project, Emrich was well-connected with MicroProse, and was happy to serve as a sounding board for them. And so, one fateful day very early in 1993, Johannigman asked if he’d like to have a look at a new submission called Star Lords.

As Emrich himself puts it, his initial impressions “were not that great.” He remembers thinking the game looked like “something from the late 1980s” — an eternity in the fast-changing computing scene of the early 1990s. Yet there was just something about it; the more he played, the more he wanted to keep playing. So, he shared Star Lords with his friend Tom Hughes, with whom he’d been playing tabletop and computerized strategy games for twenty years. Hughes had the same experience. Emrich:

After intense, repeated playing of the game, Tom and I were soon making numerous suggestions to [Johannigman], who, in turn, got tired of passing them on to the designer and lead programmer, Steve Barcia. Soon, we were talking to Steve directly. The telephone lines were burning regularly and a lot of ideas went back and forth. All the while, Steve was cooking up a better and better game. It was during this time that the title changed to Master of Orion and the game’s theme and focus crystallized.

I wrote a sneak preview for Computer Gaming World magazine where I indicated that Master of Orion was shaping up to be a good game. It had a lot of promise, but I didn’t think it was up there with Sid Meier’s Civilization, the hobby’s hallmark of strategy gaming at that time. But by the time that story hit the newsstands, I had changed my mind. I found myself still playing the game constantly and was reflecting on that fact when Tom called me. We talked about Master of Orion, of course, and Tom said, “You know, I think this game might become more addicting even than Civilization.” I replied, “You know, I think it already is.”

I was hard on Emrich in earlier articles for his silly assertion that Civilization‘s inclusion of global warming as a threat to progress and women’s suffrage as a Wonder of the World constituted some form of surrender to left-wing political correctness, as I was for his even sillier assertion that the game’s simplistic and highly artificial economic model could somehow be held up as proof for the pseudo-scientific theory of trickle-down economics. Therefore let me be very clear in praising him here: Emrich and Hughes played an absolutely massive role in making Master of Orion one of the greatest strategy games of all time. Their contribution was such that SimTex took the unusual step of adding to the credits listing a “Special Thanks to Alan Emrich and Tom Hughes for their invaluable design critiquing and suggestions.” If anything, that credit errs on the side of understatement. By all indications, a pair of full-fledged co-designer credits wouldn’t have been out of proportion to the reality of their contribution. The two would go on to write the exhaustive official strategy guide for the game, a tome numbering more than 400 pages. No one could have been more qualified to tackle that project.

As if all that wasn’t enough, Emrich did one more great service for Master of Orion and, one might even say, for gaming in general. In a “revealing sneak preview” of the game, published in the September 1993 issue of Computer Gaming World, he pronounced it to be “rated XXXX.” After the requisite measure of back-patting for such edgy turns of phrase as these, Emrich settled down to explain what he really meant by the label: “XXXX” in this context stood for “EXplore, EXpand, EXploit, and EXterminate.” And thus was a new sub-genre label born. The formulation from the article was quickly shortened to “4X” by enterprising gamers uninterested in making strained allusions to pornographic films. In that form, it would be applied to countless titles going forward, right up to the present day, and retroactively applied to countless titles of the past, including all of the earlier space operas I’ve just described as well as the original Civilization — a game to which the “EXterminate” part of the label fits perhaps less well, but such is life.

Emrich’s article also creates an amusing distinction for the more pedantic ludic taxonomists and linguists among us. Although Master of Orion definitely was not, as we’ve now seen at some length, the first 4X game in the abstract, it was the very first 4X game to be called a 4X game. Maybe this accounts for some of the pride of place it holds in modern gaming culture?

However that may be, though, the lion’s share of the credit for Master of Orion‘s enduring influence must surely be ascribed to what a superb game it is in its own right. If it didn’t invent the 4X space opera, it did in some sense perfect it, at least in its digital form. It doesn’t do anything conceptually new on the face of it — you’re still leading an alien race as it expands through a randomly created galaxy, competing with other races in the fields of economics, technology, diplomacy, and warfare to become the dominant civilization — but it just does it all so well.

A new game of Master of Orion begins with you choosing a galaxy size (from small to huge), a difficulty level (from simple to impossible), and a quantity of opposing aliens to compete against (from one to five). Then you choose which specific race you would like to play; you have ten possibilities in all, drawing from a well-worn book of science-fiction tropes, from angry cats in space to hive-mind-powered insects, from living rocks to pacifistic brainiacs, alongside the inevitable humans. Once you’ve made your choice, you’re cast into the deep end — or rather into deep space — with a single half-developed planet, a colony ship for settling a second planet as soon as you find a likely candidate, two unarmed scout ships for exploring for just such a candidate, and a minimal set of starting technologies.

You must parlay these underwhelming tools into galactic domination hundreds of turns later. You can take the last part of the 4X tag literally and win out by utterly exterminating all of your rivals, but a slightly less genocidal approach is a victory in the “Galactic Council” which meets every quarter-century (i.e., every 25 turns). Here everyone can vote on which of the two currently most populous empires’ leaders they prefer to appoint as ruler of the galaxy, with “everyone” in this context including the two leading emperors themselves. Each empire gets a number of votes determined by its population, and the first to collect two-thirds of the total vote wins outright. (Well, almost… it is possible for you to refuse to respect the outcome of a vote that goes against you, but doing so will cause all of your rivals to declare immediate and perpetual war against you, whilst effectively pooling all of their own resources and technology. Good luck with that!)
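The arithmetic of that two-thirds threshold can be sketched in a few lines of code. This is only a toy illustration of the rule as described above — the function, the empire names, and the tie/abstention handling are my own assumptions, not the game’s actual logic:

```python
# Toy sketch of a Galactic Council-style vote. Each empire casts a
# number of votes proportional to its population; a candidate wins
# only by reaching two-thirds of ALL possible votes. (Illustrative
# assumptions only -- not Master of Orion's real code or exact rules.)

def council_winner(populations, ballots):
    """populations: empire name -> votes it may cast.
    ballots: empire name -> candidate it votes for (None = abstain).
    Returns the winner, or None if nobody reaches two-thirds."""
    total = sum(populations.values())
    tally = {}
    for voter, choice in ballots.items():
        if choice is not None:
            tally[choice] = tally.get(choice, 0) + populations[voter]
    for candidate, votes in tally.items():
        if 3 * votes >= 2 * total:  # integer form of votes >= 2/3 * total
            return candidate
    return None

pops = {"Humans": 120, "Psilons": 90, "Mrrshans": 60, "Bulrathi": 30}
votes = {"Humans": "Humans", "Psilons": "Humans",
         "Mrrshans": "Humans", "Bulrathi": "Bulrathi"}
print(council_winner(pops, votes))  # prints: Humans (270 of 300 votes)
```

Note that because abstentions still count toward the total, a bloc of non-voters can deny everyone the supermajority — which is why, in practice, those endgame diplomacy campaigns matter so much.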

A typical game of Master of Orion plays out over three broad stages. The first stage is the land grab, the wide-open exploration and colonization phase that happens before you meet your rival aliens. Here your challenge is to balance the economic development of your existing planets against your need to settle as many new ones as possible to put yourself in a good position for the mid-game. (When exactly do I stop spending my home planet’s resources on improving its own infrastructure and start using them to build more colony ships?) The mid-game begins when you start to bump into your rivals, and comes to entail much jockeying for influence, as the various races begin to sort themselves into rival factions. (The Alkaris, bird-like creatures, loathe the Mrrshans, the aforementioned race of frenzied pussycats, and their loathing is returned in kind. I don’t have strong feelings about either one — but whose side would it most behoove me to choose from a purely strategic perspective?) The endgame is nigh when there is no more room for anyone to expand, apart from taking planets from a rival by force, and the once-expansive galaxy suddenly seems claustrophobic. It often, although by no means always, is marked by a massive war that finally secures somebody that elusive two-thirds majority in the Galactic Council. (I’m so close now! Do I attack those stubbornly intractable Bulrathi to try to knock down their population and get myself over the two-thirds threshold that way, or do I keep trying to sweet-talk and bribe them into voting for me?) The length and character of all of these stages will of course greatly depend on the initial setup you chose; the first stage might be all but nonexistent in a small galaxy with five rivals, while it will go on for a long, long time indeed in a huge galaxy with just one or two opponents. (The former scenario is, for the record, far more challenging.)

And that’s how it goes, generally speaking. Yet the core genius of Master of Orion actually lies in how resistant it is to generalization. It’s no exaggeration to say that there really is no “typical” game; I’ve enjoyed plenty which played out in nothing like the pattern I’ve just described for you. I’ve played games in which I never fired a single shot in anger, even ones where I’ve never built a single armed ship of war, just as I’ve played others where I was in a constant war for survival from beginning to end. Master of Orion is gaming’s best box of chocolates; you never know what you’re going to get when you jump into a new galaxy. Everything about the design is engineered to keep you from falling back on patterns universally applicable to the “typical” game. It’s this quality, more so than any other, that makes Master of Orion so consistently rewarding. If I were to be stranded on the proverbial desert island, I have a pretty good idea of at least one of the games I’d choose to take with me.

I’ll return momentarily to the question of just how Master of Orion manages to build so much variation into a fairly simple set of core rules. I think it might be instructive to do so, however, in comparison with another game, one I’ve already had occasion to mention several times in this article: Civilization.

As I’m so often at pains to point out, game design is, like any creative pursuit, a form of public dialog. Certainly Civilization itself comes with a long list of antecedents, including most notably Walter Bright’s mainframe game Empire, Dani Bunten Berry’s PC game Seven Cities of Gold, and the Avalon Hill board game with which Civilization shares its name. Likewise, Civilization has its progeny, among them Master of Orion. By no means was it the sole influence on the latter; as we’ve seen, Master of Orion was also greatly influenced by the 4X space-opera tradition in board games, especially during its early phases of development.

Still, the mark of Civilization as well can be seen all over its finished design. (After all, Alan Emrich had just literally written the book on Civilization when he started bombarding Barcia with design suggestions…) For example, Master of Orion, unlike all of its space-opera predecessors, on the computer or otherwise, doesn’t bother at all with multiplayer options, preferring to optimize the single-player experience in their stead. One can’t help but feel that it was Civilization, which was likewise bereft of the multiplayer options that earlier grand-strategy games had always included as a matter of course, that empowered Steve Barcia and company to go this way.

At the same time, though, we cannot say that Jeff Johannigman was being particularly accurate when he took to calling Master of Orion “Civilization in space” for the benefit of journalists. For all that it’s easy enough to understand what made such shorthand so tempting — this new project too was a grand-strategy game played on a huge scale, incorporating technology, economics, diplomacy, and military conflict — it wasn’t ultimately fair to either game. Master of Orion is very much its own thing. Its interface, for example, is completely different. (Ironically, Barcia’s follow-up to Master of Orion, the fantasy 4X Master of Magic, hews much closer to Civilization in that respect.) In Master of Orion, Civilization‘s influence often runs as much in a negative as a positive direction; that is to say, there are places where the later design is lifting ideas from the earlier one, but also taking it upon itself to correct perceived weaknesses in their implementation.

I have to use the qualifier “perceived” there because the two games have such different personalities. Simply put, Civilization prioritizes its fictional context over its actual mechanics, while Master of Orion does just the opposite. Together they illustrate the flexibility of the interactive digital medium, showing how great games can be great in such markedly different ways, even when they’re as closely linked in terms of genre as these two are.

Civilization explicitly bills itself as a grand journey through human history, from the time in our distant past when the first hunter-gatherers settled down in villages to an optimistic near-future in space. The rules underpinning the journey are loosey-goosey, full of potential exploits. The most infamous of these is undoubtedly the barbarian-horde strategy, in which you research only a few minimal technologies necessary for war-making and never attempt to evolve your society or participate in any meaningful diplomacy thereafter, but merely flood the world with miserable hardscrabble cities supporting primitive armies, attacking everything that moves until every other civilization is extinct. At the lower and moderate difficulty levels at least, this strategy works every single time, albeit whilst bypassing most of what the game was meant to be about. As put by Ralph Betza, a contributor to an early Civilization strategy guide posted to Usenet: “You can always play Despotic Conquest, regardless of the world you find yourself starting with, and you can always win without using any of the many ways to cheat. When you choose any other strategy, you are deliberately risking a loss in order to make the game more interesting.”

So very much in Civilization is of limited utility at best in purely mechanical terms. Many or most of the much-vaunted Wonders of the World, for example, really aren’t worth the cost you have to pay for them. But that’s okay; you pay for them anyway because you like the idea of having built the Pyramids of Giza or the Globe Theatre or Project Apollo, just as you choose not to go all Genghis Khan on the world because you’d rather build a civilization you can feel proud of. Perhaps the clearest statement of Civilization‘s guiding design philosophy can be found in the manual. It says that, even if you make it all the way to the end of the game only to see one of your rivals achieve the ultimate goal of mounting an expedition to Alpha Centauri before you do, “the successful direction of your civilization through the centuries is an achievement. You have survived countless wars, the pollution of the industrial age, and the risks of nuclear weapons.” Or, as Sid Meier himself puts it, “a game of Civilization is an epic story.”

We’re happy to preach peace and cooperation, as long as we’re the top dogs… er, birds.

Such sentiments are deeply foreign to Master of Orion; this is a zero-sum game if ever there was one. If you lose the final Galactic Council vote, there’s no attaboy for getting this far, much less any consolation delivered that the galaxy has entered a new era of peaceful cooperation with some other race in the leadership role. Instead the closing cinematic tells you that you’ve left the known galaxy and “set forth to conquer new worlds, vowing to return and claim the renowned title of Master of Orion.” (Better to rule in Hell, right?) There are no Wonders of the World in Master of Orion, and, while there is a tech tree to work through, you won’t find on it any of Civilization‘s more humanistic advances, such as Chivalry or Mysticism, or even Communism or The Corporation. What you get instead are technologies — it’s telling that Master of Orion talks about a “tech tree,” while Civilization prefers the word “advances” — with a direct practical application to settling worlds and making war, divided into the STEM-centric categories of Computers, Construction, Force Fields, Planetology, Propulsion, and Weapons.

So, Civilization is the more idealistic, more educational, perhaps even the nobler of the two games. And yet it often plays a little awkwardly — which awkwardness we forgive because of its aspirational qualities. Master of Orion‘s fictional context is a much thinner veneer to stretch over its mechanics, while words like “idealistic” simply don’t exist in its vocabulary. And yet, being without any high-flown themes to fall back on, it makes sure that its mechanics are absolutely tight. These dichotomies can create a dilemma for a critic like yours truly. If you asked me which game presents a better argument for gaming writ large as a potentially uplifting, ennobling pursuit, I know which of the two I’d have to point to. But then, when I’m just looking for a fun, challenging, intriguing game to play… well, let’s just say that I’ve played a lot more Master of Orion than Civilization over the last quarter-century. Indeed, Master of Orion can easily be read as the work of a designer who looked at Civilization and was unimpressed with its touchy-feely side, then set out to make a game that fixed all the other failings which that side obscured.

By way of a first example, let’s consider the two games’ implementation of an advances chart — or a tech tree, whichever you prefer. Arguably the most transformative single advance in Civilization is Railroads; they let you move your military units between your cities almost instantaneously, which makes attacks much easier and quicker to mount for warlike players and enables the more peaceful types to protect their holdings with a much smaller (and thus less expensive) standing army. The Railroads advance is so pivotal that some players build their entire strategy around acquiring it as soon as possible, by finding it on the advances chart as soon as the game begins in 4000 BC and working their way backward to find the absolute shortest path for reaching it. This is obviously problematic from a storytelling standpoint; it’s not as if the earliest villagers set about learning the craft of Pottery with an eye toward getting their hands on Railroads 6000 years later. More importantly, though, it’s damaging to the longevity of the game itself, in that it means that players can and will always employ that same Railroads strategy just as soon as they figure out what a winner it is. Here we stumble over one of the subtler but nonetheless significant axioms of game design: if you give players a hammer that works on every nail, many or most of them will use it — and only it — over and over again, even if it winds up decreasing their overall enjoyment. It’s for this reason that some players continue to use even the barbarian-horde strategy in Civilization, boring though it is. Or, to take an outside example: how many designers of CRPGs have lovingly crafted dozens of spells with their own unique advantages and disadvantages, only to watch players burn up everything they encounter with a trusty Fireball?

Master of Orion, on the other hand, works hard at every turn to make such one-size-fits-all strategies impossible — and nowhere more so than in its tech tree. When a new game begins, each race is given a randomized selection of technologies that are possible for it to research, constituting only about half of the total number of technologies in the game. Thus, while a technology roughly equivalent to Civilization‘s Railroads does exist in Master of Orion — Star Gates — you don’t know if this or any other technology is actually available to you until you advance far enough up the tree to reach the spot where it ought to be. You can’t base your entire strategy around a predictable technology progression. While you can acquire technologies that didn’t make it into your tree by trading with other empires, bullying them into giving them to you, or attacking their planets and taking them, that’s a much more fraught, uncertain path to go down than doing the research yourself, one that requires a fair amount of seat-of-your-pants strategy in its own right. Any way you slice it, in other words, you have to improvise.
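The consequences of that randomization are easy to see in a toy sketch. Everything below — the technology names, the one-half fraction, the sampling method — is my own illustrative assumption rather than Master of Orion's actual data or algorithm; the point is simply that any strategy premised on one particular technology being available can fail before the first turn is even played:

```python
import random

# Hypothetical sketch of a randomized tech tree: each race receives
# roughly half of the full technology list at game setup, so no fixed
# research path (say, a beeline for Star Gates) is guaranteed to exist.

FULL_TREE = [
    "Hydrogen Fuel Cells", "Deuterium Fuel Cells", "Star Gates",
    "Class I Deflector Shields", "Laser", "Nuclear Engines",
    "Improved Eco Restoration", "Battle Computer Mark I",
]

def roll_race_tree(full_tree, fraction=0.5, rng=random):
    """Return a randomized subset of the tree for one race."""
    count = max(1, round(len(full_tree) * fraction))
    return set(rng.sample(full_tree, count))

race_tree = roll_race_tree(FULL_TREE)
# Roughly half the time (under these toy numbers), a strategy built
# around this one technology is simply off the table:
star_gates_available = "Star Gates" in race_tree
```

A player who wants Star Gates but rolls a tree without them has to improvise — trade, espionage, or conquest — which is exactly the dynamic the paragraph above describes.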

We’ve been lucky here in that Hydrogen Fuel Cells, the first range-extending technology and a fairly cheap one, is available in our tree. If it wasn’t, and if we didn’t have a lot of stars conveniently close by, we’d have to dedicate our entire empire to attaining a more advanced and thus more expensive range-extending technology, lest we be left behind in the initial land grab. But this would of course mean neglecting other aspects of our empire’s development. Trade-offs like this are a constant fact of life in Master of Orion.

This one clever design choice has repercussions for every other aspect of the game. Take, for instance, the endlessly fascinating game-within-a-game of designing your fleet of starships. If the tech tree was static, players would inevitably settle upon a small set of go-to designs that worked for their style of play. As it is, though, every new ship is a fresh balancing act, its equipment calibrated to maximize your side’s technological strengths and mitigate its weaknesses, while also taking into careful account the strengths and weaknesses of the foe you expect to use it against, about which you’ve hopefully been compiling information through your espionage network. Do you build a huge number of tiny, fast, maneuverable fighters, or do you build just a few lumbering galactic dreadnoughts? Or do you build something in between? There are no universally correct answers, just sets of changing circumstances.

Another source of dynamism is the alien races you play and those you play against. The cultures in Civilization have no intrinsic strengths and weaknesses, just sets of leader tendencies when played by the computer; for your part, you're free to play the Mongols as pacifists, or for that matter the Russians as paragons of liberal democracy and global cooperation. But in Master of Orion, each race's unique affordances force you to play it differently. Likewise, each opposing race's affordances in combination with those of your own force you to respond differently to that race when you encounter it, whether on the other side of a diplomats' table or on a battlefield in space. Further, most races have one technology they're unusually good at researching and one they're unusually bad at. Throw in varying degrees of affinity and prejudice toward the other races, and, again, you've got an enormous amount of variation which defies cookie-cutter strategizing. (It's worth noting that there's a great deal of asymmetry here; Steve Barcia and his helpers didn't share so many modern designers' obsession with symmetrical play balance above all else. Some races are clearly more powerful than others: the brainiac Psilons get a huge research bonus, the insectoid Klackons get a huge bonus in worker productivity, and the Humans get huge bonuses in trade and diplomacy. Meanwhile the avian Alkaris, the feline Mrrshan, and the ursine Bulrathis have bonuses which only apply during combat, and can be overcome fairly easily by races with other, more all-encompassing advantages.)

There are yet more touches to bring yet more dynamism. Random events occur from time to time in the galaxy, some of which can change everything at a stroke: a gigantic space amoeba might show up and start eating stars, forcing everyone to forget their petty squabbles for a while and band together against this apocalyptic threat. And then there's the mysterious star Orion, from which the game takes its name, which houses the wonders of a long-dead alien culture from the mythical past. Taking possession of it might just win the game for you — but first you'll have to defeat its almost inconceivably powerful Guardian.

One of the perennial problems of 4X games, Civilization among them, is the long anticlimax, which begins at that point when you know you’re going to conquer the world or be the first to blast off for Alpha Centauri, but well before you actually do so. (What Civilization player isn’t familiar with the delights of scouring the map for that one remaining rival city tucked away on some forgotten island in some forgotten corner?) Here too Master of Orion comes with a mitigating idea, in the form of the Galactic Council whose workings I’ve already described. It means that, as soon as you can collect two-thirds of the vote — whether through wily diplomacy or the simpler expedient of conquering until two-thirds of the galaxy’s population is your own — the game ends and you get your victory screen.

Indeed, one of the overarching design themes of Master of Orion is its determination to minimize the boring stuff. It must be admitted, of course, that boredom is in the eye of the beholder. Non-fans have occasionally dismissed the whole 4X space-opera sub-genre as "Microsoft Excel in space," and Master of Orion too requires a level of comfort with — or, better yet, a degree of fascination with — numbers and ratios; you'll spend at least as much time tinkering with your economy as you will engaging in space battles. Yet the game does everything it can to minimize the pain here as well. While hardly a simple game in absolute terms, it is quite a streamlined example of its type; certainly it's much less fiddly than Civilization. Planet management is abstracted into a set of five sliding ratio bars, allowing you to decide what percentage of that planet's total output should be devoted to building ships, building defensive installations, building industrial infrastructure, cleaning up pollution, and researching new technologies. Unlike in Civilization, there is no list of specialized structures to build one at a time, much less a need to laboriously develop the land square by square with a specialized unit. Some degree of micro-management is always going to be in the nature of this type of game, but managing dozens of planets in Master of Orion is far less painful than managing dozens of cities in Civilization.

The research screen as well operates through sliding ratio bars which let you decide how much effort to devote to each of six categories of technology. In other words, you’re almost always researching multiple advances at once in Master of Orion, whereas in Civilization you only research one at a time. Further, you can never predict for sure when a technology will arrive; while each has a base cost in research points, “paying” it leads only to a slowly increasing randomized chance of acquiring the technology on any given turn. (That’s the meaning of the “17%” next to Force Fields in the screenshot above.) You also receive bonuses for maintaining steady research over a long run of turns, rather than throwing all of your research points into one technology, then into something else, etc. All of this as well serves to make the game more unpredictable and dynamic.
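As a rough illustration of that mechanic, the sketch below models a technology whose discovery chance is zero until its base cost has been paid in research points, then climbs with continued investment. The exact formula is my own guess for demonstration purposes, not the game's real one:

```python
import random

# Hypothetical model of pay-then-roll research: after the base cost is
# met, each further turn of investment raises the odds of a breakthrough.

def discovery_chance(points_invested, base_cost):
    """Chance of discovery this turn: 0 until the base cost is met, then rising."""
    if points_invested < base_cost:
        return 0.0
    # Toy assumption: every extra 10% invested beyond the base cost
    # adds ten percentage points, capping at certainty.
    return min(1.0, (points_invested - base_cost) / base_cost)

def turns_until_discovery(points_per_turn, base_cost, rng=random):
    """Simulate turns until the randomized breakthrough finally arrives."""
    invested, turns = 0, 0
    while True:
        invested += points_per_turn
        turns += 1
        if rng.random() < discovery_chance(invested, base_cost):
            return turns
```

Running `turns_until_discovery(50, 400)` repeatedly gives a different answer each time — which is exactly why you can never predict the arrival turn of a technology, only nudge the odds.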

In short, Master of Orion tries really, really hard to work with you rather than against you, and succeeds to such a degree that it can sometimes feel like the game is reading your mind. A reductionist critic of the sort I can be on occasion might say that there are just two types of games: those that actually got played before their release and those that didn’t. With only rare exceptions, this distinction, more so than the intrinsic brilliance of the design team or any other factor, is the best predictor of the quality of the end result. Master of Orion is clearly a game that got played, and played extensively, with all of the feedback thus gathered being incorporated into the final design. The interface is about as perfect as the technical limitations of 1993 allow it to be; nothing you can possibly want to do is more than two clicks away. And the game is replete with subtle little conveniences that you only come to appreciate with time — like, just to take one example, the way it asks if you want to automatically adjust the ecology spending on every one of your planets when you acquire a more efficient environmental-cleanup technology. This lived-in quality can only be acquired the honest, old-fashioned way: by giving your game to actual players and then listening to what they tell you about it, whether the points they bring up are big or small, game-breaking or trivial.

This thoroughgoing commitment to quality is made all the more remarkable by our knowledge of circumstances inside MicroProse while Master of Orion was going through these critical final phases of its development. When the contract to publish the game was signed, MicroProse was in desperate financial straits, having lost bundles on an ill-advised standup-arcade game along with expensive forays into adventure games and CRPGs, genres far from their traditional bread and butter of military simulations and grand-strategy games. Although other projects suffered badly from the chaos, Master of Orion, perhaps because it was a rather low-priority project entrusted largely to an outside team located over a thousand miles away, was given the time and space to become its best self. It was still a work in progress on June 21, 1993, when MicroProse’s mercurial, ofttimes erratic founder and CEO “Wild Bill” Stealey sold the company to Spectrum Holobyte, a publisher with a relatively small portfolio of extant games but a big roll of venture capital behind them.

Master of Orion thus became one of the first releases from the newly conjoined entity on October 1, 1993. Helped along by the evangelism of Alan Emrich and his pals at Computer Gaming World, it did about as well as such a cerebral title, almost completely bereft of audiovisual bells and whistles, could possibly do in the new age of multimedia computing; it became the biggest strategy hit since Civilization, and the biggest 4X space opera to that point, in any medium. Later computerized iterations on the concept, including its own sequels, doubtless sold more copies in absolute numbers, but the original Master of Orion has gone on to become one of the truly seminal titles in gaming history, almost as much so as the original Civilization. It remains the game to which every new 4X space opera — and there have been many of them, far more than have tried to capture the more elusively idealistic appeal of Civilization — must be compared.

Sometimes a status such as that enjoyed by Master of Orion arrives thanks to an historical accident or a mere flashy technical innovation, but that is definitively not the case here. Master of Orion remains as rewarding as ever in all its near-infinite variation. Personally, I like to embrace its dynamic spirit for everything it’s worth by throwing a (virtual) die to set up a new game, letting the Universe decide what size galaxy I play in, how many rivals I play with, and which race I play myself. The end result never fails to be enjoyable, whether it winds up a desperate free-for-all between six alien civilizations compressed into a tiny galaxy with just 24 stars, or a wide-open, stately game of peaceful exploration in a galaxy with over 100 of them. In short, Master of Orion is the most inexhaustible well of entertainment I’ve ever found in the form of a single computer game — a timeless classic that never fails to punish you for playing lazy, but never fails to reward you for playing well. I’ve been pulling it out to try to conquer another random galaxy at least once every year or two for half my life already. I suspect I’ll still be doing so until the day I die.

(Sources: the books Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay, Designers & Dragons, Volume 1: The 1970s by Shannon Appelcline, and Master of Orion: The Official Strategy Guide by Alan Emrich and Tom E. Hughes, Jr.; Computer Gaming World of December 1983, June/July 1985, October 1991, June 1993, August 1993, September 1993, December 1993, and October 1995; Commodore Disk User of May 1988; Softline of March 1983. Online sources include "Per Aspera Ad Astra" by Jon Peterson from ROMchip, Alan Emrich's historical notes from the old Master of Orion III site, a Steve Barcia video interview which originally appeared in the CD-ROM magazine Interactive Entertainment, and the Civilization Usenet FAQ, last updated by "Dave" in 1994.

Master of Orion I and II are available for purchase together from GOG.com. I highly recommend a tutorial, compiled many years ago by Sirian and now available only via archive.org, as an excellent way for new players to learn the ropes.)

 
 


Darklands

Darklands may well have been the most original single CRPG of the 1990s, but its box art was planted firmly in the tacky CRPG tradition. I’m not sure that anyone in Medieval Germany really looked much like these two…

Throughout the 1980s and well into the 1990s, the genres of the adventure game and the CRPG tended to blend together, in magazine columns as well as in the minds of ordinary gamers. I thus considered it an early point of order for this history project to attempt to identify the precise differences between the genres. Rather than addressing typical surface attributes — a CRPG, many a gamer has said over the years, is an adventure game where you also have to kill monsters — I tried to peek under the hood and identify what really makes the two genres tick. At bottom, I decided, the difference was one of design philosophy. The adventure game focuses on set-piece, handcrafted puzzles and other unique interactions, simulating the world that houses them only to the degree that is absolutely necessary. (This latter is especially true of the point-and-click graphic adventures that came to dominate the field after the 1980s; indeed, throughout gaming history, the trend in adventure games has been to become less rather than more ambitious in terms of simulation.) The CRPG, meanwhile, goes in much more for simulation, to a large degree replacing set-piece behaviors with systems of rules which give scope for truly emergent experiences that were never hard-coded into the design.

Another clear difference between the two genres, however, is in the scope of their fictions’ ambitions. Since the earliest days of Crowther and Woods and Scott Adams, adventure games have roamed widely across the spectrum of storytelling; Infocom alone during the 1980s hit on most of the viable modern literary genres, from the obvious (fantasy, science fiction) to the slightly less obvious (mysteries, thrillers) to the downright surprising (romance novels, social satires). CRPGs, on the other hand, have been plowing more or less the same small plot of fictional territory for decades. How many times now have groups of stalwart men and ladies set forth to conquer the evil wizard? While we do get the occasional foray into science fiction — usually awkwardly hammered into a frame of gameplay conventions more naturally suited to heroic fantasy — it’s for the most part been J.R.R. Tolkien and Dungeons & Dragons, over and over and over again.

This seeming lack of adventurousness (excuse the pun!) among CRPG designers raises some interesting questions. Can the simulation-oriented approach only be made to work within a strictly circumscribed subset of possible virtual worlds? Or is the lack of variety in CRPGs down to a simple lack of trying? An affirmative answer to the latter question is suggested by Origin Systems's two rather wonderful Worlds of Ultima games of the early 1990s, which retained the game engine from the more traditional fantasy CRPG Ultima VI but moved it into settings inspired by the classic adventure tales of Arthur Conan Doyle and H.G. Wells. Sadly, though, Origin's customers seemed not to know what to make of Ultima games not taking place in a Renaissance Faire world, and both were dismal commercial failures — thus providing CRPG makers with a strong external motivation to stick with high fantasy, whatever the abstract limits of the applicability of the CRPG formula to fiction might be.

Our subject for today — Darklands, the first CRPG ever released by MicroProse Software — might be described as the rebuttal to the case made by the Worlds of Ultima games, in that its failings point to some of the intrinsic limits of the simulation-oriented approach. Then again, maybe not; today, perhaps even more so than when it was new, this is a game with a hardcore fan base who love it with a passion, even as other players, like the one who happens to be writing this article, see it as rather collapsing under the weight of its ambition and complexity. Whatever your final verdict on it, it’s undeniable that Darklands is overflowing with original ideas for a genre which, even by the game’s release year of 1992, had long since settled into a set of established expectations. By upending so many of them, it became one of the most intriguing CRPGs ever made.



Darklands was the brainchild of Arnold Hendrick, a veteran board-game, wargame, tabletop-RPG, and console-videogame designer who joined MicroProse in 1985, when it was still known strictly as a maker of military simulations. As the first MicroProse employee hired only for a design role — he had no programming or other technical experience whatsoever — he began to place his stamp on the company’s products immediately. It was Hendrick who first had the germ of an idea that Sid Meier, MicroProse’s star programmer/designer, turned into Pirates!, the first MicroProse game to depart notably from the company’s established formula. In addition to Pirates!, for which he continued to serve as a scenario designer and historical consultant even after turning the lead-designer reins over to Meier, Hendrick worked on other games whose feet were more firmly planted in MicroProse’s wheelhouse: titles like Gunship, Project Stealth Fighter, Red Storm Rising, M1 Tank Platoon, and Silent Service II.

“Wild” Bill Stealey, the flamboyant head of MicroProse, had no interest whatsoever in any game that wasn’t a military flight simulator. Still, he liked making money even more than he liked flying virtual aircraft, and by 1990 he wasn’t sure how much more he could grow his company if it continued to make almost nothing but military simulations and the occasional strategic wargame. Meanwhile he had Pirates! and Railroad Tycoon, the latter being Sid Meier’s latest departure from military games, to look at as examples of how successful non-traditional MicroProse games could be. Not knowing enough about other game genres to know what else might be a good bet for his company, he threw the question up to his creative and technical staff: “Okay, programmers, give me what you want to do, and tell me how much money you want to spend. We’ll find a way to sell it.”

And so Hendrick came forward with a proposal to make a CRPG called Darklands, to be set in the Germany of the 15th century, a time and place of dark forests and musty monasteries, Walpurgis Night and witch covens. It could become, Hendrick said, the first of a whole new series of historical CRPGs that, even as they provided MicroProse with an entrée into one of the most popular genres out there, would also leverage their reputation for making games with roots in the real world.

The typical CRPG, then as now, took place in a version of Medieval times that had only ever existed in the imagination of a modern person raised on Tolkien and Dungeons & Dragons. It ignored how appallingly miserable and dull life was for the vast majority of people who lived through the historical reality of the Middle Ages, with its plagues, wars, filth, hard labor, and nearly universal illiteracy. Although he was a dedicated student of history, with a university degree in the field, Hendrick too was smart enough to realize that there wasn’t much of a game to be had by hewing overly close to this mundane historical reality. But what if, instead of portraying a Medieval world as his own contemporaries liked to imagine it to have been, he conjured up the world of the Middle Ages as the people who had lived in it had imagined it to be? God and his many saints would take an active role in everyday affairs, monsters and devils would roam the forests, alchemy would really work, and those suspicious-looking folks who lived in the next village really would be enacting unspeakable rituals in the name of Satan every night. “This is an era before logic or science,” Hendrick wrote, “a time when anything is possible. In short, if Medieval Germans believed something to be true, in Darklands it might actually be true.”

He wanted to incorporate an interwoven tapestry of Medieval imagination and reality into Darklands: a magic system based on Medieval theories about alchemy; a pantheon of real saints to pray to, each able to grant her own special favors; a complete, lovingly detailed map of 15th-century Germany and lands adjacent, over which you could wander at will; hundreds of little textual vignettes oozing with the flavor of the Middle Ages. To make it all go, he devised a set of systems the likes of which had never been seen in a CRPG, beginning with a real-time combat engine that let you pause at any time to issue orders; its degree of simulation would be so deep that it would include penetration values for various weapons against various materials (thus ensuring that a vagabond with a rusty knife could never, ever kill a full-fledged knight in shining armor). The character-creation system would be so detailed as to practically become a little game in itself, asking you not so much to roll up each character as live out the life story that brought her to this point: bloodline, occupations, education (such as it was for most in the Middle Ages), etc.

Character creation in Darklands is really, really complicated. And throughout the game, the spidery font superimposed on brown-sauce backgrounds will make your eyes bleed.

All told, it was one heck of a proposition for a company that had never made a CRPG before. Had Stealey been interested enough in CRPGs to realize just how unique the idea was, he might have realized as well how doubtful its commercial prospects were in a market that seemed to have little appetite for any CRPG that didn’t hew more or less slavishly to the Dungeons & Dragons archetype. But Stealey didn’t realize, and so Darklands got the green light in mid-1990. What followed was a tortuous odyssey; it became the most protracted and expensive development project MicroProse had ever funded.

We’ve seen in some of my other recent articles how companies like Sierra and Origin, taking stock of escalating complexity in gameplay and audiovisuals and their inevitable companion of escalating budgets, began to systematize the process of game development around this time. And we’ve at least glimpsed as well how such systematization could be a double-edged sword, leading to creatively unsatisfied team members and final products with something of a cookie-cutter feel.

MicroProse, suffice to say, didn’t go that route. Stealey took a hands-off approach to all projects apart from his beloved flight simulators, allowing his people to freelance their way through them. For all the drawbacks of rigid hierarchies and strict methodologies, the Darklands project could have used an injection of exactly those things. It was plagued by poor communication and outright confusion from beginning to end, as Arnold Hendrick and his colleagues improvised like mad in the process of making a game that was like nothing any of them had ever tried to make before.

Hendrick today forthrightly acknowledges that his own performance as project leader was "terrible." Too often, the right hand didn't know what the left was doing. An example cited by Hendrick involves Jim Synoski, the team's first and most important programmer. For some months at the beginning of the project, he believed he was making essentially a real-time fighting game; while that was in fact some of what Darklands was about, it was far from the sum total of the experience. Once made aware at last that his combat code would need to interact with many other modules, he managed to hack the whole mess together, but it certainly wasn't pretty. It seems there wasn't so much as a design document for the team to work from — just a bunch of ideas in Hendrick's head, imperfectly conveyed to everyone else.

The first advertisement for Darklands appeared in the March 1991 issue of Computer Gaming World. The actual product wouldn’t materialize until eighteen months later.

It’s small wonder, then, that Darklands went so awesomely over time and over budget; the fact that MicroProse never cancelled it likely owes as much to the sunk-cost fallacy as anything else. Hendrick claims that the game cost as much as $3 million to make in the end — a flabbergasting number that, if correct, would easily give it the crown of most expensive computer game ever made at the time of its release. Indeed, even a $2 million price tag, the figure typically cited by Stealey, would also qualify it for that honor. (By way of perspective, consider that Origin Systems’s epic CRPG Ultima VII shipped the same year as Darklands with an estimated price tag of $1 million.)

All of this was happening at the worst possible time for MicroProse. Another of Stealey’s efforts to expand the company’s market share had been an ill-advised standup-arcade version of F-15 Strike Eagle, MicroProse’s first big hit. The result, full of expensive state-of-the-art graphics hardware, was far too complex for the quarter-eater market; it flopped dismally, costing MicroProse a bundle. Even as that investment was going up in smoke, Stealey, acting again purely on the basis of his creative staff’s fondest wishes, agreed to challenge the likes of Sierra by making a line of point-and-click graphic adventures. Those products too would go dramatically over time and over budget.

Stealey tried to finance these latest products by floating an initial public offering in October of 1991. By June of 1992, on the heels of an announcement that not just Darklands but three other major titles as well would not ship that quarter — more fruit of Stealey’s laissez-faire philosophy of game development — the stock tumbled to almost 25 percent below its initial price. A stench of doom was beginning to surround the company, despite such recent successes as Civilization.

Games, like most creative productions, generally mirror the circumstances of their creation. This fact doesn’t bode well for Darklands, a project which started in chaos and ended, two years later, in a panicked save-the-company scramble.


Pirates!

Darklands

If you squint hard enough at Darklands, you can see its roots in Pirates!, the first classic Arnold Hendrick helped to create at MicroProse. As in that game, Darklands juxtaposes menu-driven in-town activities, written in an embodied narrative style, with more free-form wanderings over the territories that lie between the towns. But, in place of the straightforward menu of six choices in Pirates!, your time in the towns of Darklands becomes a veritable maze of twisty little passages; you start the game in an inn, but from there can visit a side street or a main street, which in turn can lead you to the wharves or the market, dark alleys or a park, all with yet more things to see and do. Because all of these options are constantly looping back upon one another — it’s seldom clear if the side street from this menu is the same side street you just visited from that other menu — just trying to buy some gear for your party can be a baffling undertaking for the beginner.

Thus, in spite of the superficial interface similarities, we see two radically opposing approaches to game design in Pirates! and Darklands. The older game emphasizes simplicity and accessibility, being only as complex as it needs to be to support the fictional experience it wants to deliver. But Darklands, for its part, piles on layer after layer of baroque detail with gleeful abandon. One might say that here the complexity is the challenge; learning to play the entirety of Darklands at all requires at least as much time and effort as getting really, truly good at a game like Pirates!.

The design dialog we see taking place here has been with us for a long time. Dave Arneson and Gary Gygax, the co-creators of the first incarnation of tabletop Dungeons & Dragons, parted ways not long afterward thanks largely to a philosophical disagreement about how their creation should evolve. Arneson saw the game as a fairly minimalist framework to enable a shared storytelling session, while Gygax saw it as something more akin to the complex wargames on which he’d cut his teeth. Gygax, who would go on to write hundreds of pages of fiddly rules for Advanced Dungeons & Dragons, his magnum opus, was happily cataloging and quantifying every variant of pole arm used in Medieval times when an exasperated Arneson finally lost his cool: “It’s a pointy thing on the end of a stick!” Your appreciation for Darklands must hinge on whether you are a Gary Gygax or a Dave Arneson in spirit. I know to which camp I belong; while there is a subset of gamers who truly enjoy Darklands‘s type of complexity — and more power to them for it — I must confess that I’m not among them.

In an interview conducted many years after the release of Darklands, Arnold Hendrick himself put his finger on what I consider to be its core problem: “Back then, game systems were often overly complicated, and attention to gameplay was often woefully lacking. These days, there’s a much better balance between gameplay and the human psychology of game players and the game systems underlying that gameplay.” Simply put, there are an awful lot of ideas in Darklands which foster complexity, but don’t foster what ought to be the ultimate arbiter in game design: fun. Modern designers often talk about an elusive sense of “flow” — a sense by the player that all of a game’s parts merge into a harmonious whole which makes playing for hours on end all too tempting. For this player at least, Darklands is the polar opposite of this ideal. Not only is it about as off-putting a game as I’ve ever seen at initial startup, but it continues always, even after a certain understanding has begun to dawn, to be a game of disparate parts: a character-generation game, a combat game, a Choose Your Own Adventure-style narrative, a game of alchemical crafting. There are enough original ideas here for ten games, but it never becomes clear why they absolutely, positively all need to be in this one. Darklands, in other words, is kind of a muddle.

Your motivation for adventuring in Medieval Germany in the first place is one of Darklands‘s original ideas in CRPG design. Inviting comparison once again with Pirates!, Darklands dispenses with any sort of overarching plot as a motivating force. Instead, like your intrepid corsair of the earlier game, your party of four has decided simply “to bring everlasting honor and glory to your names.” If you play for long enough, something of a larger plot will eventually begin to emerge, involving a Satan-worshiping cult and a citadel dedicated to the demon Baphomet, but even after rooting out the cult and destroying the citadel the game doesn’t end.

In place of an overarching plot, Darklands relies on incidents and anecdotes, from a wandering knight challenging you to a duel to a sinkhole that swallows up half your party. While these are the products of a human writer (presumably Arnold Hendrick for the most part), their placements in the world are randomized. To improve your party’s reputation and earn money, you undertake a variety of quests of the “take item A to person B” or “go kill monster C” variety. All of this too is procedurally generated. Indeed, you begin a new game of Darklands by choosing the menu option “Create a New World.” Although the geography of Medieval Germany won’t change from game to game, most of what you’ll find in and around the towns is unique to your particular created world. It all adds up to a game that could literally, as MicroProse’s marketers didn’t hesitate to declare, go on forever.

But, as all too commonly happens with these things, it’s a little less compelling in practice than it sounds in theory. I’ve gone on record a number of times now with my practical objections to generative narratives. Darklands too often falls prey to the problems that are so typical of the approach. The quests you pick up, lacking as they do any larger relationship to a plot or to the world, are the very definition of FedEx quests, bereft of any interest beyond the reputation and money they earn for you. And, while it can sometimes surprise you with an unexpectedly appropriate and evocative textual vignette, the game more commonly hews to the predictable here as well. Worse, it has a dismaying tendency to show you the same multiple-choice vignettes again and again, pulling you right out of the fiction.

And yet the vignettes are actually the most narratively interesting parts of the game; it will be some time before you begin to see them at all. As in so many other vintage CRPGs, the bulk of your time at the beginning of Darklands is spent doing boring things in the name of earning the right to eventually do less boring things. In this case, you’ll likely have to spend several hours roaming the vacant back streets of whatever town you happen to begin in, seeking out and killing anonymous bands of robbers, just to build up your party enough to leave the starting town.

The open-ended structure works for Pirates! because that game dispenses with this puritanical philosophy of design. It manages to be great fun from the first instant by keeping the pace fast and the details minimal, even as it puts a definite time limit on your career, thus tempting you to play again and again in order to improve on your best final score. Darklands, by contrast, doesn’t necessarily end even when your party is too old to adventure anymore (aging becomes a factor after about age thirty); you can just make new characters and continue where the old ones left off, in the same world with the same equipment, quests, and reputation. Darklands, then, ends only when you get tired of it. Just when that exact point arrives will doubtless differ markedly from player to player, but it’s guaranteed to be anticlimactic.

The ostensible point of Darklands‘s enormously complex systems of character creation, alchemy, religion, and combat is to evoke its chosen time and place as richly as possible. One might even say the same about its lack of an overarching epic plot; such a thing doesn’t exist in the books of history and legend to which the game is so determined to be so faithful. Yet I can’t help but feel that this approach — that of trying to convey the sense of a time and place through sheer detail — is fundamentally misguided. Michael Bate, a designer of several games for Accolade during the 1980s, coined the term “aesthetic simulations” for historical games that try to capture the spirit of their subject matter rather than every piddling detail. Pirates! is, yet again, a fine example of this approach, as is the graceful, period-infused but not period-heavy-handed writing of the 1992 adventure game The Lost Files of Sherlock Holmes.

The writing in Darklands falls somewhat below that standard. It isn’t terrible, but it is a bit graceless, trying to make up for in concrete detail what it isn’t quite able to conjure in atmosphere. So, we get money that is laboriously explicated in terms of individual pfenniges, groschen, and florins, times of day described in terms that a Medieval monk would understand (Matins, Lauds, Prime, etc.), and lots of off-putting-to-native-English-speakers German names, but little real sense of being in Medieval Germany.

Graphically as well, the game is… challenged. Having devoted most of their development efforts to 3D vehicular simulators during the 1980s, MicroProse’s art department plainly struggled to adapt to the demands of other genres. Even an unimpeachable classic like Sid Meier’s Civilization achieves its classic status despite rather than because of its art; visually, it’s a little garish compared to what other studios were putting out by this time. But Darklands is much more of a visual disaster, a conflicting mishmash of styles that sometimes manage to look okay in isolation, such as in the watercolor-style backgrounds to many of the textual vignettes. Just as often, though, it verges on the hideous; the opening movie is so absurdly amateurish that, according to industry legend, some people actually returned the game after seeing it, thinking they must have gotten a defective disk or had an incompatible video card.

One of Darklands‘s more evocative vignettes, with one of its better illustrations as a backdrop. Unfortunately, you’re likely to see this same vignette and illustration several times, with a decided sense of diminishing returns.

But undoubtedly the game’s biggest single problem, at the time of its release and to some extent still today, was all of the bugs. Even by the standards of an industry at large which was clearly struggling to come to terms with the process of making far more elaborate games than had been seen in the previous decade, Darklands stood out upon its belated release in August of 1992 for its woefully under-baked state. Whether this was despite or because of its extended development cycle remains a question for debate. What isn’t debatable, however, is that it was literally impossible to complete Darklands in its initial released state, and that, even more damningly, a financially pressured MicroProse knew this and released it anyway. To their credit, the Darklands team kept trying to fix the game after its release, with patch after patch to its rickety code base. The patches eventually numbered at least nine in all, a huge quantity for long-suffering gamers to acquire at a time when they could only be distributed on physical floppy disks or via pricey commercial online services like CompuServe. After about a year, the team managed to get the game into a state where it only occasionally did flaky things, although even today it remains far from completely bug-free.

By the time the game reached this reasonably stable state, however, the damage had been done. It sold fairly well in its first month or two, but then came a slew of negative reviews and an avalanche of returns that actually exceeded new sales for some time; Darklands thus managed the neat trick of continuing to be a drain on MicroProse’s precarious day-to-day finances even after it had finally been released. Hendrick had once imagined a whole line of similar historical CRPGs; needless to say, that didn’t happen.

Combined with the only slightly less disastrous failure of the new point-and-click graphic-adventure line, Darklands was directly responsible for the end of MicroProse as an independent entity. In December of 1993, with the company’s stock now at well under half of its IPO price and the creditors clamoring, a venture-capital firm arranged a deal whereby MicroProse was acquired by Spectrum Holobyte, known virtually exclusively for a truly odd pairing of products: the home-computer version of the casual game Tetris and the ultra-hardcore flight simulator Falcon. The topsy-turvy world of corporate finance being what it was, this happened despite the fact that MicroProse’s total annual sales were still several times that of Spectrum Holobyte.

Stealey, finding life unpleasant in a merged company where he was no longer top dog, quit six months later. His evaluation of the reasons for MicroProse’s collapse was incisive enough in its fashion:

You have to be known for something. We were known for two things [military simulators and grand-strategy games], but we tried to do more. I think that was a big mistake. I should have been smarter than that. I should have stuck with what we were good at.



I’ve been pretty hard on Darklands in this article, a stance for which I don’t quite feel a need to apologize; I consider it a part of my duty as your humble scribe to call ’em like I see ’em. Yet there is far more to Darklands‘s legacy than a disappointing game which bankrupted a company. Given how rare its spirit of innovation has been in CRPG design, plenty of players in the years since its commercial vanishing act have been willing to cut it a lot of slack, to work hard to enjoy it on its own terms. For reasons I’ve described at some length now, I can’t manage to join this group, but neither can I begrudge them their passion.

But then, Darklands has been polarizing its players from the very beginning. Shortly after the game’s release, Scorpia, Computer Gaming World magazine’s famously opinionated adventure-game columnist, wrote a notably harsh review of it, concluding that it “might have been one of the great ones” but instead “turns out to be a game more to be avoided than anything else.” Johnny L. Wilson, the magazine’s editor-in-chief, was so bothered by her verdict that he took the unusual step of publishing a sidebar response of his own. It became something of a template for future Darklands apologies by acknowledging the game’s obvious flaws yet insisting that its sheer uniqueness nevertheless made it worthwhile. (“The game is as repetitive as Scorpia and some of the game’s online critics have noted. One comes across some of the same encounters over and over. Yet only occasionally did I find this disconcerting.”) He noted as well that he personally hadn’t seen many of the bugs and random crashes which Scorpia had described in her review. Perhaps, he mused, his computer was just an “immaculate contraption” — or perhaps Scorpia’s was the opposite. In response to the sidebar, Wilson was castigated by his magazine’s readership, who apparently agreed with Scorpia much more than with him and considered him to have undermined his own acknowledged reviewer.

The reader response wasn’t the only interesting postscript to this episode. Wilson:

Later, after 72 hours of playing around with minor quests and avoiding the main plot line of Darklands, I decided it was time to finish the game. I had seven complete system crashes in less than an hour and a half once I decided to jump in and finish the game. I didn’t really have an immaculate contraption, I just hadn’t encountered the worst crashes because I hadn’t filled my upper memory with the system-critical details of the endgame. Scorpia hadn’t overreacted to the crashes. I just hadn’t seen how bad it was because I was fooling around with the game instead of trying to win. Since most players would be trying to win, Scorpia’s review was more valid than my sidebar. Ah, well, that probably isn’t the worst thing I’ve ever done when I thought I was being fair.

This anecdote reveals what may be a deciding factor — in addition to a tolerance for complexity for its own sake — as to whether one can enjoy Darklands or not. Wilson had been willing to simply inhabit its world, while the more goal-oriented Scorpia approached it as she would any other CRPG — i.e., as a game that she wanted to win. As a rather plot-focused, goal-oriented player myself, I naturally sympathize more with her point of view.

In the end, then, the question of where the point of failure lies in Darklands is one for the individual player to answer. Is Darklands as a whole a very specific sort of failure, a good idea that just wasn’t executed as well as it might have been? Or does the failure lie with the CRPG format itself, which this game stretched beyond the breaking point? Or does the real failure lie with the game’s first players, who weren’t willing to look past the bugs and other occasional infelicities to appreciate what could have been a whole new type of CRPG? I know where I stand, but my word is hardly the final one.

Given the game’s connection to the real world and its real cultures, so unusual to the CRPG genre, perhaps the most interesting question of all raised by Darklands is that of the appropriate limits of gamification. A decade before Darklands‘s release, the Dungeons & Dragons tabletop RPG was embroiled in a controversy engendered by God-fearing parents who feared it to be an instrument of Satanic indoctrination. In actuality, the creators of the game had been wise enough to steer well clear of any living Western belief system. (The Deities & Demigods source book did include living Native American, Chinese, Indian, and Japanese religions, which raises some troublesome questions of its own about cultural appropriation and respect, but wasn’t quite the same thing as what the angry Christian contingent was complaining about.)

It’s ironic to note that much of the content which Evangelical Christians believed to be present in Dungeons & Dragons actually is present in Darklands, including the Christian God and Satan and worshipers of both. Had Darklands become successful enough to attract the attention of the same groups who objected so strongly to Dungeons & Dragons, there would have been hell to pay. Arnold Hendrick had lived through the earlier controversy from an uncomfortably close vantage point, having been a working member of the tabletop-game industry at the time it all went down. In his designer’s notes in Darklands‘s manual, he thus took great pains to praise the modern “vigorous, healthy, and far more spiritual [Catholic] Church whose quiet role around the globe is more altruistic and beneficial than many imagine.” Likewise, he attempted to separate modern conceptions of Satanism and witchcraft from those of Medieval times. Still, the attempt to build a wall between the Christianity of the 15th century and that of today cannot be entirely successful; at the end of the day, we are dealing with the same religion, albeit in two very different historical contexts.

Opinions vary as to whether the universe in which we live is entirely mechanistic, reducible to the interactions of concrete, understandable, computable physical laws. But it is clear that a computer simulation of a world must be exactly such a thing. In short, a simulation leaves no room for the ineffable. And yet Darklands chooses to grapple, to an extent unrivaled by almost any other game I’m aware of, with those parts of human culture that depend upon a belief in the ineffable. By bringing Christianity into its world, it goes to a place virtually no other game has dared approach. Its vending-machine saints reduce a religion — a real, living human faith — to a game mechanic. Is this okay? Or are there areas of the human experience which ought not to be turned into banal computer code? The answer must be in the eye — and perhaps the faith — of the beholder.

Darklands‘s real-time-with-pause combat system. The interface here is something of a disaster, and the visuals too leave much to be desired, but the core idea is sound.

By my lights, Darklands is more of a collection of bold ideas than a coherent game, more of an experiment in the limits of CRPG design than a classic example of same. Still, in a genre which is so often in thrall to the tried and true, its willingness to experiment can only be applauded.

For sometimes experiments yield rich rewards, as the most obvious historical legacy of this poor-selling, obscure, bug-ridden game testifies. Ray Muzyka and Greg Zeschuk, the joint CEOs of Bioware at the time that studio made the Baldur’s Gate series of CRPGs, have acknowledged lifting the real-time-with-pause combat systems in those huge-selling and much-loved games directly out of Darklands. Since the Baldur’s Gate series’s heyday around the turn of the millennium, dozens if not hundreds of other CRPGs have borrowed the same system second-hand from Bioware. Such is the way that innovation diffuses itself through the culture of game design. So, the next time you fire up a Steam-hosted extravaganza like Pillars of Eternity, know that part of the game you’re playing owes its existence to Darklands. Lumpy and imperfect though it is in so many ways, we could use more of its spirit of bold innovation today — in CRPG design and, indeed, across the entire landscape of interactive entertainment.

(Sources: the book Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay; Computer Gaming World of March 1991, February 1992, May 1992, September 1992, December 1992, January 1993, and June 1994; Commodore Magazine of September 1987; Questbusters of November 1992; Compute! of October 1993; PC Zone of September 2001; Origin Systems’s internal newsletter Point of Origin of January 17 1992; New York Times of June 13 1993. Online sources include Matt Barton’s interview with Arnold Hendrick, Just Adventure‘s interview with Johnny L. Wilson, and Arnold Hendrick’s discussion of Darklands in the Steam forum.

Darklands is available for purchase on GOG.com.)

 
 


The Designer’s Designer

Dan Bunten delivers the keynote at the 1990 Game Developers Conference.

Dan Bunten and his little company Ozark Softscape could look back on a tremendous 1984 as that year came to an end. Seven Cities of Gold had been a huge success, Electronic Arts’s biggest game of the year, doing much to keep the struggling publisher out of bankruptcy court by selling well over 100,000 copies. Bunten himself had become one of the most sought-after interviewees in the industry. Everyone who got the chance to speak with him seemed to agree that Seven Cities of Gold was only the beginning, that he was destined for even greater success.

As it turned out, though, 1984 would be the high-water mark for Bunten, at least in terms of that grubbiest but most implacable metric of success in games: quantity of units shifted. The years that followed would be frustrating as often as they would be inspiring, as Bunten pursued a vision that seemed at odds with every trend in the industry, all the while trying to thread the needle between artistic fulfillment and commercial considerations.


In the wake of Seven Cities of Gold‘s success, EA badly wanted a follow-up with a similar theme, so much so that they offered Bunten a personal bonus of $5000 to make it Ozark’s next project. The result was Heart of Africa, a game which at first glance looks like precisely the sequel EA was asking for but that actually plays quite differently. Instead of exploring the Americas as Hernán Cortés during the 1500s, it has you exploring Africa as an intrepid Victorian adventurer (“Livingstone, I presume?”). In keeping with the changed time and location, your goal isn’t to conquer the land for your country — Africa had, for better or for worse, already been thoroughly partitioned among the European nations by 1890, the year in which the game takes place — but simply to discover and to map. In the best tradition of Victorian adventure novels like King Solomon’s Mines, your ultimate goal is to find the tomb of a mythical Egyptian pharaoh. Bunten later admitted that the differences from Heart of Africa‘s predecessor weren’t so much a product of original design intent as improvisation after he had bumbled into an historical context that just wouldn’t work as a more faithful sequel.

Indeed, Bunten in later years dismissed Heart of Africa, his most adventure-like game ever and his last ever that was single-player only, as nothing more than “a game done to please EA”: “I honestly didn’t want to do the project.” Its biggest problem hinges on the fact that its environment is randomly generated each time you start a new game, itself an attempt to remedy the most obvious failing of adventure games as a commercial proposition: their lack of replayability. Yet the random maps can never live up to what a hand-crafted map, designed for challenge and dramatic effect, might have been; the “story” in Heart of Africa is all too clearly just a bunch of shifting interchangeable parts. Bunten later acknowledged that “the attempt to make a replayable adventure game made for a shallow product (which seems true in every other case designers have tried it as well). I guess that if elements are such that they can be randomly shifted then they [aren’t] substantive enough to make for a compelling game. So, even though I don’t like linear games, they seem necessary to have the depth a good story needs.”

Heart of Africa did quite well for EA upon its release in 1985 — well enough, in fact, to become Bunten’s third most successful game of all time. Yet the whole experience left a bad taste in his mouth. He came away from the project determined to return to the guiding vision behind his first game for EA, the commercially unsuccessful but absolutely brilliant M.U.L.E.: a vision of computer games that people played together rather than alone. In the future, he would continue to compromise at times on the style and subject matter of his games in order to sell them to his publishers, but he would never again back away from his one great principle. All of his games henceforward would be multiplayer — first, foremost, and in one case exclusively. In fact, that one case would be his very next game.

The success of his previous two games having opened something of a window of opportunity with EA, Bunten charged ahead on what he would later describe as his single “most experimental game.” Robot Rascals is a multiplayer scavenger hunt in which two physical decks of cards are integral to the game. Each player controls a robot, and must use it to collect the four items shown on the cards in her hand and return with them to home base in order to win. The game lives on the razor’s edge of pure chaos, the product both of random events generated by the computer and of a second deck of cards — the “specials” — which among other things can force players to draw new item cards, trash their old cards, or trade cards among one another; thus everyone’s goals are shifting almost constantly. As always in a Dan Bunten game, there are lots of thoughtful features here, from ways to handicap the game for players of different ages or skill levels to three selectable levels of overall complexity. He designed it to be “a game that anyone could play” rather than one limited to “special-interest groups like role-playing people or history buffs.” It can be a lot of fun, even if it’s not quite on the level of M.U.L.E. (then again, what is, right?). But this latest bid to make computer games acceptable family entertainment wound up selling hardly at all upon its release in 1986, ending Bunten’s two-game commercial hot streak.

By this point in Bunten’s career, changes in his personal life were beginning to have a major impact on the games he made. In 1985, while still working on Heart of Africa, he had divorced his second wife and married his third, with all the painful complications such disruptions entail when one is leaving children behind with the former spouse. In 1986, he and his new wife moved from Little Rock, Arkansas, to Hattiesburg, Mississippi, so she could complete a PhD. This event marked the effective end of Ozark Softscape as anything but a euphemism for Dan Bunten himself and whatever programmers and artists he happened to contract work out to. The happy little communal house/office where Dan and Bill Bunten, Jim Rushing, and Alan Watson had created games, with a neighborhood full of eager testers constantly streaming through the living room, was no more; only Watson continued to work on Bunten’s games from Robot Rascals on, and then more as just another hired programmer than a valued design voice. Even after moving back to Little Rock in 1988, Bunten would never be able to recapture the communal alchemy of 1982 to 1985.

Coupled with these changes were other, still more ominous ones in Dan Bunten himself. Those who knew him during these years generally refer only vaguely to his “problems,” and this discretion of course does them credit; I too have no desire to psychoanalyze the man. What does seem clear, however, is that he was growing increasingly unhappy as time wore on. He became more demanding of his colleagues, difficult enough to work with that many of them decided it just wasn’t worth it, even as he became more erratic in his own habits, perhaps due to an alcohol intake that struck many as alarming.

Yet Bunten was nothing if not an enigmatic personality. At the same time that close friends were worrying about his moodiness and his drinking, he could show up someplace like The Computer Game Developers Conference and electrify the attendees with his energy and ideas. Certainly his eyes could still light up when he talked about the games he was making and wanted to make. The worrisome questions were how much longer he would be allowed to make those games in light of their often meager sales, and, even more pressingly, why his eyes didn’t seem to light up about much else in his life anymore.

But, to return to the firmer ground of the actual games he was continuing to make: Modem Wars, his next one, marked the beginning of a new chapter in his tireless quest to get people playing computer games together. “We’ve failed at gathering people around the computer,” Bunten said before starting work on it. “We’re going to have to connect them out of the back by connecting their computers to each other.” He would make, in other words, a game played by two people on two separate computers, connected via modem.

Modem Wars was known as Sport of War until just prior to its release by EA in 1988, and in many ways that was a better title. Its premise is a new version of Bunten’s favorite sport of football, played not by individual athletes but by infantry, artillery, and even aircraft, if you can imagine such a thing. One might call it a mashup between two of his early designs for SSI: the strategic football simulator Computer Quarterback and the proto-real-time-strategy game Cytron Masters.

It’s the latter aspect that makes Modem Wars especially visionary. The game was nothing less than an online real-time-strategy death match years before the world had heard of such a thing. While a rudimentary artificial intelligence was provided for single-player play, it was made clear by the game’s very title that this was strictly a tool for learning to play rather than the real point of the endeavor. Daniel Hockman’s review of Modem Wars for Computer Gaming World ironically describes the qualities of online real-time strategy as a potential “problem” and “marketing weakness” — the very same qualities which a later generation would take as the genre’s main attractions:

A sizable number of gamers are not used to thinking in real-time situations. They can spend hours ordering tens of thousands of men into mortal combat, but they wimp out when they have to think under fire. They want to play chess instead of speed chess. They want to analyze instead of act. As the enemy drones zero in on their comcen, they throw up their hands in frustration when it’s knocked out before they can extract themselves from the maelstrom of fire that has engulfed them.

Whether because gamers really were daunted by this need to think on their feet or, more likely, because of the relative dearth of fast modems and stable online connections in 1988, Modem Wars became another crushing commercial disappointment for Bunten. EA declared themselves “hesitant” to keep pursuing this direction in the wake of the game’s failure. Rather than causing Bunten to turn away from multiplayer gaming, this loss of faith caused him to turn away from EA.

In the summer of 1989, MicroProse Software announced that they had signed a five-year agreement with Bunten, giving them first rights to all of the games he made during that period. The great hidden driver behind the agreement was MicroProse’s own star designer Sid Meier, who had never hidden his enormous admiration for Bunten’s work. Bunten doubtless hoped that a new, more supportive publisher would mark the beginning of a new, more commercially successful era in his career. And in the beginning at least, such optimism would, for once, prove well-founded.

Known at first simply as War!, then as War Room, and finally as Command H.Q., Bunten’s first game for MicroProse was aptly described by its designer as being akin to an abstract, casual board game of military strategy, like Risk or Axis & Allies. The big wrinkle was that this beer-and-pretzels game was to be played in real time rather than turns. But, perhaps in response to complaints about his previous game like those voiced by Daniel Hockman above, the pace is generally far less frenetic this time around. Not only can the player select an overall speed, but the program itself actually takes charge to speed up the action when not much is happening and slow it down when things heat up. Although a computer opponent is provided, the designer’s real focus was once more on modem-to-modem play.

But, whatever its designer’s preferences, MicroProse notably de-emphasized the multiplayer component in their advertising upon Command H.Q.’s release in 1990, and this, combined with a more credible artificial intelligence for the computer opponent, gave it more appeal to the traditional wargame crowd than Modem Wars had enjoyed. Ditto a fair measure of evangelizing done by Computer Gaming World, with whom Bunten had always had a warm relationship, having even authored a regular column there for a few years in the mid-1980s. The magazine’s lengthy review concluded by saying, “This is the game we’ve all been waiting for”; they went on to publish two more lengthy articles on Command H.Q. strategy, and made it their “Wargame of the Year” for 1990. For all these reasons, Command H.Q. sold considerably better than had Bunten’s last couple of games; one report places its total sales at around 75,000 units, enough to make it his second most successful game ever.

With that to buoy his spirits, Bunten made big plans for his next game, Global Conquest. “Think of it as Command H.Q. meets Seven Cities of Gold meets M.U.L.E.,” he said. Drawing heavily from Command H.Q. in particular, as well as the old grand-strategy classic Empire, he aimed to make a globe-spanning strategy game where economics would be as important as military maneuvers. He put together a large and vocal group of play testers on CompuServe, and tried to incorporate as many of their suggestions as possible, via a huge options panel that allowed players to customize virtually every aspect of the game, from the rules themselves to the geography and topography of the planet they were fighting over, all the way down to the look of the icons representing the individual units. This time, up to four humans could play against one another in a variety of ways: they could all play together by taking turns on one computer, or they could each play on their own computer via a local-area network, or four players could share two computers that were connected via modem. The game was turn-based, but with an interesting twist designed to eliminate analysis paralysis: when the first player mashed the “next turn” button, everyone else had just twenty seconds to finish up their own turns before the execution phase began.

In later years, Dan Bunten himself had little good to say about what would turn out to be his last boxed game. In fact, he called it his absolute “worst game” of all the ones he had made. While play-testing in general is a wonderful thing, and every designer should do as much of it as possible, a designer also needs to keep his own vision for what kind of game he wants to make at the forefront. In the face of prominent-in-their-own-right, opinionated testers like Computer Gaming World’s longtime wargame scribe Alan Emrich, Bunten failed to do this, and wound up creating not so much a single coherent strategy game as a sort of strategy-game construction set that baffled more than it delighted. “This game was a hodgepodge rather than an integration,” he admitted several years later. “It was just the opposite of the KISS doctrine. It was a kitchen-sink design. It had everything. Build your own game by struggling through several options menus.” He acknowledged as well that the mounting unhappiness in his personal life, which had now led to a divorce from his third wife, was making it harder and harder to do good work.

Released in 1992, Global Conquest under-performed commercially as well. In addition to the game’s intrinsic failings, it didn’t help matters that MicroProse had just five months prior released Sid Meier’s Civilization, another exercise in turn-based grand strategy on a global scale, also heavily influenced by Empire, that managed to be far more thematically and texturally ambitious while remaining more focused and playable as a game — albeit without the multiplayer element that was so important to Bunten.

But of course, there’s more to a game than whether it’s played by one person or more than one, and it strikes me as reasonable to question whether Bunten was beginning to lose his way as a designer in other respects even as he stuck so obstinately to his multiplayer guns. Setting aside their individual strengths and failings, the final three boxed games of Bunten’s career, with their focus on “wars” and “command” and “conquest,” can feel a little disheartening when compared to what came before. Games like M.U.L.E., Robot Rascals, and to some extent even Seven Cities of Gold and Heart of Africa had a different, friendlier, more welcoming personality. This last, more militaristic trio feels like a compromise, the product of a Dan Bunten who said that, if he couldn’t bring multiplayer gaming to the masses, he would settle for the grognard crowd, indulging their love for guns and tanks and bombs. So be it. Now, though, he was about to give that same crowd the shock of their lives.

In November of 1992, just months after completing the supremely masculine wargame Global Conquest, Dan Bunten had sexual-reassignment surgery, becoming the woman Danielle “Dani” Bunten Berry. (For continuity’s sake, I’ll generally continue to refer to her by the shorthand of “Bunten” rather than “Berry” for the remainder of this article.) It’s not for us to speculate about the personal trauma that must have accompanied such a momentous decision. What we can and should take note of, however, is that it was an unbelievably brave decision. For all that we still have a long way to go today when it comes to giving transsexuals the rights and respect they deserve, the early 1990s were a far less enlightened time than even our own on this issue. And it wasn’t as if Bunten could take comfort in the anything-goes anonymity of a New York City or San Francisco. Dan Bunten had lived, and Dani Bunten now continued to live, in the intensely conservative small-town atmosphere of Little Rock, Arkansas. Many of those closest to her disowned her, including her mother and her ex-wives, making it heartbreakingly difficult for her to maintain a relationship with her children. She had remained in Little Rock all these years, at no small cost to her career prospects, largely because of these ties of blood, which she had believed to be indissoluble. This rejection, then, must have felt like the bitterest of betrayals.

Dan Bunten with his beverage of choice.

The games industry as well, with its big-breasted damsels in distress and its machine-gun-toting male heroes, wasn’t exactly notable for its enlightened attitudes toward sex and gender. Many of Bunten’s old friends and colleagues would see her for the first time after her surgery and convalescence at the Game Developers Conference scheduled for April of 1993, and they looked forward to that event with almost as much trepidation as Bunten herself must have felt. It was all just so very unexpected. To whatever extent they had carried around a mental image of a man who would choose to become a woman, Dan Bunten didn’t fit the profile at all. He had been the games industry’s own Ozark Mountains boy, a true son of the South, always ready with his “folksy mountain humor” (read, “dirty jokes”). His rangy frame stood six feet two inches tall. He loved nothing more than a rough-and-tumble game of back-lot football, unless it be beer and poker afterward. As his three ex-wives and three children attested, he had certainly seemed to like women, but no one had ever imagined that he liked them enough to want to be one. What were they supposed to say to him — er, to her — now?

They needn’t have worried. Dani Bunten handled her coming-out party with the same low-key grace and humor she would display for the rest of her life as a woman. She said that she had made the switch to do her part to redress the gender imbalance inside the industry, and to help improve the aesthetics of game designers to match the improving aesthetics of their games. The tension dissipated, and soon everyone got into the spirit of the thing. A straw poll named Dani Bunten the game designer most likely to appear on the Oprah Winfrey Show. A designer named Gordon Walton had a typical experience: “I was put off when she made the change to become Dani, until the minute I spoke to her. It was clear to me she was much happier as Dani, and if anything an even more incredible person.” Another GDC regular remembered the “unhappy man” from the 1992 event, “sitting on the hallway floor drinking and smoking,” and contrasted him with the “happy woman” he now saw.

No one with any interest in the inner workings of those strangest of creatures, their fellow humans, could fail to be fascinated by Bunten’s dispatches from both sides of the gender divide. “Aren’t there things you’ve always wanted to know about women but were afraid to ask?” she said. “Well, now’s your chance!”

I had to learn a lot to actually “count” as a woman! I had to learn how to walk, speak, dress as a woman. Those little things which are necessary so that other people don’t [feel] alienated. There’s a little summary someone gave me to make clear what being a woman means: as a woman you have to sing when you speak, dance when you walk, and you have to open your heart… I know how stereotypical that sounds, but it is true! Speech for a man is something completely different: the melody of speech is fast, monotone, and decreases at the end of a sentence. Sometimes, this still happens to me, and people are always irritated. Female speech is a little bit like song – we have a lot more melody and different speech patterns. Walking is really a bit like dancing: slower and connected, with a lot of subtle movements. I enjoyed it at once.

She had few filters when talking about the nitty-gritty details:

One of the saddest changes I had to deal with after my operation was the fact that I couldn’t aim anymore when urinating. Boys — I have two little sons and a daughter — simply love to aim.

Bunten said that, in keeping with her new identity, she didn’t feel much desire to design any more wargames; this led to the end of her arrangement with MicroProse. By way of compensation, Electronic Arts that year released a nicely done “commemorative edition” of Seven Cities of Gold, complete with dramatically upgraded graphics and sound to suit the times. Bunten had little to nothing to do with the project, but it sold fairly well, and perhaps helped to remind her of her roots.

In the same spirit, Bunten’s first real project after her transformation became a new version of M.U.L.E. EA’s founder Trip Hawkins had always named that game as one of his all-time favorites, and had frequently stated how disappointed he was that it had never gotten the attention it deserved. Now, Hawkins had left his day-to-day management role at EA to run 3DO, a spin-off company peddling a multimedia set-top box for the living room. Hawkins thought M.U.L.E. would be perfect for the platform, and recruited Bunten to make it happen. It was a dream project; showing excellent taste, she still regarded M.U.L.E. as the best thing she had ever done. But the dream quickly began to sour.

3DO first requested that, instead of taking turns managing their properties on the map, players all be allowed to do so simultaneously. Bunten somewhat reluctantly agreed. And then:

As soon as I added the simultaneity, it instantly put into their heads, “Why can’t we shoot at each other?” And I said, “No guns.” And they said, “What about bombs? Can we drop a bomb in front of you? It won’t hurt you. It will be a cartoon thing, it will just slow you down.” And I said, “You don’t get it. It’s changing the whole notion of how this thing works!”

[3DO is] staking its future on the idea of a new generation of hardware and therefore, you’d assume, a new generation of software, but they said, “No, our market is still 18 to 35, male. We need something with action, something with intensity.” Chrome and sizzle. Ugh.

In the end, Bunten walked out, disappointed enough that she seriously considered getting out of games altogether, going so far as to apply for jobs as the industrial engineer Dan Bunten had once been before his first personal computer came along.

Instead she found a role with a new company called Mpath as a design and strategy consultant. The goal of that venture was to bring multiplayer gaming to the new frontier of the World Wide Web, and its founders included her fellow game designer Brian Moriarty, of Infocom and LucasArts fame. She also studied the elusive concept of “games for girls” in association with a think tank set up by Microsoft co-founder Paul Allen; some of her proposals would later come to market as the products of Purple Moon, Brenda Laurel’s short-lived but important publisher of games for girls aged 8 to 14.

Offers to do conventional boxed games as sole designer, however, weren’t forthcoming; how much that was down to lingering personal prejudices against her for her changed sex and how much to the fact that the games she wanted to make just weren’t considered commercially viable must always be open for debate. Refusing as usual to be a victim, Bunten said that her “priorities had shifted” since her change anyway: “I don’t identify myself with the job as strongly as before.” Deciding that, for her, heaven was other people after a life spent programming computers, she devoured anthropology texts and riffed on Carl Jung’s theories of a collective unconscious. “Literature, anthropology, and even dance,” she noted, “have a good deal more to teach designers about human drives and abilities than the technologists of either end of California, who know silicon and celluloid but not much else.” So, she bided her time as a designer, waiting for a more inclusive ludic future to arrive. At the 1997 GDC, she described a prescient vision of “small creative shops” freed from the inherent conservatism of the “distribution trap” by the magic of the Internet.

That future would indeed come to pass — but, sadly, not in time for Dani Bunten Berry to see it. Shortly after delivering that speech, she went to see her doctor about a persistent cough, whereupon she was diagnosed with an advanced case of lung cancer. In one of those cruel ironies which always seem to dog the lives of us poor mortals, she had finally kicked a lifelong habit of heavy smoking just a few months before.

She appeared in public for the last time in May of 1998. The occasion was, once again, the Game Developers Conference, where she had always shone so. She struggled audibly for breath as she gave the last presentation of her life, entitled “Do Online Games Still Suck?,” but her passion carried her through. At the end of the conference, at a special ceremony held aboard the Queen Mary in Long Beach Harbor, she was presented with the first ever GDC Lifetime Achievement Award. The master of ceremonies for that evening was her friend and colleague Brian Moriarty, who knew, like everyone else in attendance, that the end was near. He closed his heartfelt tribute thus:

It is no exaggeration to characterize tonight’s honoree as the world’s foremost authority on multiplayer computer games. Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes the prototypes of the defining art form of the 21st century.

As one of those historians, I can only heartily concur with his assessment.

It would be nice to say that Dani Bunten passed peacefully to her rest. But, as anyone with any experience with lung cancer will recognize, that just isn’t how the disease works. Throughout her life, she had done nothing the easy way, and her death — ugly, painful, and slow — was no exception. On the brighter side, she did reconcile to some extent with her mother and other family members and friends who had rejected her. The end came on July 3, 1998. Rather incredibly in light of the prodigious, multifaceted life she had lived, she was just 49 years old.

It’s a life which resists pigeonholing or sloganeering. Bunten herself explicitly rejected the role of transgender advocate, inside or outside of the games industry. Near the end of her life, she expressed regret for her decision to change her physical sex, saying she could have found ways to live in a more gender-fluid way without taking such a drastic step. Whether this was a reasoned evaluation or a product of the pain and trauma of terminal illness must remain, like so much else about her, an enigma.

What is clear, however, is that Bunten, through the grace and humor with which she handled her transition and through her refusal to go away and hide thereafter as some might have wished, taught others in the games industry who were struggling with similar issues of identity that a new gender need not mean a decisive break with every aspect of one’s past — that a prior life in games could continue to be a life in games even with a different pronoun attached. She did this in a quieter way than the speechifying some might have wished for from her, but, nevertheless, do it she did. Jessica Mulligan, who transitioned from male to female a few years after her, remembers meeting Bunten shortly before her own sexual-reassignment surgery, hoping to hear some “profound words on The Transition”: “While I was looking for spiritual guidance, she was telling me where to shop for shoes. Talk about keeping someone honest! Every change in our personal lives is profound to us. You still have to pay attention to the nuts and bolts or the change is meaningless.”

Danielle Bunten Berry does her makeup.

For some, of course — even for some with generally good intentions — Danielle Bunten Berry’s transgenderism will always be the defining aspect of her life, her career in games a mere footnote to that other part of her story. But that’s not how she would have wanted it. She regarded her games as her greatest legacy after her children, and would doubtless want to be remembered as a game designer above all else.

Back in 1989, after Modem Wars had failed in the marketplace, Electronic Arts decided that the lack of “a network of people to play” was a big reason for its failure. The great what-if question pertaining to Bunten’s career is what she might have done in partnership with an online network like CompuServe, which could have provided stable connectivity along with an eager group of players and all the matchmaking and social intrigue anyone could ask for. She finally began to explore this direction late in her life, through her work with Mpath. But what might have happened if she had made the right connections — forgive the pun! — earlier? We can only speculate.

As it is, though, it’s true that, in terms of units shifted and profits generated, there have been far more impressive careers. She suffered the curse of any pioneer who gets too far out in front of the culture. All of her eleven games combined probably sold no more than 400,000 copies at the outside, a figure some prominent designers’ new games can easily better on their first week today. Certainly her commercial disappointments far outnumber her successes. But then, sales aren’t the only metric by which to measure success.

Dani Bunten, one might say, is the designer’s designer. Greg Costikyan once told what happened when he offered to introduce Warren Spector — one of those designers who can sell more games in a week than Bunten did in a lifetime — to her back in the day: “He regretfully refused; he had loved M.U.L.E. so much he was afraid he wouldn’t know what to say. He would sound like a blithering fanboy and be embarrassed.” Chris Crawford calls the same title simply “the best computer-game design of all time.” Brenda Laurel dedicated Purple Moon’s output to Bunten. Sid Meier was so taken with Seven Cities of Gold that Pirates!, Railroad Tycoon, and Civilization, his trilogy of masterpieces, can all be described as extensions in one way or another of what Bunten first wrought. And Seven Cities of Gold was only Meier’s second favorite Bunten game: he loved M.U.L.E. so much that he was afraid to even try to improve on it.

Ironically, the very multiplayer affordances that Bunten so steadfastly refused to give up on, much to the detriment of her income, continue to make it difficult for her games to be seen at their best today. M.U.L.E. can be played as its designer really intended it only on an Atari 8-bit computer — real or emulated — with four vintage joysticks plugged in and four players holding onto them in a single living room; that is, needless to say, not a trivial thing to arrange in this day and age. Likewise, the need to have the exceedingly rare physical cards to hand has made it impossible for most people to even try out Robot Rascals today. (It took me months to track down a pricey German edition on eBay.) And Bunten’s final run of boxed games, reliant on ancient modem hookups as they are, are even more difficult to play with others today than they were in their own time.

Dani Bunten didn’t have an easy life, internally or externally. She remained always an enigma — the life of the party who goes home alone, the proverbial stranger among her best friends. One person who knew her after she became a woman claimed she still had a “shadowed, slightly haunted look, even when she was smiling.” Given the complicated emotions that are still stirred up in so many of us by transgenderism, that may have been projection. On the other hand, though, it may have been perception. Even Bunten’s childhood had been haunted by the specter of familial discord and possibly abuse, to such an extent that she refused to talk much about it. But she did once tell Greg Costikyan that she grew up loving games mainly because it was only when playing them that her family wasn’t “totally dysfunctional.”

I think that for Dani Bunten games were most of all a means of communication, a way of punching through that bubble of ego and identity that isolates all of us to one degree or another, and that perhaps isolated her more so than most. Thus her guiding vision became, as Sid Meier puts it, “the family gathered around the computer.” After all, it’s a small step to go from communicating to connecting, from connecting to loving. She openly stated that she had made Robot Rascals for her own family most of all: “They’ve never played my games. I think they found them too esoteric or complex. I wanted something that I could enjoy with them, that they’d all be able to relate to.” The tragedy for her — perhaps a key to the essential sadness many felt at Bunten’s core, whether she was living as a man or a woman — is that reality never quite lived up to that Norman Rockwell dream of the happy family gathered around a computer; her daughter, the duly appointed caretaker of her legacy, still calls M.U.L.E. “boring and tedious” today. But the dream remains, and her games have given those of us privileged to discover them great joy and comfort in the midst of lives that have admittedly — hopefully! — been far easier than that of their creator. And so I’ll close, in predictable but unavoidable fashion, with Danielle Bunten Berry’s most famous quote — a quote predictable precisely because it so perfectly sums up her career: “No one on their death bed ever said, ‘I wish I had spent more time alone with my computer!'” Words to live by, my fellow gamers. Words to live by.

Danielle Bunten Berry, 1949-1998.

(Sources: Compute! of March 1989, December 1989, April 1990, January 1992, and December 1993; Questbusters of May 1986; Commodore Power Play of June/July 1986; Commodore Magazine of July 1987, October 1988, and June 1989; Ahoy! of March 1987; Computer Gaming World of January/February 1987, May 1988, February 1989, February 1990, December 1990, February 1991, March 1991, May 1991, April 1992, June 1992, August 1992, June 1993, August 1993, July 1994, September 1995, and October 1998; Family Computing of January 1987; Compute!’s Gazette of August 1989; The One of April 1991; Game Players PC Entertainment of September 1992; Game Developer of February/March 1995, July 1998, September 1998, and October 1998; Electronic Arts’s newsletter Farther of Winter 1986; Power Play of January 1995; Arkansas Times of February 8 2012. Online sources include the archived contents of the old World of Mule site, the archived contents of a Danielle Bunten Berry tribute site, the Salon article “Get Behind the M.U.L.E.”, and Bunten’s interview at Halcyon Days.)

 

Posted by on November 16, 2018 in Digital Antiquaria, Interactive Fiction

 


The Game of Everything, Part 10: Civilization and the Limits of Progress

To listen to what Sid Meier says about his most famous achievement today, my writing all of these articles on Civilization has been like doing a deep reading of an episode of The Big Bang Theory; there just isn’t a whole lot of there there. Meier claims that the game presents at best a children’s-book view of history, that the only real considerations that went into it were what would be fun and what wouldn’t. I don’t want to criticize him for that stance here, any more than I want to minimize the huge place that fun or the lack thereof really did fill in the decisions that he and his partner Bruce Shelley made about Civilization. I understand why he says what he says: he’s a commercial game designer, not a political pundit, and he has no desire to wade into controversy — and possibly shrink his customer base — by taking public positions on the sorts of fractious topics I’ve been addressing over the course of these articles. If he should need further encouragement to stay well away from those topics, he can find it in the many dogmatic academic critiques of Civilization which accuse it of being little more than triumphalist propaganda. He’d rather spend his time talking about game design, which strikes me as perfectly reasonable.

Having said all that, it’s also abundantly clear to me that Civilization reflects a much deeper and more earnest engagement with the processes of history than Meier is willing to admit these days. This is, after all, a game which cribs a fair amount of its online Civilopedia directly from Will Durant, author of the eleven-volume The Story of Civilization, the most ambitious attempt to tell the full story of human history to date. And it casually name-drops the great British historian Arnold J. Toynbee, author of the twelve-volume A Study of History, perhaps the most exhaustive — and certainly the most lengthy — attempt ever to construct a grand unified theory of history. These are not, needless to say, books which are widely read by children. There truly is a real theory of history to be found in Civilization as well, one which, if less thoroughly worked-out than what the likes of Toynbee have presented in book form, is nevertheless worth examining and questioning at some length.

The heart of Civilization’s theory of history is of course the narrative of progress. In fact, it is so central to the game that it has served as the second of our lodestars throughout this series of articles. And so, as we come to the end of the series, it seems appropriate to look at what the game and the narrative of progress have to say about one another one last time, this time in the context of a modern society like the ones in which we live today. Surprisingly, given how optimistic the game’s take on history generally is, it doesn’t entirely ignore the costs that have all too clearly been shown to be associated with progress in this modern era of ours.

Meier and Shelley were already working on Civilization when the first international Earth Day was held on April 22, 1990, marking the most important single event in the history of the environmental movement since the publication of Rachel Carson’s Silent Spring back in 1962. Through concerts, radio and television programs, demonstrations, and shrewd publicity stunts like a Mount Everest “Peace Climb” including American, Soviet, and Chinese climbers roped together in symbolic co-dependence, Earth Day catapulted the subject of global warming, among other environmental concerns, into the mass media, in some cases for the first time.

Whether influenced by this landmark event or not, Civilization as well manifests a serious concern for the environment in the later, post-Industrial Revolution stages of the game. Coal- and oil-fired power plants increase the productivity of your factories dramatically, but also spew pollution into the air which you must struggle to clean up. Nuclear power plants, while the cheapest, cleanest, and most plentiful sources of energy most of the time, can occasionally melt down with devastating consequences to your civilization. Large cities generate pollution of their own even absent factories and power plants, presumably as a result of populations that have discovered the joy of automobiles. Too much pollution left uncleaned will eventually lead not only to sharply diminished productivity for your civilization but also to global warming, making Civilization one of the first works of popular entertainment to acknowledge the growing concern surrounding the phenomenon already among scientists of the early 1990s.

In fighting your rearguard action against these less desirable fellow travelers on the narrative of progress, you have various tools at your disposal. To clean up pollution that’s already occurred, you can build and deploy settler units to the affected areas. To prevent some pollution from occurring at all, you can invest in hydroelectric plants in general and/or the Wonder of the World that is the Hoover Dam. You can also build mass-transit systems to wean your people away from their precious cars, and recycling centers to keep some of their trash from winding up in landfills.

Interestingly, the original Civilization addresses the issues of environment and ecology that accompany the narrative of progress with far more earnestness than any of its sequels — another fact that rather gives the lie to Meier’s assertion that the game has little to do with the real world. Although even the first game’s implementation of pollution is far from unmanageable by the careful player, it’s something that most players just never found to be all that much fun, and this feedback caused the designers who worked on the sequels to gradually scale back its effects.

In the real world as well, pollution and the threat of global warming aren’t much fun to talk or think about — so much so that plenty of people, including an alarming number of those in positions of power, have chosen to stick their heads in the sand and pretend they don’t exist. None of us enjoy having our worldviews questioned in the uncomfortable ways that discussions of these and other potential limits of progress — progress as defined in terms of Francis Fukuyama’s explicit and Civilization’s implicit ideals of liberal, capitalistic democracy — tend to engender.

As Adam Smith wrote in the pivotal year of 1776 and the subsequent centuries of history quite definitively proved, competitive free markets do some things extraordinarily well. The laws of supply and demand conspire to ensure that a society’s resources are allocated to those things its people actually need and want, while the profit motive drives innovation in a way no other economic system has ever come close to equaling. The developed West’s enormous material prosperity — a prosperity unparalleled in human history — is thanks to capitalism and its kissing cousin, democracy.

Yet unfettered capitalism, that Platonic ideal of libertarian economists, has a tendency to go off the rails if not monitored and periodically corrected by entities who are not enslaved by the profit motive. The first great crisis of American capitalism could be said to have taken place as early as the late 1800s, during the “robber baron” era of monopolists who discovered a way to cheat the law of supply and demand by cornering entire sectors of the market to themselves. Meanwhile the burgeoning era of mass production and international corporations, so dramatically different from Adam Smith’s world of shopkeepers and village craftsmen, led to the mass exploitation of labor. The response from government was an ever-widening net of regulations to keep corporations honest, while the response from workers was to unionize for the same purpose. Under these new, more restrictive conditions, capitalism continued to hum along, managing to endure another, still greater crisis of confidence in the form of the Great Depression, which led to the idea of a taxpayer-funded social safety net for the weak and the unlucky members of society.

The things that pure capitalism doesn’t do well, like providing for the aforementioned weak and unlucky who lack the means to pay for goods and services, tend to fall under the category that economists call “externalities”: benefits and harms that aren’t encompassed by Adam Smith’s supposedly all-encompassing law of supply and demand. In Smith’s world of shopkeepers, what was best for the individual supplier was almost always best for the public at large: if I sold you a fine plow horse for a reasonable price, I profited right then and there, and also knew that you were likely to tell your friends about it and to come back yourself next year when you needed another. If I sold you a lame horse, on the other hand, I’d soon be out of business. But if I’m running a multinational oil conglomerate in the modern world, that simple logic of capitalism begins to break down in the face of a much more complicated web of competing concerns. In this circumstance, the best thing for me to do in order to maximize my profits is to deny that global warming exists and do everything I can to fight the passage of laws that will hurt my business of selling people viscous black gunk to burn inside dirty engines. This, needless to say, is not in the public’s long-term interest; it’s an externality that could quite literally spell the end of human civilization. So, government must step in — hopefully! — to curb the burning of dirty fuels and address the effects of those fossil fuels that have already been burned.

But externalities are absolutely everywhere in our modern, interconnected, globalized world of free markets. Just as there’s no direct financial benefit in an unfettered free market for a doctor to provide years’ or decades’ worth of healthcare to a chronically sick person who lacks the means to pay for it, there’s no direct financial harm entailed in a factory dumping its toxic effluent into the nearest lake. There is, of course, harm in the abstract, but that harm is incurred by the people unlucky enough to live by the lake rather than by the owners of the factory. The trend throughout the capitalist era has therefore been for government to step in more and more; every successful capitalist economy in the world today is really a mixed economy, to a degree that would doubtless have horrified Adam Smith. As externalities continue to grow in size and scope, governments are forced to shoulder a bigger and bigger burden in addressing them. At what point does that burden become unbearable?

One other internal contradiction of modern capitalism, noticed by Karl Marx as early as the nineteenth century, has come to feel more real and immediate than ever before in the years since the release of Civilization. The logic of modern finance demands yearly growth — ever greater production, ever greater profits. Just holding steady isn’t good enough; if you doubt my word, consider what your pension fund will look like come retirement time if the corporations in which you’ve invested it are content to merely hold steady. Up to this point, capitalism’s efficiency as an economic system has allowed it to deliver this growth on a decade-by-decade if not always year-by-year basis. But the earth’s resources are not unlimited. At some point, constant growth — the constant demand for more, more, more — must become unsustainable. What happens to capitalism then?
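The compounding at work in that pension-fund thought experiment is easy to quantify. As a minimal sketch — the $100,000 starting balance, 5 percent annual return, and 30-year horizon are assumptions of mine, not figures from the text — compare a fund whose holdings merely hold steady with one that grows modestly each year:

```python
# Illustrative only: a pension invested in corporations that "hold
# steady" versus one earning modest compound growth. The principal,
# rate, and horizon below are assumed for the sake of the example.
principal = 100_000   # starting balance in dollars (assumption)
years = 30            # time until retirement (assumption)
rate = 0.05           # assumed annual return

flat = principal                              # no growth at all
compounded = principal * (1 + rate) ** years  # compound growth

print(f"holding steady:     ${flat:,.0f}")
print(f"5% compound growth: ${compounded:,.0f}")  # roughly 4.3x the flat fund
```

Even a modest growth rate, compounded over a working life, ends up more than quadrupling the flat fund — which is why "merely holding steady" is, from the perspective of modern finance, indistinguishable from failure.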

Exactly the future that believers in liberal democracy and capitalism claim to be the best one possible — that the less-developed world remakes itself in the mold of North America and Western Europe — would appear to be literally impossible in reality. The United States alone, home to 6 percent of the world’s population, consumes roughly 35 percent of its resources. One doesn’t need to be a statistician or an ecologist to understand that the rest of the world simply cannot become like the United States without ruining a global ecosystem that already threatens to collapse under the weight of 7.5 billion souls — twice the number of just thirty years ago. Humans are now the most common mammal on the planet, outnumbering even the ubiquitous mice and rats. Two-thirds of the world’s farmland is already rated as “somewhat” or “strongly” degraded by the Organization for Economic Cooperation and Development. Three-quarters of the world’s biodiversity has been lost since 1900, and 50 percent of all remaining plant and animal species are expected to go extinct before 2100. And hovering over it all is the specter of climate change; the polar ice caps have melted more in the last 20 years than they did in the previous 12,000 years since the end of the last ice age.

There’s no doubt about it: these are indeed uncomfortable conversations to have. Well before the likes of Brexit and President Donald Trump, even before the events of September 11, 2001, Western society was losing the sense of triumphalism that had marked the time of the original Civilization, replacing it with a jittery sense that humanity was packed too closely together on an overcrowded and overheating little planet, that the narrative of progress was rushing out of control toward some natural limit point that was difficult to discern or describe. The first clear harbinger of the generalized skittishness to come was perhaps the worldwide angst that accompanied the turn of the millennium — better known as “Y2K,” a fashionable brand name for disaster that smacked of Hollywood, thereby capturing the strange mixture of gloom and mass-media banality that would come to characterize much of post-millennial life. The historian of public perception David Lowenthal, writing in 2015:

Events spawned media persistently catastrophic in theme and tone, warning of the end of history, the end of humanity, the end of nature, the end of everything. Millennial prospects in 2000 were lacklustre and downbeat; Y2K seemed a portent of worse to come. Not even post-Hiroshima omens of nuclear annihilation unleashed such a pervasive glum foreboding. Today’s angst reflects unexampled loss of faith in progress: fears that our children will be worse off than ourselves, doubts that neither government nor industry, science nor technology, can set things right.

The turn of the millennium had the feeling of an end time, yet none of history’s more cherished eschatologies seemed to be coming true: not Christianity’s Rapture, not Karl Marx’s communist world order, not Georg Wilhelm Friedrich Hegel’s or Francis Fukuyama’s liberal-democratic end of history, certainly not Sid Meier and Bruce Shelley’s trip to Alpha Centauri. Techno-progressives began to talk more and more of a new secular eschatology in the form of the so-called Singularity, the point where, depending on the teller, artificial intelligence would either merge with human intelligence to create a new super-species fundamentally different from the humans of prior ages, or our computers would simply take over the world, wiping out their erstwhile masters or relegating them to the status of pets. And that was one of the more positive endgames for humanity that came to be batted around. Others nursed apocalyptic visions of a world ruined by global warming and the rising sea levels associated with it — a secular version of the Biblical Flood — or completely overrun by Islamic Jihadists, those latest barbarians at the gates of civilization heralding the next Dark Ages. Our television and movies turned increasingly dystopic, with anti-heroes and planet-encompassing disasters coming to rule our prime-time entertainment.

The last few years in particular haven’t been terribly good ones for believers in the narrative of progress and the liberal-democratic world order it has done so much to foster. The Arab Spring, touted for a time as a backward region’s belated awakening to progress, collapsed without achieving much of anything at all. Britain is leaving the European Union; the United States elected Donald Trump; Russia is back to relishing the role of the Evil Empire, prime antagonist to the liberal-democratic West; China has gone a long way toward consummating a marriage once thought impossible: the merging of an autocratic, human-rights-violating government with an economy capable of competing with the best that democratic capitalism can muster. Our politicians issue mealy-mouthed homages to “realism” and “transactional diplomacy,” ignoring the better angels of our nature. Everywhere nativism and racism seem to be on the rise. Even in the country where I live now, the supposed progressive paradise of Denmark, the Danish People’s Party has won considerable power in the government by sloganeering that “Denmark is not a multicultural society,” by drawing lines between “real” Danes and those of other colors and other religions. In my native land of the United States, one side of the political discourse, finding itself unable to win a single good-faith argument on the merits, has elected to simply lie about the underlying facts, leading some to make the rather chilling assertion that we now live in a “post-truth” world. (How ironic that the American right, long the staunchest critic of postmodernism, should have been the ones to turn its lessons about the untenability of objective truth into an electoral strategy!)

And then there’s the incoming fire being taken by the most sacred of all of progress’s sacred cows, as The Economist‘s latest Democracy Index announces that it “continues its disturbing retreat.” In an event redolent with symbolism, the same index in 2016 changed the classification of the United States, that beacon of democracy throughout its history, from a “Full Democracy” to a “Flawed Democracy.” Functioning as both cause and symptom of this retreat is the old skepticism about whether democracy is just too chaotic to efficiently run a country, whether people who can so easily be duped by Facebook propaganda and email chain letters can really be trusted to decide their countries’ futures.

Looming over such discussions of democracy and its efficacy is the specter of China. When Mao Zedong’s Communist Party seized power there in 1949, the average Chinese citizen earned just $448 per year in inflation-adjusted terms, making it one of the poorest countries in the world. Mao’s quarter-century of orthodox communist totalitarianism, encompassing the horrors of the Great Leap Forward and the Cultural Revolution, managed to improve that figure only relatively slowly; average income had increased to $978 by 1978. But, following Mao’s death, his de-facto successor Deng Xiaoping began to depart from communist orthodoxy, turning from a centrally-managed economy to the seemingly oxymoronic notion of “market-oriented communism” — effectively a combination of authoritarianism with capitalism. Many historians and economists — not least among them Francis Fukuyama — have always insisted that a non-democracy simply cannot compete with a democracy on economic terms over a long span of time. Yet the economy of post-Mao China has seemingly grown at a far more impressive rate than their theories allow to be possible, with average income reaching $6048 by 2006, then $16,624 by 2017. China today would seem to be a compelling rebuttal to all those theories about the magic conjunction of personal freedoms and free markets.
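The contrast between the two eras is starker still when the income figures above are restated as compound annual growth rates. A quick back-of-the-envelope sketch, using only the dollar figures and years given in the text:

```python
# Back-of-the-envelope compound annual growth rates (CAGR) for the
# average-income figures cited above: CAGR = (end/start)**(1/years) - 1.
def cagr(start, end, years):
    """Compound annual growth rate between two values over `years` years."""
    return (end / start) ** (1 / years) - 1

# Mao era: $448 (1949) -> $978 (1978)
mao_era = cagr(448, 978, 1978 - 1949)
# Reform era: $978 (1978) -> $16,624 (2017)
reform_era = cagr(978, 16_624, 2017 - 1978)

print(f"Mao era:    {mao_era:.1%} per year")     # roughly 2.7%
print(f"Reform era: {reform_era:.1%} per year")  # roughly 7.5%
```

On these numbers, the reform era’s annual growth rate is nearly triple the Mao era’s — precisely the kind of sustained outperformance that the theorists cited above insisted a non-democracy could not achieve.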

But is it really? We should be careful not to join some of our more excitable pundits in getting ahead of the real facts of the case. China’s economic transformation, remarkable as it’s been, has only elevated it to the 79th position among all the world’s nations in terms of GDP per capita. Its considerable economic clout in the contemporary world, in other words, has a huge amount to do with the fact that it’s the most populous country in the world. Further, the heart of its economy remains manufacturing, as all of those “Made in China” tags on hard goods of every description sold all over the world attest. China is still a long, long way from joining the vanguard of post-industrial knowledge economies. To a large extent, economic innovation still comes from the latter; China then does the grunt work of manufacturing the products that the innovators design.

Of course, authoritarianism does have its advantages. China’s government, which doesn’t need to concern itself with elections every set number of years, can set large national projects in motion, such as a green energy grid spanning the entire country or even a manned trip to Mars, and see them methodically through over the course of decades if need be. But can China under its current system of government produce a truly transformative, never-seen-or-imagined-anything-like-it product like the Apple iPhone and iPad, the World Wide Web, or the Sony Walkman? It isn’t yet clear to me that it can transcend being an implementor of brilliant ideas — thanks to all those cheap and efficient factories — to being an originator of same. So, personally, I’m not quite ready to declare the death of the notion that a country requires democracy to join the truly top-tier economies of the world. The next few decades should be very interesting in one way or another — whether because China does definitively disprove that notion, because its growth tops out, or, most desirably, because a rising standard of living there and the demands of a restive middle class bring an end at last to China’s authoritarian government.

Still, none of these answers to The China Puzzle will do anything to help us with the fundamental limit point of the capitalistic world order: the demand for infinite economic growth in a world of decidedly finite resources. Indeed, the Chinese outcome I just named as the most desirable — that of a democratic, dynamic China free of the yoke of its misnamed Communist Party — only causes our poor, suffering global ecosystem to suffer that much more under the yoke of capitalism. For this reason, economists today have begun to speak more and more of a “crisis of capitalism,” to question whether Adam Smith’s brilliant brainchild is now entering its declining years. For a short time, the “Great Recession” of 2007 and 2008, when some of the most traditionally rock-solid banks and corporations in the world teetered on the verge of collapse, seemed like it might be the worldwide shock that signaled the beginning of the end. Desperate interventions by governments all over the world managed to save the capitalists from themselves at the last, but even today, when the economies of most Western nations are apparently doing quite well, the sense of unease that was engendered by that near-apocalypse of a decade ago has never fully disappeared. The feeling remains widespread that something has to give sooner or later, and that that something might be capitalism as we know it today.

But what would a post-capitalist world look like? Aye, there’s the rub. Communism, capitalism’s only serious challenger over the course of the last century, would seem to have crashed and burned a long time ago as a practical way of ordering an economy. Nor, based on the horrid environmental record of the old Soviet bloc, is it at all clear that it would have proved any better a caretaker of our planet than capitalism even had it survived.

One vision for the future, favored by the anarchist activists whom we briefly met in an earlier article, entails a deliberate winding down of the narrative of progress before some catastrophe or series of catastrophes does it for us. It’s claimed that we need to abandon globalization and re-embrace localized, self-sustaining ways of life; it’s thus perhaps not so much a complete rejection of capitalism as a conscious return to Adam Smith’s era of shopkeepers and craftsmen. The prominent American anarchist Murray Bookchin dreams of a return to “community, decentralization, self-sufficiency, mutual aid, and face-to-face democracy” — “a serious challenge to [globalized] society with its vast, hierarchical, sexist, class-ruled state apparatus and militaristic history.” Globalization, he and other anarchists note, often isn’t nearly as efficient as its proselytizers claim. In fact, the extended international supply chains it fosters for even the most basic foodstuffs are often absurdly wasteful in terms of energy and other resources, and brittle to boot, vulnerable to the slightest shock to the globalized system. Why should potatoes which can be grown in almost any back garden in the world need to be shipped in via huge, fuel-guzzling jet airplanes and forty-ton semis? Locally grown agriculture, anarchists point out, can provide eight units of food energy for every one unit of fossil-fuel energy needed to bring it to market, while in many cases exactly the opposite ratio holds true for internationally harvested produce.

But there’s much more going on here philosophically than a concern with the foodstuff supply chain. Modern anarchist thought reflects a deep discomfort with consumer culture, a strand of philosophy we’ve met before in the person of Jean-Jacques Rousseau and his “noble savage.” In truth, Rousseau noted, the only things a person really, absolutely needs to survive are food and shelter. All else is, to paraphrase the Bible, vanity, and all too often brings only dissatisfaction. Back in the eighteenth century, Rousseau could already describe the collector who is never satisfied by the collection he’s assembled, only dissatisfied by its gaps.

What would he make of our times? Today’s world is one of constant beeping enticements — cars, televisions, stereos, computers, phones, game consoles — that bring only the most ephemeral bursts of happiness before we start craving the upgraded model. The anarchist activist Peter Harper:

People aspire to greater convenience and comfort, more personal space, easy mobility, a sense of expanding possibilities. This is the modern consumerist project: what modern societies are all about. It is a central feature of mainstream politics and economics that consumerist aspirations are not seriously challenged. On the contrary, the implied official message is “Hang on in there: we will deliver.” The central slogan is brutally simple: MORE!

Harper claims that, as the rest of the world continues to try and fail to find happiness in the latest shiny objects, anarchists will win them over to their cause by example. For those who reject materialist culture “will quite visibly be having a good time: comfortable, with varied lives and less stress, healthy and fit, having rediscovered the elementary virtues of restraint and balance.”

Doubtless we could all use a measure of restraint and balance in our lives, but the full anarchist project for happiness and sustainability through a deliberate deconstruction of the fruits of progress is so radical — entailing as it does the complete dissolution of nation-states and a return to decentralized communal living — that it’s difficult to fully envision how it could happen absent the sort of monumental precipitating global catastrophe that no one can wish for. While human nature will always be tempted to cast a wistful eye back to an imagined simpler, more elemental past, another, perhaps nobler part of our nature will always look forward with an ambitious eye to a bolder, more exciting future. The oft-idealized life of a tradesman prior to the Industrial Revolution, writes Francis Fukuyama, “involved no glory, dynamism, innovation, or mastery; you just plied the same traditional markets or crafts as your father and grandfather.” For many or most people that may be a fine life, and more power to them. But what of those with bigger dreams, who would spur humanity on to bigger and better things? That is to say, what of the authors of the narrative of progress of the past, present, and future, who aren’t willing to write the whole thing off as fun while it lasted and return to the land? The builders among us will never be satisfied with a return to some agrarian idyll.

The world’s current crisis of faith in progress and in the liberal-democratic principles that are so inextricably bound up with it isn’t the first or the worst of its kind. Not that terribly long ago, Nazi Germany and Imperial Japan posed a far more immediate and tangible threat to liberal democracy all over the world than anything we face today; the American Nazi party was once strong enough to rent and fill Madison Square Garden, a fact which does much to put the recent disconcerting events in Charlottesville in perspective. And yet liberal democracy got through that era all right in the end.

Even in 1983, when the Soviet Union was already teetering on the verge of economic collapse, an unknowing Jean-François Revel could write that “democracy may, after all, turn out to have been an historical accident, a brief parenthesis that is closing before our eyes.” The liberal West’s periods of self-doubt have always seemed to outnumber and outlast its periods of triumphalism, and yet progress has continued its march. During the height of the fascist era, voting rights in many democratic countries were being expanded to include all of their citizens at long last; amidst the gloominess about the future that has marked so much of post-millennial life, longstanding prejudices toward gay and lesbian people have fallen away so fast in the developed West that it’s left even many of our ostensibly progressive politicians scrambling to keep up.

Of course, the fact still remains that our planet’s current wounds are real, and global warming may in the long run prove to be the most dangerous antagonist humanity has ever faced. If we’re unwilling to accept giving up the fruits of progress in the name of healing our planet, where do we go from here? One thing that is clear is that we will have to find different, more sustainable ways of ordering our economies if progress is to continue its march. Capitalism is often praised for its ability to sublimate what Francis Fukuyama calls the megalothymia of the most driven souls among us — the craving for success, achievement, recognition, victory — into the field of business rather than the field of battle. Would other megalothymia sublimators, such as sport, be sufficient in a post-capitalist world? What would a government/economy look like that respects people’s individual freedoms but avoids the environment-damaging, resource-draining externalities of capitalism? No one — certainly not I! — can offer entirely clear answers to these questions today. This is not so much a tribute to anything unique about our current times as it is a tribute to the nature of history itself. Who anticipated Christianity? Who anticipated that we would use the atomic bomb only twice? Who, for that matter, anticipated a President Donald Trump?

One possibility, at least in the short term, is to rejigger the rules of capitalism to bring its most problematic externalities back under the umbrella of the competitive marketplace. Experiments in cap-and-trade, which turn environment-ruining carbon emissions into a scarce commodity that corporations can exchange among themselves, have shown promising results.

But in the longer term, still more than just our economics will have to change. Because the problems of ecology and environment are global problems of a scope we’ve never faced before, we will need to think of ourselves more and more as a global society in order to solve them. In time, the nation-states in which we still invest so much patriotic fervor today may need to go the way of the scattered, self-sufficient settlements of a few dozens or hundreds that marked the earliest stages of the earliest civilizations. In time, the seeds that were planted with the United Nations in the aftermath of the bloodiest of all our stupid, pointless wars may flower into a single truly global civilization.

Really, though, I can’t possibly predict how humanity will progress its way out of its current set of predicaments. I can only have faith in the smarts and drive that have brought us this far. The best we can hope for is probably to muddle through by the skin of our teeth — but then, isn’t that what we’ve always been doing? The first civilizations began as improvised solutions to the problem of a changing climate, and we’ve been making it up as we go along ever since. So, maybe the first truly global civilization will also arise as, you guessed it, an improvised solution to the problem of a changing climate. Even if we’ve met our match with our latest nemesis of human-caused climate change, perhaps it really is better to burn out than to fade away. Perhaps it’s better to go down swinging than to survive at the cost of the grand dream of an eventual trip to Alpha Centauri.

The game which has the fulfillment of that dream as its most soul-stirring potential climax has been oft-chided for promoting a naive view of history — for being Western- and American-centric, for ignoring the plights of the vast majority of the people who have ever inhabited this planet of ours, for ignoring the dangers of the progress it celebrates. It is unquestionably guilty of all these things in whole or in part, and guilty of many more sins against history besides. But I haven’t chosen to dwell overmuch on its many problems in this series of articles because I find its guiding vision of a human race capable of improving itself down through the millennia so compelling and inspiring. Human civilization needs its critics, but it needs its optimists perhaps even more. So, may the optimistic outlook of the narrative of progress last as long as our species, and may we always have to go along with it the optimism of the game of Civilization — or of a Civilization VI, Civilization XVI, or Civilization CXVI — to exhort us to keep on keeping on.

(Sources: the books Civilization, or Rome on 640K A Day by Johnny L. Wilson and Alan Emrich, The End of History and the Last Man by Francis Fukuyama, Democracy: A Very Short Introduction by Bernard Crick, Anarchism: A Very Short Introduction by Colin Ward, Environmental Economics: A Very Short Introduction by Stephen Smith, Globalization: A Very Short Introduction by Manfred B. Steger, Economics: A Very Short Introduction by Partha Dasgupta, Global Economic History: A Very Short Introduction by Robert C. Allen, Capital by Karl Marx, The Social Contract by Jean-Jacques Rousseau, The Genealogy of Morals by Friedrich Nietzsche, Lectures on the Philosophy of History by Georg Wilhelm Friedrich Hegel, The Wealth of Nations by Adam Smith, How Democracies Perish by Jean-François Revel, and The Past is a Foreign Country by David Lowenthal.)
