
The 640 K Barrier

There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.

The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.

— with my apologies to The Right Stuff[1]

The idea that the original IBM PC, the machine that made personal computing safe for corporate America, was a hastily slapped-together stopgap has been vastly overstated by popular technology pundits over the decades since its debut back in August of 1981. Whatever the realities of budgets and scheduling with which its makers had to contend, there was a coherent philosophy behind most of the choices they made that went well beyond “throw this thing together as quickly as possible and get it out there before all these smaller companies corner the market for themselves.” As a design, the IBM PC favored robustness, longevity, and expandability, all qualities IBM had learned the value of through their many years of experience providing businesses and governments with big-iron solutions to their most important data-processing needs. To appreciate the wisdom of IBM’s approach, we need only consider that today, long after the likes of the Commodore Amiga and the original Apple Macintosh architecture, whose owners so loved to mock IBM’s unimaginative beige boxes, have passed into history, most of our laptop and desktop computers — including modern Macs — can trace the origins of their hardware back to what that little team of unlikely business-suited visionaries accomplished in an IBM branch office in Boca Raton, Florida.

But of course no visionary has 20/20 vision. For all the strengths of the IBM PC, there was one area where all the jeering by owners of sexier machines felt particularly well-earned. Here lay a crippling weakness, born not so much of the hardware found in that first IBM PC as of the operating system the marketplace chose to run on it, that would continue to vex programmers and ordinary users for two decades, not finally fading away until Microsoft’s release of Windows XP in 2001 put to bed the last legacies of MS-DOS in mainstream computing. MS-DOS, dubbed the “quick and dirty” operating system during the early days of its development, is likely the piece of software in computing history with the most lopsided contrast between the total number of hours put into its development and the total number of hours it spent in use, on millions and millions of computers all over the world. The 640 K barrier, the demon all those users spent so much time and energy battling for so many years, was just one of the more prominent consequences of corporate America’s adoption of such a blunt instrument as MS-DOS as its standard. Today we’ll unpack the problem that was memory management under MS-DOS, and we’ll also examine the problem’s multifarious solutions, all of them to one degree or another ugly and imperfect.



The original IBM PC was built around an Intel 8088 microprocessor, a cost-reduced and somewhat crippled version of an earlier chip called the 8086. (IBM’s decision to use the 8088 instead of the 8086 would have huge importance for the expansion buses of this and future machines, but the differences between the two chips aren’t important for our purposes today.) Despite functioning as a 16-bit chip in most ways, the 8088 had a 20-bit address space, meaning it could address a maximum of 1 MB of memory. Let’s consider why this limitation should exist.

Memory, whether in your brain or in your computer, is of no use to you if you can’t keep track of where you’ve put things so that you can retrieve them again later. A computer’s memory is therefore indexed by bytes, with every single byte having its own unique address. These addresses, numbered from 0 to the upper limit of the processor’s address space, allow the computer to keep track of what is stored where. Twenty bits can represent 1,048,576 distinct addresses — numbered 0 through 1,048,575 — which works out to exactly 1 MB. Thus 1 MB is the maximum amount of memory which the 8088, with its 20-bit address bus, can handle. Such a limitation hardly felt like a deal breaker to the engineers who created the IBM PC. Indeed, it’s difficult to overemphasize what a huge figure 1 MB really was when they released the machine in 1981, in which year the top-of-the-line Apple II had just 48 K of memory and plenty of other competing machines shipped with no more than 16 K.
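
For the curious: the 8088 forms those 20-bit addresses from two 16-bit values, a segment and an offset, with the physical address being the segment times 16 plus the offset. Here is a minimal C sketch of that arithmetic (my own illustration, not anything from IBM or Intel documentation) showing how the numbers above fall out:

#include <stdio.h>
#include <stdint.h>

/* The 8086/8088 computes: physical = segment * 16 + offset,
   yielding a 20-bit address in the range 0 to 0xFFFFF. */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + (uint32_t)offset;
}

int main(void)
{
    /* The demon's doorstep: segment A000, offset 0. */
    printf("A000:0000 -> %05lX = %lu\n",
           (unsigned long)physical_address(0xA000, 0),
           (unsigned long)physical_address(0xA000, 0));      /* A0000 = 655,360 */

    /* The very top of the 20-bit address space. */
    printf("F000:FFFF -> %05lX = %lu\n",
           (unsigned long)physical_address(0xF000, 0xFFFF),
           (unsigned long)physical_address(0xF000, 0xFFFF)); /* FFFFF = 1,048,575 */
    return 0;
}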

A processor needs to address other sorts of memory besides the pool of general-purpose RAM which is available for running applications. There’s also ROM memory — read-only memory, burned inviolably into chips — that contains essential low-level code needed for the computer to boot itself up, along with, in the case of the original IBM PC, an always-available implementation of the BASIC programming language. (The rarely used BASIC in ROM would be phased out of subsequent models.) And some areas of RAM as well are set aside from the general pool for special purposes, like the fully 128 K of addresses given to video cards to keep track of the onscreen display in the original IBM PC. All of these special types of memory must be accessed by the CPU, must be given their own unique addresses to facilitate that, and must thus be subtracted from the address space available to the general pool.

IBM’s engineers were quite generous in drawing the boundary between their general memory pool and the area of addresses allocated to special purposes. Focused on expandability and longevity as they were, they reserved big chunks of “special” memory for purposes that hadn’t even been imagined yet. In all, they reserved the upper three-eighths of the available addresses for specialized purposes actual or potential, leaving the lower five-eighths — 640 K — to the general pool. In time, this first 640 K of memory would become known as “conventional memory,” the remaining 384 K — some of which would be ROM rather than RAM — as “high memory.” The official memory map which IBM published upon the debut of the IBM PC looked like this:
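
The map itself isn’t reproduced here, but its layout can be sketched roughly as follows. The 640 K and 128 K figures come from the text above; the subdivision of the remaining reserved area follows common descriptions of the early PC, and the exact boundaries varied:

Address (hex)    Size     Use
00000 - 9FFFF    640 K    Conventional memory (the general pool)
A0000 - BFFFF    128 K    Reserved for video cards
C0000 - EFFFF    192 K    Reserved for future expansion (adapter ROMs, etc.)
F0000 - FFFFF     64 K    System ROM (BIOS and the built-in BASIC)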

It’s important to understand when looking at a memory map like this one that the existence of a logical address therein doesn’t necessarily mean that any physical memory is connected to that address in any given real machine. The first IBM PC, for instance, could be purchased with as little as 16 K of conventional memory installed, and even a top-of-the-line machine had just 256 K, leaving most of the conventional-memory space vacant. Similarly, early video cards used just 32 K or 64 K of the 128 K of address space offered to them in high memory. The 640 K barrier was thus only a theoretical limitation early on, one few early users or programmers ever even noticed.

That blissful state of affairs, however, wouldn’t last very long. As IBM’s creations — joined, soon enough, by lots of clones — became the standard for American business, more and more advanced applications appeared, craving more and more memory alongside more and more processing power. Already by 1984 the 640 K barrier had gone from a theoretical to a very real limitation, and customers were beginning to demand that IBM do something about it. In response, IBM that year released the PC/AT, built around Intel’s new 80286 microprocessor, which boasted a 24-bit address space good for 16 MB of memory. To unlock all that potential extra memory, IBM made the commonsense decision to extend the memory map above the specialized high-memory area that ended at 1 MB, making all addresses beyond 1 MB a single pool of “extended memory” available for general use.

Problem solved, right? Well, no, not really — else this would be a much shorter article. Due more to software than hardware, all of this potential extended memory proved not to be of much use for the vast majority of people who bought PC/ATs. To understand why this should be, we need to examine the deadly embrace between the new processor and the old operating system people were still running on it.

The 80286 was designed to be much more than just a faster version of the old 8086/8088. Developing the chip before IBM PCs running MS-DOS had come to dominate business computing, Intel hadn’t allowed the need to stay compatible with that configuration to keep them from designing a next-generation chip that would help to take computing to where they saw it as wanting to go. Intel believed that microcomputers were at the stage at which the big institutional machines had been a couple of decades earlier, just about ready to break free of what computer scientist Brian L. Stuart calls the “Triangle of Ones”: one user running one program at a time on one machine. At the very least, Intel believed, the second leg of the Triangle must soon fall; everyone recognized that multitasking — running several programs at a time and switching freely between them — was a much more efficient way to do complex work than laboriously shutting down and starting up application after application. But unfortunately for MS-DOS, the addition of multitasking complicates the life of an operating system to an absolutely staggering degree.

Operating systems are of course complex subjects worthy of years or a lifetime of study. We might, however, collapse their complexities down to a few fundamental functions: to provide an interface for the user to work with the computer and manage her programs and files; to manage the various tasks running on the computer and allocate resources among them; and to act as a buffer or interface between applications and the underlying hardware of the computer. That, anyway, is what we expect at a minimum of our operating systems today. But for a computer ensconced within the Triangle of Ones, the second and third functions were largely moot: with only one program allowed to run at a time, resource-management concerns were nonexistent, and, without the need for a program to be concerned about clashing with other programs running at the same time, bare-metal programming — manipulating the hardware directly, without passing requests through any intervening layer of operating-system calls — was often considered not only acceptable but the expected approach. In this spirit, MS-DOS provided just 27 function calls to programmers, the vast majority of them dealing only with disk and file management. (Compare that, my fellow programmers, with the modern Windows or OS X APIs!) For everything else, banging on the bare metal was fine.

We can’t even begin here to address all of the complications that are introduced when we add multitasking into the equation, asking the operating system in the process to fully embrace all three of the core functions listed above. Memory management alone, the one aspect we will look deeper into today, becomes complicated enough. A program which is sharing a machine with other programs can no longer have free run of the memory map, placing whatever it wants to wherever it wants to; to do so risks overwriting the code or data of another program running on the system. Instead the operating system must demand that individual programs formally request the memory they’d like to use, and then must come up with a way to keep a program, whether due to bugs or malice, from running roughshod over areas of memory that it hasn’t been granted.

Or perhaps not. The Commodore Amiga, the platform which pioneered multitasking on personal computers in 1985, didn’t so much solve the latter part of this problem as punt it away. An application program is expected to request from the Amiga’s operating system any memory that it requires. The operating system then returns a pointer to a block of memory of the requested size, and trusts the application not to write to memory outside of these bounds. Yet nothing besides the programmer’s skill and good nature absolutely prevents such unauthorized memory access from happening. Every application on the Amiga, in other words, can write to any address in the machine’s memory, whether that address be properly allocated to it or not. Screen memory, free memory, another program’s data, another program’s code — all are fair game to the errant program. Such unauthorized memory access will almost always eventually result in a total system crash. A non-malicious programmer who wants her program to be a good citizen would of course never intentionally write to memory she hasn’t properly requested, but bugs of this nature are notoriously easy to create and notoriously hard to track down, and on the Amiga a single instance of one can bring down not only the offending program but the entire operating system. With all due respect to the Amiga’s importance as the first multitasking personal computer, this is obviously not the ideal way to implement it.

A far more sustainable approach is to take the extra step of tracking and protecting the memory that has been allocated to each program. Memory protection is usually accomplished using what’s known as virtual memory: when a program requests memory, it’s returned not a true address within the system’s memory pool but rather a virtual address that’s translated back into the real address to which it corresponds every time the program accesses its data. Each program is thus effectively sandboxed from everything else, allowed to read from and write to only its own data. Only the lowest levels of the operating system have global access to the memory pool as a whole.
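
A toy illustration of the idea, vastly simplified compared to any real memory-management unit and mine rather than Intel’s, is a lookup table standing between every address a program uses and the real memory behind it:

#include <stdio.h>
#include <stdint.h>

#define PAGE_SIZE 4096
#define NUM_PAGES 16

static uint8_t real_memory[NUM_PAGES * PAGE_SIZE];

/* One program's translation table: virtual page -> real page,
   or -1 for pages the operating system hasn't granted it. */
static int page_table[NUM_PAGES];

static uint8_t *translate(uint32_t virtual_addr)
{
    uint32_t page   = virtual_addr / PAGE_SIZE;
    uint32_t offset = virtual_addr % PAGE_SIZE;
    if (page >= NUM_PAGES || page_table[page] < 0)
        return NULL;                 /* protection fault: not our memory */
    return &real_memory[(uint32_t)page_table[page] * PAGE_SIZE + offset];
}

int main(void)
{
    for (int i = 0; i < NUM_PAGES; i++)
        page_table[i] = -1;          /* nothing granted yet */
    page_table[0] = 7;               /* the OS grants one page, backed by real page 7 */

    *translate(42) = 0xAB;           /* legal: lands safely in real page 7 */
    printf("page 0 access: %s\n", translate(42) ? "ok" : "fault");
    printf("page 5 access: %s\n", translate(5 * PAGE_SIZE) ? "ok" : "fault");
    return 0;
}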

Implementing such memory protection in software alone, however, would have been an untenable drain on the resources available to systems engineers in the 1980s — a fact which goes a long way toward explaining its absence from the Amiga. Intel therefore decided to give software a leg up via hardware. They built into the 80286 a memory-management unit that could automatically translate from virtual to real memory addresses and vice versa, making this constantly ongoing process fairly transparent even to the operating system.

Nevertheless, the operating system must know about this capability, must in fact be written very differently if it’s to run on a CPU with memory protection built into its circuitry. Intel recognized that it would take time for such operating systems to be created for the new chip, and recognized that compatibility with the earlier 8086/8088 chips would be a very good thing to have in the meantime. They therefore built two possible operating modes into the 80286. In “protected mode” — the mode they hoped would eventually come to be used almost universally — the chip’s full potential would be realized, including memory protection and the ability to address up to 16 MB of memory. In “real mode,” the 80286 would function essentially like a turbocharged 8086/8088, with no memory-protection capabilities and with the old limitation on addressable memory of 1 MB still in place. Assuming that in the early days at least the new chip would need to run on operating systems with no knowledge of its full capabilities, Intel made the 80286 default to real mode on startup. An operating system which did know about the 80286 and wanted to bring out its full potential could switch it to protected mode at boot-up and be off to the races.

It was at the intersection between the 80286 and the operating system that Intel’s grand plans for the future of their new chip went awry. An overwhelming percentage of the early 80286s were used in IBM PC/ATs and clones, and an overwhelming percentage of those machines were running MS-DOS. Microsoft’s erstwhile “quick and dirty” operating system knew nothing of the 80286’s full capabilities. Worse, giving it knowledge of those capabilities would entail a complete rewrite that would break compatibility with all existing MS-DOS software. Yet the whole reason MS-DOS was popular in the first place — it certainly wasn’t because of a generous feature set, a friendly interface, or any aesthetic appeal — was that very same huge base of business software. Getting users to make the leap to some hypothetical new operating system in the absence of software to run on it would be as difficult as getting developers to write programs for an operating system with no users. It was a chicken-or-the-egg situation, and neither chicken nor egg was about to stick its neck out anytime soon.

IBM was soon shipping thousands upon thousands of PC/ATs every month, and the clone makers were soon shipping even more 80286-based machines of their own. Yet at least 95 percent of those machines were idling along at only a fraction of their potential, thanks to the already creakily archaic MS-DOS. For all these users, the old 640 K barrier remained as high as ever. They could stuff their machines full of extended memory if they liked, but they still couldn’t access it. And of course the multitasking that the 80286 was supposed to have enabled remained as foreign a concept to MS-DOS as a GPS unit to a Model T. The only solution IBM offered those who complained about the situation was to run another operating system. And indeed, there were a number of alternatives to MS-DOS available for the PC/AT and other 80286-based machines, including several variants of the old institutional-computing favorite Unix — one of them even from Microsoft — and new creations like Digital Research’s Concurrent DOS, which struggled with mixed results to wedge in some degree of MS-DOS compatibility. Still, the only surefire way to take full advantage of MS-DOS’s huge software base was to run the real — in more ways than one now! — MS-DOS, and this is what the vast majority of people with 80286-equipped machines wound up doing.

Meanwhile the very people making the software which kept MS-DOS the only viable choice for most users were feeling the pinch of being confined to 640 K more painfully almost by the month. Finally Lotus Corporation — makers of the Lotus 1-2-3 spreadsheet package that ruled corporate America, the greatest single business-software success story of their era — decided to use their clout to do something about it. They convinced Intel to join them in devising a scheme for breaking the 640 K barrier without abandoning MS-DOS. What they came up with was one mother of an ugly kludge — a description the scheme has in common with virtually all efforts to break through the 640 K barrier.

Looking through the sparsely populated high-memory area which the designers of the original IBM PC had so generously carved out, Lotus and Intel realized it should be possible on almost any extant machine to identify a contiguous 64 K chunk of those addresses which wasn’t being used for anything. This chunk, they decided, would be the gateway to potentially many more megabytes installed elsewhere in the machine. Using a combination of software and hardware, they implemented what’s known as a bank-switching scheme. The 64 K chunk of high-memory addresses was divided into four segments of 16 K, each of which could serve as a lens focused on a 16 K segment of additional memory above and beyond 1 MB. When the processor accessed the addresses in high memory, the data it actually touched would be whatever sections of the additional memory the lenses were currently pointing to. The four lenses could be moved around at will, giving access, albeit in a roundabout way, to however much extra memory the user had installed. The additional memory unlocked by the scheme was dubbed “expanded memory.” The name’s unfortunate similarity to “extended memory” would cause much confusion over the years to come; from here on, we’ll call it by its common acronym of “EMS.”
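
The mechanics are easy to model in miniature. The following C sketch simulates the concept only (it is not the actual LIM EMS driver interface): four 16 K “lenses” inside a fixed 64 K page frame, each of which can be pointed at any 16 K page of a much larger pool:

#include <stdio.h>
#include <stdint.h>

#define EMS_PAGE    (16 * 1024)   /* each lens covers 16 K */
#define FRAME_SLOTS 4             /* four lenses = one 64 K page frame */
#define POOL_PAGES  64            /* 1 MB of expanded memory in this toy */

static uint8_t pool[POOL_PAGES][EMS_PAGE];  /* the memory beyond 1 MB */
static int frame_map[FRAME_SLOTS];          /* where each lens points */

/* The equivalent of asking the EMS driver to remap one lens. */
static void map_page(int slot, int pool_page)
{
    frame_map[slot] = pool_page;
}

/* All reads and writes go through the fixed 64 K frame of addresses;
   which data they actually touch depends on where the lenses point. */
static uint8_t *frame(uint32_t offset_in_frame)
{
    return &pool[frame_map[offset_in_frame / EMS_PAGE]]
                [offset_in_frame % EMS_PAGE];
}

int main(void)
{
    map_page(0, 42);                   /* point lens 0 at pool page 42 */
    *frame(100) = 0x55;                /* write through the frame... */
    map_page(0, 17);                   /* ...swing lens 0 elsewhere... */
    map_page(1, 42);                   /* ...and find the data again via lens 1 */
    printf("%02X\n", *frame(EMS_PAGE + 100));  /* prints 55 */
    return 0;
}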

All those gobs of extra memory wouldn’t quite come for free: applications would have to be altered to check for the existence of EMS memory and make use of it, and there would remain a distinct difference between conventional memory and EMS memory with which programmers would always have to reckon. Likewise, the overhead of constantly moving those little lenses around made EMS memory considerably slower to access than conventional memory. On the brighter side, though, EMS worked under MS-DOS with only the addition of a single device driver during startup. And, since the hardware mechanism for moving the lenses around was completely external to the CPU, it would even work on machines that weren’t equipped with the new 80286.

This diagram shows the different types of memory available on PCs of the mid-1980s. In blue, we see the original 1 MB memory map of the IBM PC. In green, we see a machine equipped with additional extended memory. And in orange we see a machine equipped with additional expanded memory.

Shortly before the scheme made its official debut at a COMDEX trade show in May of 1985, Lotus and Intel convinced a crucial third partner to come aboard: Microsoft. “It’s garbage! It’s a kludge!” said Bill Gates. “But we’re going to do it.” With the combined weight of Lotus, Intel, and Microsoft behind it, EMS took hold as the most practical way of breaking the 640 K barrier. Imperfect and kludgy though it was, software developers hurried to add support for EMS memory to whatever programs of theirs could practically make use of it, while hardware manufacturers rushed EMS memory boards onto the market. EMS may have been ugly, but it was here today and it worked.

At the same time that EMS was taking off, however, extended memory wasn’t going away. Some hardware makers — most notably IBM themselves — didn’t want any part of EMS’s ugliness. Software makers therefore continued to probe at the limits of machines equipped with extended memory, still looking for a way to get at it from within the confines of MS-DOS. What if they momentarily switched the 80286 into protected mode, just for as long as they needed to manipulate data in extended memory, then went back into real mode? It seemed like a reasonable idea — except that Intel, never anticipating that anyone would want to switch modes on the fly like this, had neglected to provide a way to switch an 80286 in protected mode back into real mode. So, proponents of extended memory had to come up with a kludge even uglier than the one that allowed EMS memory to function. They could force the 80286 back into real mode, they realized, by resetting it entirely, just as if the user had rebooted her computer. The 80286 would go through its self-check again — a process that admittedly absorbed precious milliseconds — and then pick back up where it left off. It was, as Microsoft’s Gordon Letwin memorably put it, like “turning off the car to change gears.” It was staggeringly kludgy, it was horribly inefficient, but it worked in its fashion. Given the inefficiencies involved, the scheme was mostly used to implement virtual disks in extended memory, which didn’t need to be accessed nearly as constantly as an application’s data space.

In 1986, the 32-bit 80386, Intel’s latest and greatest chip, made its public bow at the heart of the Compaq Deskpro 386 rather than an IBM machine, a landmark moment signaling the slow but steady shift of business computing’s power center from IBM to Microsoft and the clone makers using their operating system. While working on the new chip, Intel had had time to see how the 80286 was actually being used in the wild, and had faced the reality that MS-DOS was likely destined to be cobbled onto for years to come rather than replaced in its entirety with something better. They therefore made a simple but vitally important change to the 80386 amidst its more obvious improvements. In addition to being able to address an inconceivable total of 4 GB of memory in protected mode thanks to its 32-bit address space, the 80386 could be switched between protected mode and real mode on the fly if one desired, without needing to be constantly reset.

In freeing programmers from that massive inefficiency, the 80386 cracked open the door that much further to making practical use of extended memory in MS-DOS. In 1988, the old EMS consortium of Lotus, Intel, and Microsoft came together once again, this time with the addition to their ranks of the clone manufacturer AST; the absence of IBM is, once again, telling. Together they codified a standard approach to extended memory on 80386 and later processors, which corresponded essentially to the scheme I’ve already described in the context of the 80286, but with a simple command to the 80386 to switch back to real mode replacing the resets. They called it the eXtended Memory Specification; memory accessed in this way soon became known universally as “XMS” memory. Under XMS as under EMS, a new device driver would be loaded into MS-DOS. Ordinary real-mode programs could then call this driver to access extended memory; the driver would handle the switching into protected mode, copy blocks of data from extended memory into conventional memory or vice versa, then switch the processor back to real mode when it was time to return control to the program. It was still inelegant, still a little inefficient, and still didn’t use the capabilities of Intel’s latest processors in anything like the way Intel’s engineers had intended them to be used; true multitasking still remained a pipe dream somewhere off in a shadowy future. Owners of sexier machines like the Macintosh and Amiga, in other words, still had plenty of reason to mock and scoff. In most circumstances, working with XMS memory was actually slower than working with EMS memory. The primary advantage of XMS was that it let programs work with much bigger chunks of non-conventional memory at one time than the four 16 K chunks that EMS allowed. Whether any given program chose EMS or XMS came to depend on which set of advantages and disadvantages best suited its purpose.
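
The contrast with EMS can be drawn in the same toy style. In this sketch (again my own illustration, not the real XMS driver’s calling convention), extended memory is reachable only through opaque handles, with the driver copying whole blocks in and out of conventional memory:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>

#define MAX_HANDLES 8
static uint8_t *handles[MAX_HANDLES];  /* blocks living "above 1 MB" */

static int xms_alloc(size_t bytes)     /* returns a handle, or -1 */
{
    for (int h = 0; h < MAX_HANDLES; h++)
        if (!handles[h]) { handles[h] = calloc(1, bytes); return h; }
    return -1;
}

/* "Switch to protected mode, copy, switch back" collapses to a memcpy here. */
static void xms_copy_in(int h, size_t off, const void *src, size_t n)
{
    memcpy(handles[h] + off, src, n);
}

static void xms_copy_out(int h, size_t off, void *dst, size_t n)
{
    memcpy(dst, handles[h] + off, n);
}

int main(void)
{
    char conventional[32] = "data that won't fit below 640 K";
    int h = xms_alloc(1024 * 1024);    /* a 1 MB block of "extended memory" */
    if (h < 0) return 1;

    xms_copy_in(h, 0, conventional, sizeof conventional);   /* stash it high */
    memset(conventional, 0, sizeof conventional);           /* reuse the low memory */
    xms_copy_out(h, 0, conventional, sizeof conventional);  /* fetch it back */
    printf("%s\n", conventional);
    return 0;
}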

The arrival of XMS along with the ongoing use of EMS memory meant that MS-DOS now had two competing memory-management solutions. Buyers now had to figure out not only whether they had enough extra memory to run a program but whether they had the right kind of extra memory. Ever accommodating, hardware manufacturers began shipping memory boards that could be configured as either EMS or XMS memory — whatever the application you were running at the moment happened to require.

The next stage in the slow crawl toward parity with other computing platforms in the realm of memory management would be the development of so-called “DOS extenders,” software to allow applications themselves to run in protected mode, thus giving them direct access to extended memory without having to pass their requests through an inefficient device driver. An application built using a DOS extender would only need to switch the processor to real mode when it needed to communicate with the operating system. The development of DOS extenders was driven by Microsoft’s efforts to turn Windows, which like seemingly everything else in business computing ran on top of MS-DOS, into a viable alternative to the command line and a viable challenger to the Macintosh. That story is thus best reserved for a future article, when we look more closely at Windows itself. As it is, the story that I’ve told so far today moves us nicely into the era of computer-gaming history we’ve reached on the blog in general.

In said era, the MS-DOS machines that had heretofore been reserved for business applications were coming into homes, where they were often used to play a new generation of games taking advantage of the VGA graphics, sound cards, and mice sported by the latest systems. Less positively, all of the people wanting to play these new games had to deal with the ramifications of a 640 K barrier that could still be skirted only imperfectly. As we’ve seen, both EMS and XMS imposed to one degree or another a performance penalty when accessing non-conventional memory. What with games being the most performance-sensitive applications of all, that made the first 640 K of lightning-fast conventional memory especially precious to them.

In the first couple of years of MS-DOS’s gaming dominance, developers dealt with all of the issues that came attached to using memory beyond 640 K by the simple expedient of not using any memory beyond 640 K. But that solution was compatible neither with developers’ growing ambitions for their games nor with the gaming public’s growing expectations of them.

The first harbinger of what was to come was Origin Systems’s September 1990 release Wing Commander, which in its day was renowned — and more than a little feared — for pushing the contemporary state of the art in hardware to its limits. Even Wing Commander didn’t go so far as to absolutely require memory beyond 640 K, but it did use it to make the player’s audiovisual experience snazzier if it was present. Setting a precedent future games would largely follow, it was quite inflexible in its approach, demanding EMS — as opposed to XMS — memory. In the future, gamers would have to become all too familiar with the differences between the two standards, and how to configure their machines to use one or the other. Setting another precedent, Wing Commander’s “installation guide” included a section on “memory usage” that was required reading in order to get things working properly. In the future, such sections would only grow in length and complexity, and would need to be pored over by long-suffering gamers with far more concentrated attention than anything in the manual that dealt with how to actually play the games they had purchased.

In Accolade’s embarrassing Leisure Suit Larry knockoff Les Manley in: Lost in LA, the title character explains EMS and XMS memory to some nubile companions. The ironic thing was that anyone who wished to play the latest games on an MS-DOS machine really did need to know this stuff, or at least have a friend who did.

Thus began the period of almost a decade, remembered with chagrin but also often with an odd sort of nostalgia by old-timers today, in which gamers spent hours monkeying about with MS-DOS’s “config.sys” and “autoexec.bat” files and swapping in and out various third-party utilities in the hope of squeezing out those last few kilobytes of conventional memory that Game X needed to run. The techniques they came to employ were legion.

In the process of developing Windows, Microsoft had discovered that the kernel of MS-DOS itself, a fairly tiny program thanks to its sheer age, could be stashed into the first 64 K of memory beyond 1 MB and still accessed like conventional memory on an 80286 or later processor in real mode thanks to what was essentially an undocumented technical glitch in the design of those processors. Gamers thus learned to include the line “DOS=HIGH” in their configuration files, freeing up a precious block of conventional memory. Likewise, there was enough unused space scattered around in the 384 K of high memory on most machines to stash many or all of MS-DOS’s device drivers there instead of in conventional memory. Thus “DOS=HIGH” soon became “DOS=HIGH,UMB,” the second parameter telling the computer to make use of these so-called “upper-memory blocks” and thereby save that many kilobytes more.
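
A typical arrangement from the MS-DOS 5/6 era looked something like the following CONFIG.SYS fragment. The directives are the real ones, but the paths and the exact lineup of drivers are illustrative and varied from machine to machine:

REM CONFIG.SYS (illustrative)
DEVICE=C:\DOS\HIMEM.SYS
REM HIMEM.SYS provides XMS services and makes DOS=HIGH possible
DEVICE=C:\DOS\EMM386.EXE RAM
REM on a 386, EMM386 can simulate EMS out of extended memory and enable UMBs
DOS=HIGH,UMB
DEVICEHIGH=C:\DOS\ANSI.SYS
REM DEVICEHIGH stashes a driver in an upper-memory block instead of below 640 K
FILES=30
BUFFERS=20

The matching trick in AUTOEXEC.BAT was the LOADHIGH (or LH) command, which did the same for memory-resident programs like mouse drivers.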

These were the most basic techniques, the starting points. Suffice to say that things got a lot more complicated from there, turning into a baffling tangle of tweaks, some saving mere bytes rather than kilobytes of conventional memory, but all of them important if one was to hope to run games that by 1993 would be demanding 604 K of 640 K for their own use. That owners of machines which by that point typically contained memories in the multi-megabytes should have to squabble with the operating system over mere handfuls of bytes was made no less vexing by being so comically absurd. And every new game seemed to up the ante, seemed to demand that much more conventional memory. Those with a sunnier disposition or a more technical bent of mind treated the struggle to get each successive purchase running as the game before the game, as it were. Everyone else gnashed their teeth and wondered for the umpteenth time if they might not have been better off buying a console where games Just Worked. The only thing that made it all worthwhile was the mixture of relief, pride, and satisfaction that ensued when you finally got it all put together just right and the title screen came up and the intro music sprang to life — if, that is, you’d managed to configure your sound card properly in the midst of all your other travails. Such was the life of the MS-DOS gamer.

Before leaving the issue of the 640 K barrier behind in exactly the way that all those afflicted by it for so many years were so conspicuously unable to do, we have to address Bill Gates’s famous claim, allegedly made at a trade show in 1981, that “640 K ought to be enough for anybody.” The quote has been bandied about for years as computer-industry legend, seeming to confirm as it does the stereotype of Bill Gates as the unimaginative dirty trickster of his industry, as opposed to Steve Jobs the guileless visionary (the truth is, needless to say, far more complicated). Sadly for the stereotypers, however, the story of the quote is similar to all too many legends in the sense that it almost certainly never happened. Gates himself, for one, vehemently denies ever having said any such thing. Fred Shapiro, for another, editor of The Yale Book of Quotations, conducted an exhaustive search for a reputable source for the quote in 2008, going so far as to issue a public plea in The New York Times for anyone possessing knowledge of such a source to contact him. More than a hundred people did so, but none of them could offer up the smoking gun Shapiro sought, and he was left more certain than ever that the comment was “apocryphal.” So, there you have it. Blame Bill Gates all you want for the creaky operating system that was the real root cause of all of the difficulties I’ve spent this article detailing, but don’t ever imagine he was stupid enough to say that. “No one involved in computers would ever say that a certain amount of memory is enough for all time,” said Gates in 2008. Anyone doubting the wisdom of that assertion need only glance at the history of the IBM PC.

(Sources: the books Upgrading and Repairing PCs, 3rd edition by Scott Mueller and Principles of Operating Systems by Brian L. Stuart; Computer Gaming World of June 1993; Byte of January 1982, November 1984, and March 1992; Byte‘s IBM PC special issues of Fall 1985 and Fall 1986; PC Magazine of May 14 1985, January 14 1986, May 30 1989, June 13 1989, and June 27 1989; the episode of the Computer Chronicles television show entitled “High Memory Management”; the online article “The ‘640K’ quote won’t go away — but did Gates really say it?” on Computerworld.)

Footnotes

1. Yes, that is quite possibly the nerdiest thing I’ve ever written.

Ultima VI

After Richard Garriott and his colleagues at Origin Systems finished each Ultima game — after the manic final crunch of polishing and testing, after the release party, after the triumphant show appearances and interviews in full Lord British regalia — there always arose the daunting question of what to do next. Garriott had set a higher standard for the series than that of any of its competitors almost from the very beginning, when he’d publicly declared that no Ultima would ever reuse the engine of its predecessor, that each new entry in the series would represent a significant technological leap over what had come before. And just to add to that pressure, starting with Ultima IV he’d begun challenging himself to make each new Ultima a major thematic statement that also built on what had come before. Both of these bars became harder and harder to meet as the series advanced.

As if that didn’t present enough of a burden, each individual entry in the series came with its own unique psychological hurdles for Garriott to overcome. For example, by the time he started thinking about what Ultima V should be he’d reached the limits of what a single talented young man like himself could design, program, write, and draw all by himself on his trusty Apple II. It had taken him almost a year — a rather uncomfortable year for his brother Robert and the rest of Origin’s management — to accept that reality and to begin to work in earnest on Ultima V with a team of others.

The challenge Garriott faced after finishing and releasing that game in March of 1988 was in its way even more emotionally fraught: the challenge of accepting that, just as he’d reached the limits of what he could do alone on the Apple II a couple of years before, he’d now reached the limits of what any number of people could do on Steve Wozniak’s humble little 8-bit creation. Ultima V still stands today as one of the most ambitious things anyone has ever done on an Apple II; it was hard at the time and remains hard today to imagine how Origin could possibly push the machine much further. Yet that wasn’t even the biggest problem associated with sticking with the platform; the biggest problem could be seen on each monthly sales report, which showed the Apple II’s numbers falling off even faster than those of the Commodore 64, the only other viable 8-bit computer remaining in the American market.

After serving as the main programmer on Ultima V, John Miles made only one major contribution to Ultima VI: the opening sequence. The creepy poster of a pole-dancing centaur hanging on the Avatar’s wall back on Earth has provoked much comment over the years…

Garriott was hardly alone at Origin in feeling hugely loyal to the Apple II, the only microcomputer he’d ever programmed. While most game developers in those days ported their titles to many platforms, almost all had one which they favored. Just as Epyx knew the Commodore 64 better than anyone else, Sierra had placed their bets on MS-DOS, and Cinemaware was all about the Commodore Amiga, Origin was an Apple II shop through and through. Of the eleven games they’d released from their founding in 1983 through to the end of 1988, all but one had been born and raised on an Apple II.

Reports vary on how long and hard Origin tried to make Ultima VI work on the Apple II. Richard Garriott, who does enjoy a dramatic story even more than most of us, has claimed that Origin wound up scrapping nine or even twelve full months of work; John Miles, who had done the bulk of the programming for Ultima V and was originally slated to fill the same role for the sequel, estimated to me that “we probably spent a few months on editors and other utilities before we came to our senses.” At any rate, by March of 1989, the one-year anniversary of Ultima V‘s release, the painful decision had been made to switch not only Ultima VI but all of Origin’s ongoing and future projects to MS-DOS, the platform that was shaping up as the irresistible force in American computer gaming. A slightly petulant but nevertheless resigned Richard Garriott slapped an Apple sticker over the logo of the anonymous PC clone now sitting on his desk and got with the program.

Richard Garriott with an orrery, one of the many toys he kept at the recently purchased Austin house he called Britannia Manor.

Origin was in a very awkward spot. Having frittered away a full year recovering from the strain of making the previous Ultima, trying to decide what the next Ultima should be, and traveling down the technological cul de sac that was now the Apple II, they simply had to have Ultima VI finished — meaning designed and coded from nothing on an entirely new platform — within one more year if the company was to survive. Origin had never had more than a modestly successful game that wasn’t an Ultima; the only way their business model worked was if Richard Garriott every couple of years delivered a groundbreaking new entry in their one and only popular franchise and it sold 200,000 copies or more.

John Miles, lacking a strong background in MS-DOS programming and the C language in which all future Ultimas would be coded, was transferred off the team to get himself up to speed and, soon enough, to work on middleware libraries and tools for the company’s other programmers. Replacing him on the project in Origin’s new offices in Austin, Texas, were Herman Miller and Cheryl Chen, a pair of refugees from the old offices in New Hampshire, which had finally been shuttered completely in January of 1989. It was a big step for both of them to go from coding what until quite recently had been afterthought MS-DOS versions of Origin’s games to taking a place at the center of the most critical project in the company. Fortunately, both would prove more than up to the task.

Just as Garriott had quickly learned to like the efficiency of not being personally responsible for implementing every single aspect of Ultima V, he soon found plenty to like about the switch to MS-DOS. The new platform had four times the memory of the Apple II machines Origin had been targeting before, along with (comparatively) blazing-fast processors, hard drives, 256-color VGA graphics, sound cards, and mice. A series that had been threatening to burst the seams of the Apple II now had room to roam again. For the first time with Ultima VI, time rather than technology was the primary restraint on Garriott’s ambitions.

But arguably the real savior of Ultima VI was not a new computing platform but a new Origin employee: one Warren Spector, who would go on to join Garriott and Chris Roberts — much more on him in a future article — as one of the three world-famous game designers to come out of the little collective known as Origin Systems. Born in 1955 in New York City, Spector had originally imagined for himself a life in academia as a film scholar. After earning his Master’s from the University of Texas in 1980, he’d spent the next few years working toward his PhD and teaching undergraduate classes. But he had also discovered tabletop gaming at university, from Avalon Hill war games to Dungeons & Dragons. When a job as a research archivist which he’d thought would be his ticket to the academic big leagues unexpectedly ended after just a few months, he wound up as an editor and eventually a full-fledged game designer at Steve Jackson Games, maker of card games, board games, and RPGs, and a mainstay of Austin gaming circles. It was through Steve Jackson, like Richard Garriott a dedicated member of Austin’s local branch of the Society for Creative Anachronism, that Spector first became friendly with the gang at Origin; he also discovered Ultima IV, a game that had a profound effect on him. He left Austin in March of 1987 for a sojourn in Wisconsin with TSR, the makers of Dungeons & Dragons, but, jonesing for the warm weather and good barbecue of the city that had become his adopted hometown, he applied for a job with Origin two years later. Whatever role his acquaintance with Richard Garriott and some of the other folks there played in getting him an interview, it certainly didn’t get him a job all by itself; Spector claims that Dallas Snell, Robert Garriott’s right-hand man running the business side of the operation, grilled him for an incredible nine hours before judging him worthy of employment. (“May you never have to live through something like this just to get a job,” he wishes for all and sundry.) Starting work at Origin on April 12, 1989, he was given the role of producer on Ultima VI, the high man on the project totem pole excepting only Richard Garriott himself.

Age 33 and married, Spector was one of the oldest people employed by this very young company; he realized to his shock shortly after his arrival that he had magazine subscriptions older than Origin’s up-and-coming star Chris Roberts. A certain wisdom born of his age, along with a certain cultural literacy born of all those years spent in university circles, would serve Origin well in the seven years he would remain there. Coming into a company full of young men who had grand dreams of, as their company’s tagline would have it, “creating worlds,” but whose cultural reference points didn’t usually reach much beyond Lord of the Rings and Star Wars, Spector was able to articulate Origin’s ambitions for interactive storytelling in a way that most of the others could not, and in time would use his growing influence to convince management of the need for a real, professional writing team to realize those ambitions. In the shorter term — i.e., in the term of the Ultima VI project — he served as some badly needed adult supervision, systematizing the process of development by providing everyone on his team with clear responsibilities and by providing the project as a whole with the when and what of clear milestone goals. The project was so far behind that everyone involved could look forward to almost a year of solid crunch time as it was; Spector figured there was no point in making things even harder by letting chaos reign.

On the Ultima V project, it had been Dallas Snell who had filled the role of producer, but Snell, while an adept organizer and administrator, wasn’t a game designer or a creative force by disposition. Spector, though, proved himself capable of tackling the Ultima VI project from both sides, hammering out concrete design documents from the sometimes abstracted musings of Richard Garriott, then coming up with clear plans to bring them to fruition. In the end, the role he would play in the creation of Ultima VI was as important as that of Garriott himself. Having learned to share the technical burden with Ultima V — or by now to pass it off entirely; he never learned C and would never write a single line of code for any commercial game ever again — Garriott was now learning to share the creative burden as well, another necessary trade-off if his ever greater ambitions for his games were to be realized.

If you choose not to import an Ultima V character into Ultima VI, you go through the old Ultima IV personality test, complete with gypsy soothsayer, to come up with your personal version of the Avatar. By this time, however, with the series getting increasingly plot-heavy and the Avatar’s personality ever more fleshed-out within the games, the personality test was starting to feel a little pointless. Blogger Chet Bolingbroke, the “CRPG Addict,” cogently captured the problems inherent in insisting that all of these disparate Ultima games had the same hero:
 
Then there’s the Avatar. Not only is it unnecessary to make him the hero of the first three games, as if the Sosarians and Britannians are so inept they always need outside help to solve their problems, but I honestly think the series should have abandoned the concept after Ultima IV. In that game, it worked perfectly. The creators were making a meta-commentary on the very nature of playing role-playing games. The Avatar was clearly meant to be the player himself or herself, warped into the land through the “moongate” of his or her computer screen, represented as a literal avatar in the game window. Ultima IV was a game that invited the player to act in a way that was more courageous, more virtuous, more adventurous than in the real world. At the end of the game, when you’re manifestly returned to your real life, you’re invited to “live as an example to thine own people” — to apply the lesson of the seven virtues to the real world. It was brilliant. They should have left it alone.
 
Already in Ultima V, though, they were weakening the concept. In that game, the Avatar is clearly not you, but some guy who lives alone in his single-family house of a precise layout. But fine, you rationalize, all that is just a metaphor for where you actually do live. By Ultima VI, you have some weird picture of a pole-dancing centaur girl on your wall, and you’re inescapably a white male with long brown hair.

Following what had always been Richard Garriott’s standard approach to making an Ultima, the Ultima VI team concentrated on building their technology and then building a world around it before adding a plot or otherwise trying to turn it all into a real game with a distinct goal. Garriott and others at Origin would always name Times of Lore, a Commodore 64 action/CRPG hybrid written by Chris Roberts and published by Origin in 1988, as the main influence on the new Ultima VI interface, the most radically overhauled version of same ever to appear in an Ultima title. That said, it should be noted that Times of Lore itself lifted many or most of its own innovations from The Faery Tale Adventure, David Joiner’s deeply flawed but beautiful and oddly compelling Commodore Amiga action/CRPG of 1987. By way of completing the chain, much of Times of Lore‘s interface was imported wholesale into Ultima VI; even many of the onscreen icons looked exactly the same. The entire game could now be controlled, if the player liked, with a mouse, with all of the keyed commands duplicated as onscreen buttons; this forced Origin to reduce the “alphabet soup” that had been previous Ultima interfaces, which by Ultima V had used every letter in the alphabet plus some additional key combinations, to ten buttons, with the generic “use” as the workhorse taking the place of a multitude of specifics.

Another influence, one which Origin was for obvious reasons less eager to publicly acknowledge than that of Times of Lore, was FTL’s landmark 1987 CRPG Dungeon Master, a game whose influence on its industry can hardly be overstated. John Miles remembers lots of people at Origin scrambling for time on the company’s single Atari ST in order to play it soon after its release. Garriott himself has acknowledged being “ecstatic” for his first few hours playing it at all the “neat new things I could do.” Origin co-opted Dungeon Master‘s graphical approach to inventory management, including the soon-to-be ubiquitous “paper doll” method of showing what characters were wearing and carrying.

Taking a cue from theories about good interface design dating back to Xerox PARC and Apple’s Macintosh design team, The Faery Tale Adventure, Times of Lore, and Dungeon Master had all abandoned “modes”: different interfaces — in a sense entirely different programs — which take over as the player navigates through the game. The Ultima series, like most 1980s CRPGs, had heretofore been full of these modes. There was one mode for wilderness travel; another for exploring cities, towns, and castles; another, switching from a third-person overhead view to a first-person view like Wizardry (or, for that matter, Dungeon Master), for dungeon delving. And when a fight began in any of these modes, the game switched to yet another mode for resolving the combat.

Ultima VI collapsed all of these modes down into a single unified experience. Wilderness, cities, and dungeons now all appeared on a single contiguous map on which combat also occurred, alongside everything else possible in the game; Ultima‘s traditionally first-person dungeons were now displayed using an overhead view like the rest of the game. From the standpoint of realism, this was a huge step back; speaking in strictly realistic terms, either the previously immense continent of Britannia must now be about the size of a small suburb or the Avatar and everyone else there must now be giants, building houses that sprawled over dozens of square miles. But, as we’ve had plenty of occasion to discuss in previous articles, the most realistic game design doesn’t always make the best game design. From the standpoint of creating an immersive, consistent experience for the player, the new interface was a huge step forward.

As the world of Britannia had grown more complex, the need to give the player a unified window into it had grown to match, in ways that were perhaps more obvious to the designers than they might have been to the players. The differences between the first-person view used for dungeon delving and the third-person view used for everything else had become a particular pain. Richard Garriott had this to say about the problems that were already dogging him when creating Ultima V, and the changes he thus chose to make in Ultima VI:

Everything that you can pick up and use [in Ultima V] has to be able to function in 3D [i.e., first person] and also in 2D [third person]. That meant I had to either restrict the set of things players can use to ones that I know I can make work in 3D or 2D, or make them sometimes work in 2D but not always work in 3D or vice versa, or they will do different things in one versus the other. None of those are consistent, and since I’m trying to create an holistic world, I got rid of the 3D dungeons.

Ultima V had introduced the concept of a “living world” full of interactive everyday objects, along with characters who went about their business during the course of the day, living lives of their own. Ultima VI would build on that template. The world was still constructed, jigsaw-like, from piles of tile graphics, an approach dating all the way back to Ultima I. Whereas that game had offered 16 tiles, however, Ultima VI offered 2048, all or almost all of them drawn by Origin’s most stalwart artist, Denis Loubet, whose association with Richard Garriott stretched all the way back to drawing the box art for the California Pacific release of Akalabeth. Included among these building blocks were animated tiles of several frames — so that, for instance, a water wheel could actually spin inside a mill and flames in a fireplace could flicker. Dynamic, directional lighting of the whole scene was made possible by the 256 colors of VGA. While Ultima V had already had a day-to-night cycle, in Ultima VI the sun actually rose in the east and set in the west, and torches and other light sources cast a realistic glow onto their surroundings.

256 of the 2048 tiles from which the world of Ultima VI was built.
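
As a rough sketch of how a tile engine of this sort might be organized, speculative on my part rather than drawn from Origin’s actual code, each map cell holds an index into a catalog of tiles, some of which carry several animation frames:

#include <stdint.h>

#define NUM_TILES  2048
#define MAX_FRAMES 4
#define TILE_W     16   /* assumed tile dimensions for the sketch */
#define TILE_H     16
#define MAP_SIZE   1024 /* assumed world dimensions for the sketch */

/* One catalog entry. Static tiles have frame_count == 1; animated ones
   (a spinning water wheel, flickering flames) have more. Pixels are
   indices into the 256-color VGA palette. */
typedef struct {
    uint8_t frame_count;                        /* assumed >= 1 */
    uint8_t pixels[MAX_FRAMES][TILE_H][TILE_W];
} Tile;

static Tile tiles[NUM_TILES];

/* Wilderness, towns, and dungeons all live on one contiguous grid. */
static uint16_t world_map[MAP_SIZE][MAP_SIZE];

/* Which frame gets drawn depends only on a global clock, so the mill
   wheel turns and the hearth flickers with no per-object scripting. */
static const uint8_t *tile_frame(uint16_t tile_index, uint32_t tick)
{
    const Tile *t = &tiles[tile_index];
    return &t->pixels[tick % t->frame_count][0][0];
}

int main(void)
{
    tiles[5].frame_count = 3;        /* say, a water wheel with 3 frames */
    world_map[10][12] = 5;           /* place it in the world */
    const uint8_t *frame = tile_frame(world_map[10][12], 7); /* 7 % 3 -> frame 1 */
    (void)frame;
    return 0;
}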

In a clear signal of where the series’s priorities now lay, other traditional aspects of CRPGs were scaled back, moving the series further from its roots in tabletop Dungeons & Dragons. Combat, having gotten as complicated and tactical as it ever would with Ultima V, was simplified, with a new “auto-combat” mode included for those who didn’t want to muck with it at all; the last vestiges of distinct character races and classes were removed; ability scores were boiled down to just three numbers for Strength, Dexterity, and Intelligence. The need to mix reagents in order to cast spells, one of the most mind-numbingly boring aspects of a series that had always made you do far too many boring things, was finally dispensed with; I can’t help but imagine legions of veteran Ultima players breathing a sigh of relief when they read in the manual that “the preparation of a spell’s reagents is performed at the moment of spellcasting.” The dodgy parser-based conversation system of the last couple of games, which had required you to try typing in every noun mentioned by your interlocutor on the off chance that it would elicit vital further information, was made vastly less painful by the simple expedient of highlighting in the text those subjects into which you could inquire further.

Inevitably, these changes didn’t always sit well with purists, then or now. Given the decreasing interest in statistics and combat evinced by the Ultima series as time went on, as well as the increasing emphasis on what we might call solving the puzzles of its ever more intricate worlds, some have accused later installments of the series of being gussied-up adventure games in CRPG clothing; “the last real Ultima was Ultima V” isn’t a hard sentiment to find from a vocal minority on the modern Internet. What gives the lie to that assertion is the depth of the world modeling, which makes these later Ultimas flexible in ways that adventure games aren’t. Everything found in the world has, at a minimum, a size, a weight, and a strength. Say, then, that you’re stymied by a locked door. There might be a set-piece solution for the problem in the form of a key you can find, steal, or trade for, but it’s probably also possible to beat the door down with a sufficiently big stick and a sufficiently strong character, or if all else fails to blast it open with a barrel of dynamite. Thus your problems can almost never become insurmountable, even if you screw up somewhere else. Very few other games from Ultima VI‘s day made any serious attempt to venture down this path. Infocom’s Beyond Zork tried, somewhat halfheartedly, and largely failed at it; Sierra’s Hero’s Quest was much more successful at it, but on nothing like the scale of an Ultima. Tellingly, almost all of the “alternate solutions” to Ultima VI‘s puzzles emerge organically from the simulation, with no designer input whatsoever. Richard Garriott:

I start by building a world which you can interact with as naturally as possible. As long as I have the world acting naturally, if I build a world that is prolific enough, that has as many different kinds of natural ways to act and react as possible, like the real world does, then I can design a scenario for which I know the end goal of the story. But exactly whether I have to use a key to unlock the door, or whether it’s an axe I pick up to chop down the door, is largely irrelevant.

The complexity of the world model was such that Ultima VI became the first installment that would let the player get a job to earn money in lieu of the standard CRPG approach of killing monsters and taking their loot. You can buy a sack of grain from a local farmer, take the grain to a mill and grind it into flour, then sell the flour to a baker — or sneak into his bakery at night to bake your own bread using his oven. Even by the standards of today, the living world inside Ultima VI is a remarkable achievement — not to mention a godsend to those of us bored with killing monsters; you can be very successful in Ultima VI whilst doing very little killing at all.

A rare glimpse of Origin’s in-house Ultima VI world editor, which looks surprisingly similar to the game itself.

Plot spoilers begin!

It wasn’t until October of 1989, just five months before the game absolutely, positively had to ship, that Richard Garriott turned his attention to the Avatar’s reason for being in Britannia this time around. The core idea behind the plot came to him during a night out on Austin’s Sixth Street: he decided he wanted to pitch the Avatar into a holy war against enemies who, in classically subversive Ultima fashion, turn out not to be evil at all. In two or three weeks spent locked together alone in a room, subsisting on takeout Chinese food, Richard Garriott and Warren Spector created the “game” part of Ultima VI from this seed, with Spector writing it all down in a soy-sauce-bespattered notebook. Here Spector proved himself more invaluable than ever. He could corral Garriott’s sometimes unruly thoughts into a coherent plan on the page, whilst offering plenty of contributions of his own. And he, almost uniquely among his peers at Origin, commanded enough of Garriott’s respect — was enough of a creative force in his own right — that he could rein in the bad and/or overambitious ideas that in previous Ultimas would have had to be attempted and proved impractical to their originator. Given the compressed development cycle, this contribution too was vital. Spector:

An insanely complicated process, plotting an Ultima. I’ve written a novel, I’ve written [tabletop] role-playing games, I’ve written board games, and I’ve never seen a process this complicated. The interactions among all the characters — there are hundreds of people in Britannia now, hundreds of them. Not only that, but there are hundreds of places and people that players expect to see because they appeared in five earlier Ultimas.

Everybody in the realm ended up being a crucial link in a chain that adds up to this immense, huge, wonderful, colossal world. It was a remarkably complicated process, and that notebook was the key to keeping it all under control.

The chain of information you follow in Ultima VI is, it must be said, far clearer than in any of the previous games. Solving this one must still be a matter of methodically talking to everyone and assembling a notebook full of clues — i.e., of essentially recreating Garriott and Spector’s design notebook — but there are no outrageous intuitive leaps required this time out, nor any vital clues hidden in outrageously out-of-the-way locations. For the first time since Ultima I, a reasonable person can reasonably be expected to solve this Ultima without turning it into a major life commitment. The difference is apparent literally from your first moments in the game: whereas Ultima V dumps you into a hut in the middle of the wilderness — you don’t even know where in the wilderness — with no direction whatsoever, Ultima VI starts you in Lord British’s castle, and your first conversation with him immediately provides you with your first leads to run down. From that point forward, you’ll never be at a total loss for what to do next as long as you do your due diligence in the form of careful note-taking. Again, I have to attribute much of this welcome new spirit of accessibility and solubility to the influence of Warren Spector.

Ultima VI pushes the “Gargoyles are evil!” angle hard early on, going so far as to have the seemingly demonic beasts nearly sacrifice you to whatever dark gods they worship. This of course only makes the big plot twist, when it arrives, all the more shocking.

At the beginning of Ultima VI, the Avatar — i.e., you — is called back to Britannia from his homeworld of Earth yet again by the remarkably inept monarch Lord British to deal with yet another crisis which threatens his land. Hordes of terrifyingly demonic-looking Gargoyles are pouring out of fissures which have opened up in the ground everywhere and making savage war upon the land. They’ve seized and desecrated the eight Shrines of Virtue, and are trying to get their hands on the Codex of Ultimate Wisdom, the greatest symbol of your achievements in Ultima IV.

But, in keeping with the shades of gray the series had begun to layer over the Virtues with Ultima V, nothing is quite as it seems. In the course of the game, you discover that the Gargoyles have good reason to hate and fear humans in general and you the Avatar in particular, even if those reasons are more reflective of carelessness and ignorance on the part of you and Lord British’s peoples than they are of malice. To make matters worse, the Gargoyles are acting upon a religious prophecy — conventional religion tends to take a beating in Ultima games — and have come to see the Avatar as nothing less than the Antichrist in their own version of the Book of Revelation. As your understanding of their plight grows, your goal shifts from that of ridding the land of the Gargoyle scourge by violent means to that of walking them back from attributing everything to a foreordained prophecy and coming to a peaceful accommodation with them.

Ultima VI’s subtitle, chosen very late in the development process, is as subtly subversive as the rest of the plot. Not until very near the end of the game do you realize that The False Prophet is in fact you, the Avatar. As the old cliché says, there are two sides to every story. Sadly, the big plot twist was already spoiled by Richard Garriott in interviews before Ultima VI was even released, so vanishingly few players have ever gotten to experience its impact cold.

When discussing the story of Ultima VI, we shouldn’t ignore the real-world events that were showing up on the nightly news while Garriott and Spector were writing it. Mikhail Gorbachev had just made the impossibly brave decision to voluntarily dissolve the Soviet empire and let its vassal states go their own way, and just like that the Cold War had ended, not in the nuclear apocalypse so many had anticipated as its only possible end game but rather in the most blessed of all anticlimaxes in human history. For the first time in a generation, East was truly meeting West again, and each side was discovering that the other wasn’t nearly as demonic as they had been raised to believe. On the night of November 9, 1989, just as Garriott and Spector were finishing their design notebook, an irresistible tide of mostly young people burst through Berlin’s forbidding Checkpoint Charlie to greet their counterparts on the other side, as befuddled guards, the last remnants of the old order, looked on and wondered what to do. It was a time of extraordinary change and hope, and the message of Ultima VI resonated with the strains of history.

Plot spoilers end.

When Garriott and Spector emerged from their self-imposed quarantine, the first person to whom they gave their notebook was an eccentric character with strong furry tendencies who had been born as David Shapiro, but who was known to one and all at Origin as Dr. Cat. Dr. Cat had been friends with Richard Garriott for almost as long as Denis Loubet, having first worked at Origin for a while when it was still being run out of Richard’s parents’ garage in suburban Houston. A programmer by trade — he had done the Commodore 64 port of Ultima V — Dr. Cat was given the de facto role of head writer for Ultima VI, apparently because he wasn’t terribly busy with anything else at the time. Over the next several months, he wrote most of the dialog for most of the many characters the Avatar would need to speak with in order to finish the game, parceling the remainder of the work out among a grab bag of other programmers and artists, whoever had a few hours or days to spare.

Origin Systems was still populating the games with jokey cameos drawn from Richard Garriott’s friends, colleagues, and family as late as Ultima VI. Thankfully, this along with other aspects of the “programmer text” syndrome would finally end with the next installment in the series, for which a real professional writing team would come aboard. More positively, do note the keyword highlighting in the screenshot above, which spared players untold hours of aggravating noun-guessing.

Everyone at Origin felt the pressure by now, but no one carried a greater weight on his slim shoulders than Richard Garriott. If Ultima VI flopped, or even just wasn’t a major hit, that was that for Origin Systems. For all that he loved to play His Unflappable Majesty Lord British in public, Garriott was hardly immune to the pressure of having dozens of livelihoods dependent on what was at the end of the day, no matter how much help he got from Warren Spector or anyone else, his game. His stress tended to go straight to his stomach. He remembers being in “constant pain”; sometimes he’d just “curl up in the corner.” Having stopped shaving or bathing regularly, strung out on caffeine and junk food, he looked more like a homeless man than a star game designer — much less a regal monarch — by the time Ultima VI hit the homestretch. On the evening of February 9, 1990, with the project now in the final frenzy of testing, bug-swatting, and final-touch-adding, he left Origin’s offices to talk to some colleagues having a smoke just outside. When he opened the security door to return, a piece of the door’s apparatus — in fact, an eight-pound chunk of steel — fell off and smacked him in the head, opening up an ugly gash and knocking him out cold. His panicked colleagues, who at first thought he might be dead, rushed him to the emergency room. Once he had had his head stitched up, he set back to work. What else was there to do?

Ultima VI shipped on time in March of 1990, two years almost to the day after Ultima V, and Richard Garriott’s fears (and stomach cramps) were soon put to rest; it became yet another 200,000-plus-selling hit. Reviews were uniformly favorable if not always ecstatic; it would take Ultima fans, traditionalists that so many of them were, a while to come to terms with the radically overhauled interface that made this Ultima look so different from the Ultimas of yore. Not helping things was the welter of bugs, some of them of the potentially showstopping variety, that the game shipped with (in years to come Origin would become almost as famous for their bugs as for their ambitious virtual world-building). In time, most if not all old-school Ultima fans were comforted as they settled in and realized that at bottom you tackled this one pretty much like all the others, trekking around Britannia talking to people and writing down the clues they revealed until you put together all the pieces of the puzzle. Meanwhile Origin gradually fixed the worst of the bugs through a series of patch disks which they shipped to retailers to pass on to their customers, or to said customers directly if they asked for them. Still, both processes did take some time, and the reaction to this latest Ultima was undeniably a bit muted — a bit conflicted, one might even say — in comparison to the last few games. It perhaps wasn’t quite clear yet where or if the Ultima series fit on these newer computers in this new decade.

Both the muted critical reaction and that sense of uncertainty surrounding the game have to some extent persisted to this day. Firmly ensconced though it apparently is in the middle of the classic run of Ultimas, from Ultima IV through Ultima VII, that form the bedrock of the series’s legacy, Ultima VI is the least cherished of that cherished group today, the least likely to be named as the favorite of any random fan. It lacks the pithy justification for its existence that all of the others can boast. Ultima IV was the great leap forward, the game that dared to posit that a CRPG could be about more than leveling up and collecting loot. Ultima V was the necessary response to its predecessor’s unfettered idealism; the two games together can be seen to form a dialog on ethics in the public and private spheres. And, later, Ultima VII would be the pinnacle of the series in terms not only of technology but also, and even more importantly, in terms of narrative and thematic sophistication. But where does Ultima VI stand in this group? Its plea for understanding rather than extermination is as important and well-taken today as it’s ever been, yet its theme doesn’t follow as naturally from Ultima V as that game’s had from Ultima IV, nor is it executed with the same sophistication we would see in Ultima VII. Where Ultima VI stands, then, would seem to be in a somewhat uncertain no man’s land.

Indeed, it’s hard not to see Ultima VI first and foremost as a transitional work. On the surface, that’s a distinction without a difference; every Ultima, being part of a series that was perhaps more than any other in the history of gaming always in the process of becoming, is a bridge between what had come before and what would come next. Yet in the case of Ultima VI the tautology feels somehow uniquely true. The graphical interface, huge leap though it is over the old alphabet soup, isn’t quite there yet in terms of usability. It still lacks a drag-and-drop capability, for instance, to make inventory management and many other tasks truly intuitive, while the cluttered onscreen display combines vestiges of the old, such as a scrolling textual “command console,” with this still imperfect implementation of the new. The prettier, more detailed window on the world is welcome, but winds up giving such a zoomed-in view in the half of a screen allocated to it that it’s hard to orient yourself. The highlighted keywords in the conversation engine are also welcome, but are constantly scrolling off the screen, forcing you either to lawnmower through the same conversations again and again to be sure not to miss any of them or to jot them down on paper as they appear. There’s vastly more text in Ultima VI than in any of its predecessors, but perhaps the kindest thing to be said about Dr. Cat as a writer is that he’s a pretty good programmer. All of these things would be fixed in Ultima VII, a game — or rather games; there were actually two of them, for reasons we’ll get to when the time comes — that succeeded in becoming everything Ultima VI had wanted to be. To use the old playground insult, everything Ultima VI can do Ultima VII can do better. One thing I can say, however, is that the place the series was going would prove so extraordinary that it feels more than acceptable to me to have used Ultima VI as a way station en route.

But in the even more immediate future for Origin Systems was another rather extraordinary development. This company that the rest of the industry jokingly referred to as Ultima Systems would release the same year as Ultima VI a game that would blow up even bigger than this latest entry in the series that had always been their raison d’être. I’ll tell that improbable story soon, after a little detour into some nuts and bolts of computer technology that were becoming very important — and nowhere more so than at Origin — as the 1990s began.

(Sources: the books Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, The Official Book of Ultima, Second Edition by Shay Addams, and Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector; ACE of April 1990; Questbusters of November 1989, January 1990, March 1990, and April 1990; Dragon of July 1987; Computer Gaming World of March 1990 and June 1990; Origin’s in-house newsletter Point of Origin of August 7 1991. Online sources include Matt Barton’s interviews with Dr. Cat and Warren Spector’s farewell letter from the Wing Commander Combat Information Center‘s document archive. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Ultima VI is available for purchase from GOG.com in a package that also includes Ultima IV and Ultima V.)


Opening the Gold Box, Part 5: All That Glitters is Not Gold

SSI entered 1989 a transformed company. What had been a niche maker of war games for grognards had now become one of the computer-game industry’s major players thanks to the first fruits of the coveted TSR Dungeons & Dragons license. Pool of Radiance, the first full-fledged Dungeons & Dragons CRPG and the first in a so-called “Gold Box” line of same, was comfortably outselling the likes of Ultima V and The Bard’s Tale III, and was well on its way to becoming SSI’s best-selling game ever by a factor of four. To accommodate their growing employee rolls, SSI moved in 1989 from their old offices in Mountain View, California, which had gotten so crowded that some people were forced to work in the warehouse using piles of boxed games for desks, to much larger, fancier digs in nearby Sunnyvale. Otherwise it seemed that all they had to do was keep on keeping on, keep on riding Dungeons & Dragons for all it was worth — and, yes, maybe release a war game here and there as well, just for old times’ sake.

One thing that did become clearer than ever over the course of the year, however, was that not all Dungeons & Dragons products were created equal. Dungeon Masters Assistant Volume II: Characters & Treasures sold just 13,516 copies, leading to the quiet ending of the line of computerized aids for the tabletop game that had been one of the three major pillars of SSI’s original plans for Dungeons & Dragons. A deviation from that old master plan called War of the Lance, an attempt to apply SSI’s experience with war games to TSR’s Dragonlance campaign setting, did almost as poorly, selling 15,255 copies. Meanwhile the “Silver Box” line of action-oriented games, the second of the three pillars, continued to perform well: Dragons of Flame, the second game in the line, sold 55,711 copies. Despite that success, though, 1989 would also mark the end of the line for the Silver Box, due to a breakdown in relations with the British developers behind those games. Going into the 1990s, then, Dungeons & Dragons on the computer would be all about the Gold Box line of turn-based traditional CRPGs, the only one of SSI’s three pillars still standing.

Thankfully, what Pool of Radiance had demonstrated in 1988 the events of 1989 would only confirm. What players seemed to hunger for most of all in the context of Dungeons & Dragons on the computer was literally Dungeons & Dragons on the computer: big CRPGs that implemented as many of the gnarly details of the rules as possible. Even Hillsfar, a superfluous and rather pointless sort of training ground for characters created in Pool of Radiance, sold 78,418 copies when SSI released it in March as a stopgap to give the hardcore something to do while they waited for the real Pool sequel.

Every female warrior knows that cleavage is more important than protection, right?

They didn’t have too long to wait. The big sequel dropped in June in the form of Curse of the Azure Bonds, and it mostly maintained the high design standard set by Pool of Radiance. Contrarians could and did complain that the free-roaming wilderness map of its predecessor had been replaced by a simple menu of locations to visit, but for this player anyway Pool’s overland map always felt more confusing than necessary. A more notable loss in my view is the lack of any equivalent in Curse to the satisfying experience of slowly reclaiming the city of Phlan block by block from the forces of evil in Pool, but that brilliant design stroke was perhaps always doomed to be a one-off. Ditto Pool’s unique system of quests to fulfill, some of them having little or nothing to do with the main plot.

What players did get in Curse of the Azure Bonds was the chance to explore a much wider area around Phlan with the same characters they had used last time, fighting a selection of more powerful and interesting monsters appropriate to their party’s burgeoning skills. At the beginning of the game, the party wakes up with a set of tattoos on their bodies —  the “azure bonds” of the title — and no memory of how they got there. (I would venture to guess that many of us have experienced something similar at one time or another…) It turns out that the bonds can be used to force the characters to act against their own will. Thus the quest is on to get them removed; each of the bonds has a different source, corresponding to a different area you will need to visit and hack and slash your way through in order to have it removed. By the end of Curse, your old Pool characters — or the new ones you created just for this game, who start at level 5 — will likely be in the neighborhood of levels 10 to 12, just about the point in Dungeons & Dragons where leveling up begins to lose much of its interest.

TSR was once again heavily involved in the making of Curse of the Azure Bonds, if not quite to the same extent as Pool of Radiance. As they had for Pool, they provided for Curse an official tie-in novel and tabletop adventure module. I can’t claim to have understood all of the nuances of the plot, such as they are, when I played the game; a paragraph book is once again used, but much of what I was told to read consisted of people I couldn’t remember, or had never known in the first place, babbling on about things I likewise couldn’t remember or had never known. But then, I know nothing about the Forgotten Realms setting other than what I learned in Pool of Radiance and never read the novel, so I’m obviously not the ideal audience. (Believe me, readers, I’ve done some painful things for this blog, but reading a Dungeons & Dragons novel was just a bridge too far…) Still, my cluelessness never interfered with my pleasure in mapping out each area and bashing things with my steadily improving characters; the standard of design in Curse remains as high as the writing remains breathlessly, entertainingly overwrought. Curse of the Azure Bonds did almost as well as its predecessor for SSI, selling 179,795 copies and mostly garnering the good reviews it deserved.

It was only with the third game of the Pool of Radiance series, 1990’s Secret of the Silver Blades, that some of the luster began to rub off of the Gold Box in terms of design, if not quite yet in that ultimate metric of sales. The reasons that Secret is regarded as such a disappointment by so many players — it remains to this day perhaps the least liked of the entire Gold Box line — are worth dwelling on for a moment.

One of the third game’s problems is bound up inextricably with the Dungeons & Dragons rules themselves. Secret of the Silver Blades allows you to take your old party from Pool of Radiance and/or Curse of the Azure Bonds up to level 15, but by this stage gaining a level is vastly less interesting than it was back in the day. Mostly you just get a couple of hit points, some behind-the-scenes improvements in to-hit scores, and perhaps another spell slot or two somewhere. Suffice to say that there’s no equivalent to, say, that glorious moment when you first gain access to the Fireball spell in Pool of Radiance.

The tabletop rules suggest that characters who reach such high levels should cease to concern themselves with dungeon delving in lieu of building castles and becoming generals or political leaders. Scorpia, Computer Gaming World‘s adventure and CRPG columnist, was already echoing these sentiments in the context of the Pool of Radiance series at the conclusion of her article on Curse of the Azure Bonds: “Characters have reached (by game’s end) fairly high levels, where huge amounts of experience are necessary to advance. If character transfer is to remain a part of the series (which I certainly hope it does), then emphasis needs to be placed on role-playing, rather than a lot of fighting. The true heart of AD&D is not rolling the dice, but the relationship between the characters and their world.” But this sort of thing, of course, the Gold Box engine was utterly unequipped to handle. In light of this, SSI probably should have left well enough alone, making Curse the end of the line for the Pool characters, but players were strongly attached to the parties they’d built up and SSI for obvious reasons wanted to keep them happy. In fact, they would keep them happy to the tune of releasing not just one but two more games which allowed players to use their original Pool of Radiance parties. By the time these characters finally did reach the end of the line, SSI would have to set them against the gods themselves in order to provide any semblance of challenge.

But by no means can all of the problems with Secret of the Silver Blades be blamed on high-level characters. The game’s other issues provide an interesting example of the unanticipated effects which technical affordances can have on game design, as well as a snapshot of changing cultures within both SSI and TSR.

A Gold Box map is built on a grid of exactly 16 by 16 squares, some of which can be “special” squares. When the player’s party enters one of the latter, a script runs to make something unusual happen — from something as simple as some flavor text appearing on the screen to something as complicated as an encounter with a major non-player character. The amount of special content allowed on any given map is restricted, however, by a limit on the total size of all the scripts associated with that map, a limit stemming from the tiny memories of 8-bit machines like the Commodore 64 and Apple II.

One of the neat 16 by 16 maps found in Pool of Radiance and Curse of the Azure Bonds.
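To make the constraint concrete, here is a minimal sketch in Python of how such a map might be modeled. The names and the particular budget figure are my own assumptions for illustration, not SSI’s actual data format; the essential points are simply that every map is a fixed 16-by-16 grid and that the scripts attached to its special squares must collectively fit under a hard size limit.

# Hypothetical model of a Gold Box map; the budget figure is invented.
MAP_SIZE = 16
SCRIPT_BUDGET = 2048  # total bytes of script allowed on one map (assumed)

def build_map(special_scripts):
    """special_scripts maps (x, y) coordinates to script bytes."""
    total = sum(len(script) for script in special_scripts.values())
    if total > SCRIPT_BUDGET:
        raise ValueError("too much special content for one map")
    # Ordinary squares get None; special squares carry their script.
    return [[special_scripts.get((x, y))
             for x in range(MAP_SIZE)]
            for y in range(MAP_SIZE)]

A square’s script might print flavor text, stage an encounter, or, as described below, silently teleport the party elsewhere.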

The need for each map to occupy exactly 16 by 16 squares couldn’t help but have a major effect on the designs that were implemented with the Gold Box engine. In Pool of Radiance, for example, the division of the city of Phlan into a set of neat sections, to be cleared out and reclaimed one by one, had its origins as much in these technical restrictions as it did in design methodology. In that case it had worked out fantastically well, but by the time development began on Secret of the Silver Blades all those predictably uniform square maps had begun to grate on Dave Shelley, that game’s lead designer. Shelley and his programmers thus came up with a clever way to escape the system of 16 by 16 dungeons.

One of the things a script could do was to silently teleport the player’s party to another square on the map. Shelley and company realized that by making clever use of this capability they could create dungeon levels that gave the illusion of sprawling out wildly and asymmetrically, like real underground caverns would. Players who came into Secret of the Silver Blades expecting the same old 16 by 16 grids would be surprised and challenged; they could only assume that the Gold Box engine had gotten a major upgrade. From the point of view of SSI, this was the best kind of technology refresh: one that cost them nothing at all. Shelley sketched out a couple of enormous underground complexes for the player to explore, each almost an order of magnitude larger than anything that had been seen in a Gold Box game before.

A far less neat map from Secret of the Silver Blades. It may be more realistic in its way, but which would you rather try to draw on graph paper? It may help you to understand the scale of this map to know that the large empty squares at the bottom and right side of this map each represent a conventional 16 by 16 area like the one shown above.
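In code, the trick might look something like the following sketch, which is again hypothetical Python rather than anything from the engine itself. Several ordinary 16-by-16 maps are chained together by scripts that silently relocate the party, so that stepping onto the right square of one grid deposits you seamlessly into another; the player perceives one huge cavern while the engine never handles anything larger than 16 by 16.

from dataclasses import dataclass

@dataclass
class Party:
    map_id: str
    x: int
    y: int

# Invented example data: silent teleports stitching three grids together.
TELEPORTS = {
    ("cavern_a", 15, 7): ("cavern_b", 0, 7),  # east edge of A continues in B
    ("cavern_b", 15, 3): ("cavern_c", 0, 3),  # and B's east edge into C
}

def move_party(party, dx, dy):
    """Move the party one square, then apply any silent teleport there."""
    party.x += dx
    party.y += dy
    destination = TELEPORTS.get((party.map_id, party.x, party.y))
    if destination:
        # No message is shown, so the seam is invisible to the player.
        # But every entry in this table is itself a script, eating into
        # the very budget that normally pays for interesting encounters.
        party.map_id, party.x, party.y = destination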

But as soon as the team began to implement the scheme, the unintended consequences began to ripple outward. Because the huge maps were now represented internally as a labyrinth of teleports, the hugely useful auto-map had to be disabled for these sections. And never had the auto-map been needed more, for the player who dutifully mapped the dungeons on graph paper could no longer count on them being a certain size; they were constantly spilling off the page, forcing her to either start over or go to work on a fresh page stuck onto the old with a piece of tape. Worst of all, placing all of those teleports everywhere used just about all of the scripting space that would normally be devoted to providing other sorts of special squares. So, what players ended up with was an enormous but mind-numbingly boring set of homogeneous caverns filled with the same handful of dull random-monster encounters, coming up over and over and over. This was not, needless to say, an improvement on what had come before. In fact, it was downright excruciating.

At the same time that this clever technical trick was pushing the game toward a terminal dullness, other factors were trending in the same direction. Shelley himself has noted that certain voices within SSI were questioning whether all of those little extras found in Pool of Radiance and Curse of the Azure Bonds, like the paragraph books and the many scripted special encounters, were really necessary at all — or, at the least, perhaps it wasn’t necessary to do them with quite so much loving care. SSI was onto a good thing with these Gold Box games, said these voices — found mainly in the marketing department — and they ought to strike while the iron was hot, cranking them out as quickly as possible. While neither side would entirely have their way on the issue, the pressure to just make the games good enough rather than great in order to get them out there faster can be sensed in every Gold Box game after the first two. More and more graphics were recycled; fewer and fewer of those extra, special touches showed up. SSI never fully matched Pool of Radiance, much less improved on it, over the course of the ten Gold Box games that followed it. That SSI’s founder and president Joel Billings, as hardcore a gamer as any gaming executive ever, allowed this stagnation to take root is unfortunate, but isn’t difficult to explain. His passion was for the war games he’d originally founded SSI to make; all this Dungeons & Dragons stuff, while a cash cow to die for, was largely just product to him.

A similar complaint could be levied — and has been levied, loudly and repeatedly, by legions of hardcore Dungeons & Dragons fans over the course of decades — against Lorraine Williams, the wealthy heiress who had instituted a coup against Gary Gygax in 1985 to take over TSR. The idea that TSR’s long, slow decline and eventual downfall is due solely to Williams is more than a little dubious, given that Gygax and his cronies had already done so much to mismanage the company down that path before she ever showed up. Still, her list of wise strategic choices, at least after her very wise early decision to finally put Dungeons & Dragons on computers, is not a long one.

At the time they were signing the contract with SSI, TSR had just embarked on the most daunting project in the history of the company: a project to reorganize the Advanced Dungeons & Dragons rules, which had sprawled into eight confusing and sometimes contradictory hardcover books by that point, into a trio of books of relatively streamlined and logically organized information, all of it completely rewritten in straightforward modern English (as opposed to the musty diction of Gary Gygax, which read a bit like a cross between Samuel Johnson and H.P. Lovecraft). The fruits of the project appeared in 1989 in the form of a second-edition Player’s Handbook, Dungeon Master’s Guide, and Monstrous Compendium.

And then, right after expending so much effort to clean things up, TSR proceeded to muddy the second-edition waters even more indiscriminately than they had those of the first edition. Every single character class got its own book, and players with a hankering to play Dungeons & Dragons as a Viking or one of Charlemagne’s paladins were catered to. Indeed, TSR went crazy with campaign settings. By 1993, boxed sets were available to let you play in the Forgotten Realms, in the World of Greyhawk, or in Dragonlance‘s world of Krynn, or to play the game as a Jules Verne-esque science-fiction/fantasy hybrid called Spelljammer. You could also play Dungeons & Dragons as Gothic horror if you bought the Ravenloft set, as vaguely post-apocalyptic dark fantasy if you bought Dark Sun, as a set of tales from the Arabian Nights if you bought Al-Qadim, or as an exercise in surreal Expressionism worthy of Alfred Kubin if you bought Planescape.

Whatever the artistic merits behind all these disparate approaches — and some of them did, it should be said, have much to recommend them over the generic cookie-cutter fantasy that was vanilla Dungeons & Dragons — the commercial pressures that led Lorraine Williams to approve this glut of product aren’t hard to discern. The base of tabletop Dungeons & Dragons players hadn’t grown appreciably for many years. Just the opposite, in fact: it’s doubtful whether even half as many people were actively playing Dungeons & Dragons in 1990 as at the height of the brief-lived fad for the game circa 1982. After the existing player base had dutifully rushed out to buy the new second-edition core books, in other words, very few new players were discovering the game and thus continuing to drive their sales. Unless and until they could find a way to change that situation, the only way for TSR to survive was to keep generating gobs of new product to sell to their existing players. Luckily for them, hardcore Dungeons & Dragons players were tremendously loyal and tremendously dedicated to their hobby. Many would buy virtually everything TSR put out, even things that were highly unlikely ever to make it to their gaming tables, just out of curiosity and to keep up with the state of the art, as it were. It would take two or three years for players to start to evince some fatigue with the sheer volume of product pouring out of TSR’s Lake Geneva offices, much of it sorely lacking in play-testing and basic quality control, and to start giving large swathes of it a miss — and that, in turn, would spell major danger for TSR’s bottom line.

Lorraine Williams wasn’t unaware of the trap TSR’s static customer base represented; on the contrary, she recognized as plainly as anyone that TSR needed to expand into new markets if it was to have a bright long-term future. She made various efforts in that direction even as her company sustained itself by flooding the hardcore Dungeons & Dragons market. In fact, the SSI computer games might be described as one of these efforts — but even those, successful as they were on their own terms, were still playing at least partially to that same old captive market. In 1989, Williams opened a new TSR office on the West Coast in an attempt to break the company out of its nerdy ghetto. Run by Flint Dille, Williams’s brother, TSR West had as one of its primary goals getting Dungeons & Dragons onto television screens or, better yet, onto movie screens. Williams was ironically pursuing the same chimera that her predecessor Gary Gygax — now her sworn, lifetime arch-enemy — had so zealously chased. She was even less successful at it than he had been. Whereas Gygax had managed to get a Saturday morning cartoon on the air for a few seasons, Flint Dille’s operation managed bupkis in three long years of trying.

Another possible ticket to the mainstream, to be pursued every bit as seriously in Hollywood as a Dungeons & Dragons deal, was Buck Rogers, the source of the shared fortune of Lorraine Williams and Flint Dille. Their grandfather had been John F. Dille, owner of a newspaper syndicator known as the National Newspaper Service. In this capacity, the elder Dille had discovered the character that would become Buck Rogers — at the time, he was known as Anthony Rogers — in Armageddon 2419 A.D., a pulp novella written by Philip Francis Nowlan and published in Amazing Stories in 1928. Dille himself had come up with the nickname of “Buck” for the lead character, and convinced Nowlan to turn his adventures in outer space into a comic strip for his syndicator. It ended up running from 1929 until 1967 — only the first ten of those years under the stewardship of Nowlan — and was also turned into very popular radio and movie serials during the 1930s, the height of the character’s popularity. Having managed to secure all of the rights to Buck from a perhaps rather naive Nowlan, John Dille and his family profited hugely.

In marked contrast to her attitude toward TSR’s other intellectual properties, Lorraine Williams’s determination to return Buck Rogers to the forefront of pop culture was apparently born as much from a genuine passion for her family’s greatest legacy as it was from the dispassionate calculus of business. In addition to asking TSR West to lobby — once again fruitlessly, as it would transpire — for a Buck Rogers revival on television or film, she pushed a new RPG through the pipeline, entitled Buck Rogers XXVc and published in 1990. TSR supported the game fairly lavishly for several years in an attempt to get it to take off, releasing source books, adventure modules, and tie-in novels to little avail. With all due deference to Buck Rogers’s role as a formative influence on Star Wars among other beloved contemporary properties, in the minds of the Dungeons & Dragons generation it was pure cheese, associated mainly with the Dille family’s last attempt to revive the character, the hilariously campy 1979 television series Buck Rogers in the 25th Century. The game might have had a chance with some players had Williams been willing to recognize the cheese factor and let her designers play it up, but taken with a straight face? No way.

SSI as well was convinced — or coerced — to adapt the Gold Box engine from fantasy to science fiction for a pair of Buck Rogers computer games, 1990’s Countdown to Doomsday and 1992’s Matrix Cubed. SSI’s designers must have breathed a sigh of relief when they saw that the rules for the Buck Rogers tabletop RPG, much more so than any of TSR’s previous non-Dungeons & Dragons RPGs, had been based heavily on those of the company’s flagship game; thus the process of adaptation wasn’t quite so onerous as it might otherwise have been. That said, most agree that the end results are markedly less interesting than the other Gold Box games when it comes to combat, the very thing at which the engine normally excels; a combat system designed to include magic becomes far less compelling in its absence. Benefiting doubtless from its association with the Dungeons & Dragons Gold Box line, for which enthusiasm remained fairly high, the first Buck Rogers game sold a relatively healthy 51,528 copies; the second managed a somewhat less healthy 38,086 copies.

All of these competing interests do much to explain why TSR, after involving themselves so closely in the development of Pool of Radiance and Curse of the Azure Bonds, withdrew from the process almost entirely after those games and just left SSI to it. And that fact in turn is yet one more important reason why the Gold Box games not only failed to evolve but actually devolved in many ways. TSR’s design staff might not have had a great understanding of computer technology, but they did understand their settings and rules, and had pushed SSI to try to inject at least a little bit of what made for a great tabletop-role-playing experience into the computer games. Absent that pressure, SSI was free to fall back on what they did best — which meant, true to their war-game roots, lots and lots of combat. In both Pool and Curse, random encounters cease on most maps after you’ve had a certain number of them — ideally, just before they get boring. Tellingly, in Secret of the Silver Blades and most of the other later Gold Box games that scheme is absent. The monsters just keep on coming, ad infinitum.

Despite lukewarm reviews that were now starting to voice some real irritation with the Gold Box line’s failure to advance, Secret of the Silver Blades was another huge hit, selling 167,214 copies. But, in an indication that some of those who purchased it were perhaps disappointed enough by the experience not to continue buying Gold Box games, it would be the last of the line to break the 100,000-copy barrier. The final game in the Pool of Radiance series, Pools of Darkness, sold just 52,793 copies upon its release in 1991.

In addition to the four-game Pool series, SSI also released an alternate trilogy of Dungeons & Dragons Gold Box games set in Krynn, the world of the Dragonlance setting. Champions of Krynn was actually released before Secret of the Silver Blades, in January of 1990, and sold 116,693 copies; Death Knights of Krynn was released in 1991 and sold 61,958 copies; and The Dark Queen of Krynn, the very last Gold Box game, was released in 1992 and sold 40,640 copies. Another modest series of two games was developed out-of-house by Beyond Software (later to be renamed Stormfront Studios): Gateway to the Savage Frontier (1991, 62,581 copies sold) and Treasures of the Savage Frontier (1992, 31,995 copies sold). In all, then, counting the two Buck Rogers games but not counting the oddball Hillsfar, SSI released eleven Gold Box games over a period of four years.

While Secret of the Silver Blades still stands as arguably the line’s absolute nadir in design terms, the sheer pace at which SSI pumped out Gold Box games during the latter two years of this period in particular couldn’t help but give all of them a certain generic, interchangeable quality. It all began to feel a bit rote — a bit cheap, in stark contrast to the rarefied atmosphere of a Big Event that had surrounded Pool of Radiance, a game which had been designed and marketed to be a landmark premium product and had in turn been widely perceived as exactly that. Not helping the line’s image was the ludicrous knockoff-Boris Vallejo cover art sported by so many of the boxes, complete with lots of tawny female skin and heaving bosoms. Susan Manley has described the odd and somewhat uncomfortable experience of being a female artist asked to draw this sort of stuff.

They pretty much wanted everybody [female] to be the chainmail-bikini babes, as we called them. I said, “Look, not everybody wants to be a chainmail-bikini babe.” They said, “All the guys want that, and we don’t have very many female players.” I said, “You’re never going to have female players if you continue like this. Functional armor that would actually protect people would play a little bit better.”

Tom [Wahl, SSI’s lead artist] and I actually argued over whether my chest size was average or not, which was an embarrassing conversation to have. He absolutely thought that everybody needed to look like they were stepping out of a Victoria’s Secret catalog if they were female. I said, “Gee, how come all the guys don’t have to be super-attractive?” They don’t look like they’re off of romance-novel covers, let’s put it that way. They get to be rugged, they get to be individual, they get to all have different costumes. They get to all have different hairstyles, but the women all had to have long, flowing locks and lots of cleavage.

By 1991, the Gold Box engine was beginning to seem rather like a relic from technology’s distant past. In a sense, the impression was literally correct. When SSI had begun to build the Gold Box engine back in 1987, the Commodore 64 had still ruled the roost of computer gaming, prompting SSI to make the fateful decision not only to make sure the Gold Box games could run on that sharply limited platform, but also to build most of their development tools on it. Pool of Radiance then appeared about five minutes before the Commodore 64’s popularity imploded in the face of Nintendo. The Gold Box engine did of course run on other platforms, but it remained throughout its life subject to limitations born of its 8-bit origins — things like the aforementioned maps of exactly 16 by 16 squares and the strict bounds on the amount of custom scripting that could be included on a single one of those maps. Even as the rest of the industry left the 8-bit machines behind in 1989 and 1990, SSI was reluctant to do so because the Commodore 64 still made up a major chunk of Gold Box sales: Curse of the Azure Bonds sold 68,622 copies on the Commodore 64, representing more than a third of its total sales, while Secret of the Silver Blades still managed a relatively healthy 40,425 Commodore 64 versions sold. Such numbers likely came courtesy of diehard Commodore 64 owners who had very few other games to buy in an industry that was moving more and more to MS-DOS as its standard platform. SSI was thus trapped for some time in something of a Catch-22, wanting to continue to reap the rewards of being just about the last major American publisher to support the Commodore 64 but having to compromise the experience of users with more powerful machines in order to do so.

SSI had managed to improve the Gold Box graphics considerably by the time of The Dark Queen of Krynn, the last game in the line.

When SSI finally decided to abandon the Commodore 64 in 1991, they did what they could to enhance the Gold Box engine to take advantage of the capabilities of the newer machines, introducing more decorative displays and pictures drawn in 256-color VGA along with some mouse support. Yet the most fundamental limitations changed not at all; the engine was now aged enough that SSI wasn’t enthused about investing in a more comprehensive overhaul. And thus the Gold Box games seemed more anachronistic than ever. As SSI’s competitors worked on a new generation of CRPGs that took advantage of 32-bit processors and multi-megabyte memories, the Gold Box games remained the last surviving relics of the old days of 8 bits and 64 K. Looking at The Dark Queen of Krynn and the technical tour de force that was Origin’s Ultima VII side by side, it’s difficult to believe that the two games were released in the same year, much less that they were, theoretically at least, direct competitors.

It’s of course easy for us to look back today and say what SSI should have done. Instead of flooding the market with so many generic Gold Box games, they should have released just one game every year or eighteen months, each release reflecting a much more serious investment in writing and design as well as real, immediately noticeable technical improvements. They should, in other words, have strained to make every new Gold Box game an event like Pool of Radiance had been in its day. But this had never been SSI’s business model; they had always released lots of games, very few of which sold terribly well by the standard of the industry at large, but whose sales in the aggregate were enough to sustain them. When, beginning with Pool of Radiance, they suddenly were making hits by anybody’s standards, they had trouble adjusting their thinking to their post-Pool situation, had trouble recognizing that they could sell more units and make more money by making fewer but better games. Such is human nature; making such a paradigm shift would doubtless challenge any of us.

Luckily, just as the Gold Box sales began to tail off SSI found an alternative approach to Dungeons & Dragons on the computer from an unlikely source. Westwood Associates was a small Las Vegas-based development company, active since 1985, who had initially made their name doing ports of 8-bit titles to more advanced machines like the Commodore Amiga and Atari ST (among these projects had been ports of Epyx’s Winter Games, World Games, and California Games). What made Westwood unique and highly sought after among porters was their talent for improving their 8-bit source material enough, in terms of both audiovisuals and game play, that the end results would be accepted almost as native sons by the notoriously snobbish owners of machines like the Amiga. Their ambition was such that many publishers came to see the biggest liability of employing them as a tendency to go too far, to such an extent that their ports could verge on becoming new games entirely; for example, their conversion of Epyx’s Temple of Apshai on the Macintosh from turn-based to real-time play was rejected as being far too much of a departure.

Westwood first came to the attention of Gold Box fans when they were given the job of implementing Hillsfar, the stopgap “character training grounds” which SSI released between Pool of Radiance and Curse of the Azure Bonds. Far more auspicious were Westwood’s stellar ports of the mainline Gold Box games to the Amiga, which added mouse support and improved the graphics well before SSI’s own MS-DOS versions made the leap to VGA. But Brett Sperry and Louis Castle, Westwood’s founders, had always seen ports merely as a way of getting their foot in the door of the industry. Already by the time they began working with SSI, they were starting to do completely original games of their own for Electronic Arts and Mediagenic/Activision. (Their two games for the latter, both based on a board-game line called BattleTech, were released under the Infocom imprint, although the “real” Cambridge-based Infocom had nothing to do with them.) Westwood soon convinced SSI as well to let them make an original title alongside the implementation assignments: what must be the strangest of all the SSI Dungeons & Dragons computer games, a dragon flight simulator (!) called Dragon Strike. Released in 1990, it wasn’t quite an abject flop but neither was it a hit, selling 34,296 copies. With their next original game for SSI, however, Westwood would hit pay dirt.

Eye of the Beholder was conceived as Dungeons & Dragons meets Dungeon Master, bringing the real-time first-person game play of FTL’s seminal 1987 dungeon crawl to SSI’s product line. In a measure of just how ahead-of-its-time Dungeon Master had been in terms not only of technology but also of fundamental design, nothing had yet really managed to equal it over the three years since its release. Eye of the Beholder arguably didn’t fully manage that feat either, but it did at the very least come closer than most other efforts — and of course it had the huge advantage of the Dungeons & Dragons license. When a somewhat skeptical SSI sent an initial shipment of 20,000 copies into the distribution pipeline in February of 1991, “they all disappeared” in the words of Joel Billings: “We put them out and boom!, they were gone.” Eye of the Beholder went on to sell 129,234 copies, nicely removing some of the sting from the slow commercial decline of the Gold Box line and, indeed, finally giving SSI a major Dungeons & Dragons hit that wasn’t a Gold Box game. The inevitable sequel, released already in December of 1991, sold a more modest but still substantial 73,109 copies, and a third Eye of the Beholder, developed in-house this time at SSI, sold 50,664 copies in 1993. The end of the line for this branch of the computerized Dungeons & Dragons family came with the pointless Dungeon Hack, a game that, as its name implies, presented its player with an infinite number of generic randomly generated dungeons to hack her way through; it sold 27,110 copies following its release at the end of 1993.

This chart from the April 1991 Software Publishers Association newsletter shows just how quickly Eye of the Beholder took off. Unfortunately, this would mark the last time an SSI Dungeons & Dragons game would be in this position.

Despite their popularity in their heyday, the Eye of the Beholder games in my view have aged less gracefully than their great progenitor Dungeon Master, or for that matter even the early Gold Box games. If what you wished for more than anything when playing Dungeon Master was lots more — okay, any — story and lore to go along with the mapping, the combat, and the puzzles, these may be just the games for you. For the rest of us, though, the Dungeons & Dragons rules make for an awkward fit to real-time play, especially in contrast to Dungeon Master’s designed-from-scratch-for-real-time systems of combat, magic, and character development. The dungeon designs and even the graphics similarly underwhelm; Eye of the Beholder looks a bit garish today in contrast to the clean minimalism of Dungeon Master. The world would have to wait more than another year, until the release of Ultima Underworld, to see a game that truly and comprehensively improved on the model of Dungeon Master. In the meantime, though, the Eye of the Beholder games would do as runners-up for folks who had played Dungeon Master and its sequel and still wanted more, or for those heavily invested in the Dungeons & Dragons rules and/or the Forgotten Realms setting.

For SSI, the sales of the Eye of the Beholder games in comparison to those of the latest Gold Box titles provided all too clear a picture of where the industry was trending. Players were growing tired of the Gold Box games; they hungered after faster-paced CRPGs that were prettier to look at and easier to control. While Eye of the Beholder was still high on the charts, TSR and SSI agreed to extend their original five-year contract, which was due to expire on January 1, 1993, by eighteen months to mid-1994. The short length of the extension may be indicative of growing doubts on the part of TSR about SSI’s ability to keep up with the competition in the CRPG market; one might see it as a way of putting them on notice that the TSR/SSI partnership was by no means set in stone for all time. At any rate, a key provision of the extension was that SSI must move beyond the fading Gold Box engine, must develop new technology to suit the changing times and to try to recapture those halcyon early days when Pool of Radiance ruled the charts and the world of gaming was abuzz with talk of Dungeons & Dragons on the computer. Accordingly, SSI put a bow on the Gold Box era in March of 1993 with the release of Unlimited Adventures, a re-packaging of their in-house development tools that would let diehard Gold Box fans make their own games to replace the ones SSI would no longer be releasing. It sold just 32,362 copies, but would go on to spawn a loyal community of adventure-makers that to some extent still persists to this day. As for what would come next for computerized Dungeons & Dragons… well, that’s a story for another day.

By way of wrapping up today’s story, I should note that my take on the Gold Box games, while I believe it dovetails relatively well with the consensus of the marketplace at the time, is by no means the only one in existence. A small but committed group of fans still loves these games — yes, all of them — for their approach to tactical combat, which must surely mark the most faithful implementation of the tabletop game’s rules for same ever to make it to the computer. “It’s hard to imagine a truly bad game being made with it,” says blogger Chester Bolingbroke — better known as the CRPG Addict — of the Gold Box engine. (Personally, I’d happily nominate Secret of the Silver Blades for that designation.)

Still, even the Gold Box line’s biggest fans will generally acknowledge that the catalog is very front-loaded in terms of innovation and design ambition. For those of you like me who aren’t CRPG addicts, I highly recommend Pool of Radiance and Curse of the Azure Bonds, which together let you advance the same party of characters just about as far as remains fun under the Dungeons & Dragons rules, showing off the engine at its best in the process. If the Gold Box games that came afterward wind up a bit of an anticlimactic muddle, we can at least still treasure those two genuine classics. And if you really do want more Gold Box after playing those two, Lord knows there’s plenty of it out there, enough to last most sane people a lifetime. Just don’t expect any of it to quite rise to the heights of the first games and you’ll be fine.

(Sources: This article is largely drawn from the collection of documents that Joel Billings donated to the Strong Museum of Play, which includes lots of internal SSI documents and some press clippings. Also, the book Designers & Dragons Volume 1 by Shannon Appelcline; Computer Gaming World of September 1989; Retro Gamer 52 and 89; Matt Barton’s video interviews with Joel Billings, Susan Manley, and Dave Shelley and Laura Bowen.

Many of the Gold Box games and the Eye of the Beholder trilogy are available for purchase from GOG.com. You may also wish to investigate The Gold Box Companion, which adds many modern conveniences to the original games.)

 
 


What’s the Matter with Covert Action?

Covert Action‘s cover is representative of the thankfully brief era when game publishers thought featuring real models on their boxes would drive sales. The results almost always ended up looking like bad romance-novel covers; this is actually one of the least embarrassing examples. (For some truly cringeworthy examples of artfully tousled machismo, see the Pirates! reissue or Space Rogue.)

In the lore of gaming there’s a subset of spectacular failures that have become more famous than the vast majority of successful games. From E.T.: The Extra-Terrestrial to Daikatana to Godus, this little rogues’ gallery inhabits its own curious corner of gaming history. The stories behind these games, carrying with them the strong scent of excess and scandal, can’t help but draw us in.

But there are also other, less scandalous cases of notable failure to which some of us continually return for reasons other than schadenfreude. One such case is that of Covert Action, Sid Meier and Bruce Shelley’s 1990 game of espionage. Covert Action, while not a great or even a terribly good game, wasn’t an awful game either. And, while it wasn’t a big hit, nor was it a major commercial disaster. By all rights it should have passed into history unremarked, like thousands of similarly middling titles before and after it. The fact that it has remained a staple of discussion among game designers for some twenty years now in the context of how not to make a game is due largely to Sid Meier himself, a very un-middling designer who has never quite been able to get Covert Action, one of his few disappointing games, out of his craw. Indeed, he dwells on it to such an extent that the game and its real or perceived problems still tend to rear their heads every time he delivers a lecture on the art of game design. The question of just what’s the matter with Covert Action — the question of why it’s not more fun — continues to be asked and answered over and over, in the form of Meier’s own design lectures, extrapolations on Meier’s thesis by others, and even the occasional contrarian apology telling us that, no, actually, nothing’s wrong with Covert Action.

What with piling onto the topic having become such a tradition in design circles, I couldn’t bear to let Covert Action‘s historical moment go by without adding the weight of this article to the pile. But first, the basics for those of you who wouldn’t know Covert Action if it walked up and invited you to dinner.

As I began to detail in my previous article, Covert Action‘s development at MicroProse, the company at which Sid Meier and Bruce Shelley worked during the period in question, was long by the standards of its time, troubled by the standards of any time, and more than a little confusing to track in our own time. Begun in early 1988 as a Commodore 64 game by Lawrence Schick, another MicroProse designer, it was conceived from the beginning as essentially an espionage version of Sid Meier’s earlier hit Pirates! — as a set of mini-games the player engaged in to affect the course of an overarching strategic game. But Schick found that he just couldn’t get the game to work, and moved on to something else. And that would have been that — except that Sid Meier had become intrigued by the idea, and picked it up for his own next project, moving it in the process from the Commodore 64 to MS-DOS, where it would have a lot more breathing room.

In time, though, the enthusiasm of Meier and his assistant designer Bruce Shelley also began to evaporate; they started spending more and more time dwelling on an alternative design. By August of 1989, they were steaming ahead with Railroad Tycoon, and all work on Covert Action for the nonce had ceased.

After Railroad Tycoon was completed and released in April of 1990, Meier and Shelley returned to Covert Action only under some duress from MicroProse’s head Bill Stealey. With the idea that would become Civilization already taking shape in Meier’s head, his enthusiasm for Covert Action was lower than ever, but needs must. As Shelley tells the story, Meier’s priorities were clear in light of the idea he had waiting in the wings. “We’re just getting this game done,” Meier said of Covert Action when Shelley tried to suggest ways of improving the still somehow unsatisfying design. “I’ve got to get this game finished.” It’s hard to avoid the impression that in the end Meier simply gave up on Covert Action. Yet, given the frequency with which he references it to this day, it seems equally clear that that capitulation has never sat well with him.

Covert Action casts you as the master spy Max Remington — or, in a nice nod to gender equality that was still unusual in a game of this era, as Maxine Remington. Max is the guy the CIA calls when they need someone to crack the really tough cases. The game presents you with a series of said tough cases, each involving a plot by some combination of criminal and/or terrorist groups to do something very bad somewhere in the world. Your objective is to figure out what group or groups are involved, figure out precisely what they’re up to, and foil their plot before they bring it to fruition. As usual for a Sid Meier game, you can play on any of four difficulty levels to ensure that everyone, from the rank beginner to the most experienced super-sleuth, can be challenged without being overwhelmed. If you do your job well, you will arrest the person at the top of the plot’s org chart, one of the game’s 26 evil masterminds. Once no more masterminds are left to arrest, Max can walk off into the sunset and enjoy a pleasant retirement, confident that he has made the world a safer place. (If only counter-terrorism was that easy in real life, right?)

The game lets Max/Maxine score with progressively hotter members of the opposite sex as he/she cracks more cases.

The strategic decisions you make in directing the course of your investigation will come to naught if you don’t succeed at the various mini-games. These include rewiring a junction box to tap a suspect’s phone (Covert Action presents us with a weirdly low-tech version of espionage, even for its own day); cracking letter-substitution codes to decipher a suspect’s message traffic; tailing or chasing a suspect’s car; and, in the most elaborate of the mini-games, breaking into a group’s hideaway to either collect intelligence or make an arrest.

Covert Action seems to have all the makings of a good game — perhaps even another classic like its inspiration, Pirates!. But, as Sid Meier and most of the people who have played it agree, it doesn’t ever quite come together to become a holistically satisfying experience.

It’s not immediately obvious just why that should be the case; thus all of the discussion the game has prompted over the years. Meier does have his theory, to which he’s returned enough that he’s come to codify it into a universal design dictum he calls “The Covert Action rule.” For my part… well, I have a very different theory. So, first I’ll tell you about Meier’s theory, and then I’ll tell you about my own.

Meier’s theory hinges on the nature of the mini-games. He doesn’t believe that any of them are outright bad by any means, but does feel that they don’t blend well with the overarching strategic game, resulting in a lumpy stew of an experience that the player has trouble digesting. He’s particularly critical of the breaking-and-entering mini-game — a “mini-game” complicated enough that one could easily imagine it being released as a standalone game for the previous generation of computers (or, for that matter, for Covert Action‘s contemporaneous generation of consoles). Before you begin the breaking-and-entering game, you must choose what Max will carry with him: depending on your goals for this mission, you can give him some combination of a pistol, a sub-machine gun, a camera, several types of grenades, bugs, a Kevlar vest, a gas mask, a safe-cracking kit, and a motion detector. The underground hideaways and safe houses you then proceed to explore are often quite large, and full of guards, traps, and alarms to avoid or foil as you snoop for evidence or try to spirit away a suspect. You can charge in with guns blazing if you like, but, especially at the higher difficulty levels, that’s not generally a recipe for success. This is rather a game of stealth, of lurking in the shadows as you identify the guards’ patrol patterns, the better to avoid or quietly neutralize them. A perfectly executed mission in many circumstances will see you get in and out of the building without having to fire a single shot.

The aspect of this mini-game which Meier pinpoints as its problem is, somewhat ironically, the very ambition and complexity which makes it so impressive when considered alone. A spot of breaking and entering can easily absorb a very tense and intense half an hour of your time. By the time you make it out of the building, Meier theorizes, you’ve lost track of why you went in in the first place — lost track, in other words, of what was going on in the strategic game. Meier codified his theory in what has for almost twenty years been known in design circles as “the Covert Action rule.” In a nutshell, the rule states that “one good game is better than two great ones” in the context of a single game design. Meier believes that the mini-games of Covert Action, and the breaking-and-entering game in particular, can become so engaging and such a drain on the player’s time and energies that they clash with the strategic game; we end up with two “great games” that never make a cohesive whole. This dissonance never allows the player to settle into that elusive sense of total immersion which some call “flow.” Meier believes that Pirates! works where Covert Action doesn’t because the former’s mini-games are much shorter and much less complicated — getting the player back to the big picture, as it were, quickly enough that she doesn’t lose the plot of what the current situation is and what she’s trying to accomplish.

It’s an explanation that makes a certain sense on its face, yet I must say that it’s not one that really rings true to my own experiences with either games in general or Covert Action in particular. Certainly one can find any number of games which any number of players have hugely enjoyed that seemingly violate the Covert Action rule comprehensively. We could, for instance, look to the many modern CRPGs which include “sub-quests” that can absorb many hours of the player’s time, to no detriment to the player’s experience as a whole, at least if said players’ own reports are to be believed. If that’s roaming too far afield from the type of game which Covert Action is, consider the case of the strategy classic X-Com, one of the most frequently cited of the seeming Covert Action rule violators that paradoxically succeed as fun designs. It merges an overarching strategic game with a game of tactical combat that’s far more time-consuming and complicated than even the breaking-and-entering part of Covert Action. And yet it must place high in any ranking of the most beloved strategy games of all time. As we continue to look at specific counterexamples like X-Com or, for that matter, Pirates!, we can only continue to believe in the Covert Action rule by applying lots of increasingly tortured justifications for why this or that seemingly blatant violator nevertheless works as a game. So, X-Com, Meier tells us, works because the strategic game is relatively less complicated than the tactical game, leaving enough of the focus on the tactical game that the two don’t start to pull against one another. And Pirates!, of course, is just the opposite.

I can only say that when the caveats and exceptions to any given rule start to pile up, one is compelled to look back to the substance of the rule itself. As nice as it might be for the designers of Covert Action to believe the game’s biggest problem is that its individual parts were just each too darn ambitious, too darn good, I don’t think that’s the real reason the game doesn’t work.

So, we come back to the original question: just what is the matter with Covert Action? I don’t believe that Covert Action‘s core malady can be found in the mini-games, nor for that matter in the strategic game per se. I rather believe the problem is with the mission design and with the game’s fiction — which, as in so many games, are largely one and the same in this one. The cases you must crack in Covert Action are procedurally generated by the computer, using a set of templates into which are plugged different combinations of organizations, masterminds, and plots to create what is theoretically a virtually infinite number of potential cases to solve. My thesis is that it’s at this level — the level of the game’s fiction — where Covert Action breaks down; I believe that things have already gone awry as soon as the game generates the case it will ask you to solve, well before you make your first move. The, for lack of a better word, artificiality of the cases is never hard to detect. Even before you start to learn which of the limited number of templates are which, the stories just feel all wrong.

Literary critics have a special word, “mimesis,” which they tend to deploy when a piece of fiction conspicuously passes or fails the smell test of immersive believability. Dating back to classical philosophy, “mimesis” technically means the art of “showing” a story — as opposed to “diegesis,” the art of telling. It’s been adopted by theorists of textual interactive fiction as well, as a stand-in for all those qualities of a game’s fiction that help to immerse the player in the story, that help to draw her in. “Crimes against Mimesis” — the name of an influential Usenet post written in 1996 by Roger Giner-Sorolla — are all those things, from problems with the interface to obvious flaws in the story’s logic to things that just don’t ring true somehow, that cast the player jarringly out of the game’s fiction — that reveal, in other words, the mechanical gears grinding underneath the game’s fictional veneer. Covert Action is full of these crimes against mimesis, full of these gears poking above the story’s surface. Groups that should hate each other ally with one another: the Colombian Cartel, the Mafia, the Palestine Freedom Organization (some names have been changed to protect the innocent or not-so-innocent), and the Stasi might all concoct a plot together. Why not? In the game’s eyes, they’re just interchangeable parts with differing labels on the front; they might as well have been called “Group A,” “Group B,” etc. When they send messages to one another, the diction almost always rings horribly, jarringly wrong in the ears of those of us who know what the groups represent. Here’s an example in the form of the Mafia talking like Jihadists.
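
To see just how thin this kind of fiction-making is, consider a toy sketch of a template-driven case generator along the lines described above. This is purely my own illustration in Python, not MicroProse’s actual code, and every group, city, and plot template in it is a made-up placeholder:

import random

# Factions are interchangeable parts: any of them can fill any role in
# any plot, with no model of who would plausibly work with whom.
GROUPS = ["Colombian Cartel", "Mafia", "Palestine Freedom Organization", "Stasi"]
CITIES = ["London", "Cairo", "Bogota", "East Berlin"]
PLOT_TEMPLATES = [
    "{a} hires {b} to smuggle weapons into {city}.",
    "{a} and {b} conspire to kidnap a diplomat in {city}.",
    "{a} finances a bombing planned by {b} in {city}.",
]

def generate_case():
    a, b = random.sample(GROUPS, 2)  # "Group A" and "Group B" in all but name
    plot = random.choice(PLOT_TEMPLATES)
    return plot.format(a=a, b=b, city=random.choice(CITIES))

for _ in range(3):
    print(generate_case())

Run it a few times and you’ll soon have the Stasi bankrolling the Mafia. The generator sees nothing wrong with that, because it has no model of who would plausibly conspire with whom; Covert Action’s real templates were of course far more elaborate, but they suffered from the same blindness.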

If Covert Action had believable, mimetic, tantalizing — or at least interesting — plots to foil, I submit that it could have been a tremendously compelling game, without changing anything else about it. Instead, though, it’s got this painfully artificial box of whirling gears. Writing in the context of the problems of procedural generation in general, Kate Compton has called this the “10,000 Bowls of Oatmeal Problem.”

I can easily generate 10,000 bowls of plain oatmeal, with each oat being in a different position and different orientation, and mathematically speaking they will all be completely unique. But the user will likely just see a lot of oatmeal. Perceptual uniqueness is the real metric, and it’s darn tough. It is the difference between an actor being a face in a crowd scene and a character that is memorable.
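
Compton’s point can be demonstrated quite literally. In the little Python sketch below, which is my own construction rather than hers, every generated bowl is guaranteed to be distinct in its particulars, yet no one would ever perceive any difference:

import random

def generate_bowl(oats=500):
    # Each oat gets a random position and orientation in the bowl.
    return tuple((random.random(), random.random(), random.uniform(0, 360))
                 for _ in range(oats))

bowls = [generate_bowl() for _ in range(10_000)]
print(f"{len(set(bowls))} of {len(bowls)} bowls are mathematically unique")

Ten thousand provably unique bowls; one perceived bowl of oatmeal. Covert Action’s case generator has exactly this property.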

Assuming that we can agree to agree, at least for now, that we’ve hit upon Covert Action‘s core problem, it’s not hard to divine how to fix it. I’m imagining a version of the game that replaces the infinite number of procedurally-generated cases with 25 or 30 hand-crafted plots, each with its own personality and its own unique flavor of intrigue. Such an approach would fix another complaint that’s occasionally levied against Covert Action: that it never becomes necessary to master or even really engage with all of its disparate parts because it’s very easy to rely just on those mini-games you happen to be best at to ferret out all of the relevant information. In particular, you can discover just about everything you need in the files you uncover during the breaking-and-entering game, without ever having to do much of anything in the realm of wire-tapping suspects, tailing them, or cracking their codes. This too feels like a byproduct of the generic templates used to construct the cases, which tend to err on the safe side to ensure that the cases are actually soluble, preferring — justifiably, in light of the circumstances — too many clues to too few. But this complaint could easily be fixed using hand-crafted cases. Different cases could be consciously designed to emphasize different aspects of the game: one case could be full of action, another more cerebral and puzzle-like, etc. This would do yet more to give each case its own personality and to keep the game feeling fresh throughout its length.

The most obvious argument against hand-crafted cases, other than the one, valid only from the developers’ standpoint, of the extra resources it would take to create them, is that it would exchange a game that is theoretically infinitely replayable for one with a finite span. Yet, given that Covert Action isn’t a hugely compelling game in its historical form, one has to suspect that my proposed finite version of it would likely yield more actual hours of enjoyment for the average player than the infinite version. Is a great game that lasts 30 hours and then is over better than a mediocre one that can potentially be played forever? The answer must depend on individual circumstances as well as individual predilections, but I know where I stand, at least as long as this world continues to be full of more cheap and accessible games than I can possibly play.

But then there is one more practical objection to my proposed variation of Covert Action, or rather one ironclad reason why it could never have seen the light of day: this simply isn’t how Sid Meier designs his games. Meier, you see, stands firmly on the other side of a longstanding divide that has given rise to no small dissension over the years in the fields of game design and academic game studies alike.

In academia, the argument has raged for twenty years between the so-called ludologists, who see games primarily as dynamic systems, and the narratologists, who see them primarily as narratives. Yet at its core the debate is actually far older even than that. In the December 1987 issue of his Journal of Computer Game Design, Chris Crawford fired what we might regard as the first salvo in this never-ending war via an article entitled “Process Intensity.” The titular phrase meant, he explained, “the degree to which a program emphasizes processes instead of data.” While all games must have some amount of data — i.e., fixed content, including fixed story content — a more process-intensive game — one that tips the balance further in favor of dynamic code as opposed to static data — is almost always a better game in Crawford’s view. That all games aren’t extremely process intensive, he baldly states, is largely down to the laziness of their developers.

The most powerful resistance to process intensity, though, is unstated. It is a mental laziness that afflicts all of us. Process intensity is so very hard to implement. Data intensity is easy to put into a program. Just get that artwork into a file and read it onto the screen; store that sound effect on the disk and pump it out to the speaker. There’s instant gratification in these data-intensive approaches. It looks and sounds great immediately. Process intensity requires all those hours mucking around with equations. Because it’s so indirect, you’re never certain how it will behave. The results always look so primitive next to the data-intensive stuff. So we follow the path of least resistance right down to data intensity.

Crawford, in other words, is a ludologist all the way. There’s always been a strongly prescriptive quality to the ludologists’ side of the ludology-versus-narratology debate, an ideology of how games ought to be made. Because processing is, to use Crawford’s words again, “the very essence of what a computer does,” the capability that in turn enables the interactivity that makes computer games unique as a medium, games that heavily emphasize processing are purer than those that rely more heavily on fixed data.
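
Crawford’s distinction is easy to make concrete. Here is a deliberately trivial Python illustration of my own devising, delivering the same “content” (a simple wave of the sort a sound effect might be built from) first as data and then as process:

import math

# Data intensity: the wave is pre-baked content, like a sound effect
# stored on disk and pumped out to the speaker.
BAKED_WAVE = [0.0, 0.38, 0.71, 0.92, 1.0, 0.92, 0.71, 0.38, 0.0]

def baked_sample(i):
    return BAKED_WAVE[i % len(BAKED_WAVE)]

# Process intensity: the same wave is computed at runtime, so its pitch
# and volume can be reshaped on the fly.
def computed_sample(i, steps=8, amplitude=1.0):
    return amplitude * math.sin(math.pi * i / steps)

print([round(baked_sample(i), 2) for i in range(9)])
print([round(computed_sample(i), 2) for i in range(9)])

Both lines print the same nine samples, but only the second version can change its pitch or amplitude while the program runs. That flexibility is what Crawford is getting at when he calls processing the “very essence of what a computer does.”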

It’s a view that strikes me as short-sighted in a number of ways. It betrays, first of all, a certain programmer and systems designer’s bias against the artists and writers who craft all that fixed data; I would submit that the latter skills are every bit as worthy of admiration and every bit as valuable on most development teams as the former. Although even Crawford acknowledges that “data endows a game with useful color and texture,” he fails to account for the appeal of games where that very color and texture — we might instead say the fictional context — is the most important part of the experience. He and many of his ludologist colleagues are like most ideologues in failing to admit the possibility that different people may simply want different things, in games as in any other realm. Given the role that fixed stories have come to play in even many of the most casual modern games, too much ludologist rhetoric verges on telling players that they’re wrong for liking the games they happen to like. This is not to apologize for railroaded experiences that give the player no real role to play whatsoever and thereby fail to involve her in their fictions. It’s rather to say that drawing the line between process and data can be more complicated than saying “process good, data bad” and proceeding to act accordingly. Different games are at their best with different combinations of pre-crafted and generative content. Covert Action fails as a game because it draws that line in the wrong place. It’s thanks to the same fallacy, I would argue, that Chris Crawford has been failing for the last quarter century to create the truly open-ended interactive-story system he calls Storytron.

Sid Meier is an endlessly gracious gentleman, and thus isn’t so strident in his advocacy as many other ludologists. But despite his graciousness, there’s no doubt on which side of the divide he stands. Meier’s games never, ever include rigid pre-crafted scenarios or fixed storylines of any stripe. In most cases, this has been fine because his designs have been well-suited to the more open-ended, generative styles of play he favors. Covert Action, however, is the glaring exception, revealing one of the few blind spots of this generally brilliant game designer. Ironically, Meier had largely been drawn to Covert Action by what he calls the “intriguing” problem of its dynamic case generator. The idea of being able to use the computer to do the hard work of generating stories, and thereby to be able to churn out infinite numbers of the things at no expense, has always enticed him. He continues to muse today about a Sherlock Holmes game built using computer-generated cases, working backward from the solution of a crime to create a trail of clues for the player to follow.

Meier is hardly alone in the annals of computer science and game design in finding the problem of automated story-making intriguing. Like his Sherlock Holmes idea, many experiments with procedurally-generated narratives have worked with mystery stories, that most overtly game-like of all literary genres; Covert Action‘s cases as well can be considered variations on the mystery theme.  As early as 1971, Sheldon Klein, a professor at the University of Wisconsin, created something he called an “automatic novel writer” for auto-generating “2100-word murder-mystery stories.” In 1983, Electronic Arts released Jon Freeman and Paul Reiche III’s Murder on the Zinderneuf as one of their first titles; it allowed the player to solve an infinite number of randomly generated mysteries occurring aboard its titular Zeppelin airship. That game’s flaws feel oddly similar to those of Covert Action. As in Covert Action, in Murder on the Zinderneuf the randomized cases never have the resonance of a good hand-crafted mystery story. That, combined with their occasional incongruities and the patterns that start to surface as soon as you’ve played a few times, means that you can never forget their procedural origins. These tales of intrigue never manage to truly intrigue.

Suffice to say that generating believable fictions, whether in the sharply delimited realm of a murder mystery taking place aboard a Zeppelin or the slightly less delimited realm of a contemporary spy thriller, is a tough nut to crack. Even one of the most earnest and concentrated of the academic attempts at tackling the problem, a system called Tale-Spin created by James Meehan at Yale University, continued to generate more unmimetic than mimetic stories after many years of work — and this system was meant only to generate standalone static stories, not interactive mysteries to be solved. And as for Chris Crawford’s Storytron… well, as of this writing it is, as its website says, in a “medically induced coma” for the latest of many massive re-toolings.

In choosing to pick up Covert Action primarily because of the intriguing problem of its case generator and then failing to consider whether said case generator really served the game, Sid Meier may have run afoul of another of his rules for game design, one that I find much more universally applicable than what Meier calls the Covert Action rule. A designer should always ask, Meier tells us, who is really having the fun in a game — the designer/programmer/computer or the player? The procedurally generated cases may have been an intriguing problem for Sid Meier the designer, but they don’t serve the player anywhere near as well as hand-crafted cases might have done.

The model that comes to mind when I think of my ideal version of Covert Action is Killed Until Dead, an unjustly obscure gem from Accolade which, like Murder on the Zinderneuf, I wrote about in an earlier article. Killed Until Dead is very similar to Murder on the Zinderneuf in that it presents the player with a series of mysteries to solve, all of which employ the same cast of characters, the same props, and the same setting. Unlike Murder on the Zinderneuf, however, the mysteries in Killed Until Dead have all been lovingly hand-crafted. They not only hang together better as a result, but they’re full of wit and warmth and the right sort of intrigue — they intrigue the player. If you ask me, a version of Covert Action built along similar lines, full of exciting plotlines with a ripped-from-the-headlines feel, could have been fantastic — assuming, of course, that MicroProse could have found writers and scenario designers with the chops to bring the spycraft to life.

It’s of course possible that my reaction to Covert Action is hopelessly subjective, inextricably tied to what I personally value in games. As my longtime readers are doubtless aware by now, I’m an experiential player to the core, more interested in lived experiences than twiddling the knobs of a complicated system just exactly perfectly. In addition to guaranteeing that I’ll never win any e-sports competitions — well, that and my aging reflexes that were never all that great to begin with — this fact colors the way I see a game like Covert Action. The jarring qualities of Covert Action’s fiction may not bother some of you one bit. And thus the debate about what really is wrong with Covert Action, that strange note of discordance sandwiched between the monumental Sid Meier masterpieces Railroad Tycoon and Civilization, can never be definitively settled. Ditto the more abstract and even more longstanding negotiation between ludology and narratology. Ah, well… if nothing else, it ensures that readers and writers of blogs like this one will always have something to talk about. So, let the debate rage on.

(Sources: the books Expressive Processing by Noah Wardrip-Fruin and On Interactive Storytelling by Chris Crawford; Game Developer of February 2013. Links to online sources are scattered through the article.

If you’d like to enter the Covert Action debate for yourself, you can buy it from GOG.com.)

 
 


Railroad Tycoon

It says much about Sid Meier, a born game designer if ever there was one, that he tended to get some of his best work done when he was allegedly on vacation. A few years after his significant other had lost all track of him on what she thought was a romantic getaway to the Caribbean but he came to see as the ideal chance to research his game Pirates!, another opportunity for couple time went awry in August of 1989, when he spent the entirety of a beach holiday coding a game about railroads on the computer he’d lugged with him. The experience may not have done his relationship any favors, but he did come home with the core of his second masterpiece of a game — a game that would usher in what many old-timers still regard as the golden age of computerized grand strategy.

That Meier felt empowered to spend so much time on a game that featured no war or killing says much about the changing times inside MicroProse, the erstwhile specialist in military simulations and war games he had co-founded with the flamboyant former active-duty Air Force pilot “Wild” Bill Stealey. The first great deviation from the norm for MicroProse had been Meier’s first masterpiece, the aforementioned Pirates! of 1987, which Stealey had somewhat begrudgingly allowed him to make as a palate cleanser between the company’s military games. After that, it had been back to business as usual for a while, with Meier designing a submarine simulator based on a Tom Clancy thriller (the audience synergy of that project was almost too perfect to be believed) and then a flight simulator based on the rampant speculation among aviation buffs about the Air Force’s cutting-edge new stealth fighter (the speculation would almost all prove to be incorrect when the actual stealth fighter was unveiled, leaving MicroProse with a “simulation” of an airplane that had never existed).

Yet by the time the latter game was nearing completion in late 1988, a couple of things were getting hard to ignore. First was the warm reception that had been accorded to Pirates!, the way that entirely new demographics of players who would never have dreamed of buying any of MicroProse’s other games were buying and enjoying this one. And second was the fact that the market for MicroProse’s traditional military simulations, while it had served them well — in fact, served them to the tune of nearly 1 million copies sold of their most successful simulation of all, Sid Meier’s F-15 Strike Eagle — was starting to show signs of having reached its natural limit. If, in other words, MicroProse hoped to continue to increase their sales each year — something the aggressive and ambitious Stealey liked doing even more than he liked making and flying flight simulators — they were going to have to push outside of Stealey’s comfort zone. Accordingly, MicroProse dramatically expanded the scope of their business in the last two years of the 1980s, buying the Firebird and Rainbird software labels from British Telecom and setting up an affiliated-label program for distributing the work of smaller publishers;  Stealey hoped the latter might come in time to rival the similar programs of Electronic Arts and Activision/Mediagenic. In terms of in-house development, meanwhile, MicroProse went from all military games all the time — apart from, that is, the aberration that had been Pirates! — to a half-and-half mixture of games in the old style and games that roamed further afield, in some cases right into the sweet spot that had yielded the big hit Pirates!.

Sid Meier, right, at MicroProse circa early 1990 with tester Russ Cooney.

Thus the first project which Sid Meier took up after finishing F-19 Stealth Fighter was a spy game called Covert Action. Made up like Pirates! of a collection of mini-games, Covert Action was very much in the spirit of that earlier game, but had been abandoned by its original designer Lawrence Schick as unworkable. Perhaps because of its similarities to his own earlier game, Meier thought he could make something out of it, especially if he moved it from the Commodore 64 to MS-DOS, which had become his new development platform with F-19 Stealth Fighter. But Covert Action proved to be one of those frustrating games that just refused to come together, even in the hands of a designer as brilliant as Meier. He therefore started spending more and more of the time he should have been spending on Covert Action tinkering with ideas and prototypes for other games. In the spring of 1989, he coded up a little simulation of a model railroad.

The first person to whom Meier showed his railroad game was Bruce Shelley, his “assistant” at MicroProse and, one senses, something of his protege, to whatever extent a man as quiet and self-effacing as Meier can be pictured cultivating someone for such a role. Prior to coming to MicroProse, Shelley had spent his first six years or so out of university at Avalon Hill, the faded king of the previous decade’s halcyon years of American tabletop war-gaming. MicroProse had for some time been in the habit of hiring refugees from the troubled tabletop world, among them Arnold Hendrick and the aforementioned Lawrence Schick, but Shelley was hardly one of the more illustrious names among this bunch. Working as an administrator and producer at Avalon Hill, he’d had the opportunity to streamline plenty of the games the company had published during his tenure, but had been credited with only one original design of his own, a solitaire game called Patton’s Best. When he arrived at MicroProse in early 1988 — he says his application for employment there was motivated largely by the experience of playing Pirates! — Shelley was assigned to fairly menial tasks, like creating the maps for F-19 Stealth Fighter. Yet something about him clearly impressed Sid Meier. Shortly after F-19 Stealth Fighter was completed, Meier came to Shelley to ask if he’d like to become his assistant. Shelley certainly didn’t need to be asked twice. “Anybody in that office would have died for that position,” he remembers.

Much of Shelley’s role as Meier’s assistant, especially in the early days, entailed being a constantly available sounding board, playing with the steady stream of game prototypes Meier gave to him — Meier always seemed to have at least half a dozen such potential projects sitting on his hard drive alongside whatever project he was officially working on — and offering feedback. It was in this capacity that Shelley first saw the model-railroad simulation, whereupon it was immediately clear to him that this particular prototype was something special, that Meier was really on to something this time. Such was Shelley’s excitement, enthusiasm, and insightfulness that it wouldn’t take long for him to move from the role of Meier’s sounding board to that of his full-fledged co-designer on the railroad game, even as it always remained clear who would get to make the final decision on any question of design and whose name would ultimately grace the box.

It appears to have been Shelley who first discovered Will Wright’s landmark city simulation SimCity. Among the many possibilities it offered was the opportunity to add a light-rail system to your city and watch the little trains driving around; this struck Shelley as almost uncannily similar to Meier’s model railroad. He soon introduced Meier to SimCity, whereupon it became a major influence on the project. The commercial success of SimCity had proved that there was a place in the market for software toys without much of a competitive element, a description which applied perfectly to Meier’s model-railroad simulation at the time. “Yes, there is an audience out there for games that have a creative aspect to them,” Meier remembers thinking. “Building a railroad is something that can really emphasize that creative aspect in a game.”

And yet Meier and Shelley weren’t really happy with the idea of just making another software toy, however neat it was to lay down track and flip signals and watch the little trains drive around. Although both men had initially been wowed by SimCity, they both came to find it a little unsatisfying in the end, a little sterile in its complete lack of an historical context to latch onto or goals to achieve beyond those the player set for herself. At times the program evinced too much fascination with its own opaque inner workings, as opposed to what the player was doing in front of the screen. As part of his design process, Meier likes to ask whether the player is having the fun or whether the computer — or, perhaps better said, the game’s designer — is having the fun. With SimCity, it too often felt like the latter.

In his role as assistant, Shelley wrote what he remembers as a five- or six-page document that outlined a game that he and Meier were calling at that time The Golden Age of Railroads; before release the name would be shortened to the pithier, punchier Railroad Tycoon. Shelley expressed in the document their firm belief that they could and should incorporate elements of a software toy or “god game” into their creation, but that they wanted to make more of a real game out of it than SimCity had been, wanted to provide an economic and competitive motivation for building an efficient railroad. Thus already by this early stage the lines separating a simulation of real trains from one of toy trains were becoming blurred.

Then in August came that fateful beach holiday which Meier devoted to Railroad Tycoon. Over the course of three weeks of supposed fun in the sun, he added to his model-railroad simulation a landscape on which one built the tracks and stations. The landscape came complete with resources that needed to be hauled from place to place, often to be converted into other resources and hauled still further: trains might haul coal from a coal mine to a steel mill, carry the steel that resulted to a factory to be converted into manufactured goods, then carry the manufactured goods on to consumers in a city. When Meier returned from holiday and showed it to him, Shelley found the new prototype, incorporating some ideas from his own recent design document and some new ones of Meier’s making, to be just about the coolest thing he’d ever seen on a computer screen. Shelley:

We went to lunch together, and he said, “We have to make a decision about whether we’re going to do this railroad game or whether we’re going to do the spy game.”

I said, “If you’re asking me, there’s no contest. We’re doing the railroad game. It’s really cool. It’s so much fun. I have zero weight in this company. I don’t have a vote in any meeting. It’s up to you, but I’m ready to go.”

Shelley was so excited by the game that at one point he offered to work on it for free after hours if that was the only way to get it done. Thankfully, it never came to that.

Dropping Covert Action, which had already eaten up a lot of time and resources, generated considerable tension with Stealey, but when it came down to it, it was difficult for him to say no to his co-founder and star designer, the only person at the company who got his name in big letters on the fronts of the boxes. (Stealey, who had invented the tactic of prefixing “Sid Meier’s” to Meier’s games as a way of selling the mold-busting Pirates!, was perhaps by this point wondering what it was he had wrought.) The polite fiction which would be invented for public consumption had it that Meier shifted to Railroad Tycoon while the art department created the graphics for Covert Action. In reality, though, he just wanted to escape a game that refused to come together in favor of one that seemed to have all the potential in the world.

Meier and Shelley threw themselves into Railroad Tycoon. When not planning, coding — this was strictly left to Meier, as Shelley was a non-programmer — or playing the game, they were immersing themselves in the lore and legends of railroading: reading books, visiting museums, taking rides on historic steam trains. The Baltimore area, where MicroProse’s offices were located, is a hotbed of railroad history, being the home of the legendary Baltimore and Ohio Railroad, the oldest common-carrier rail network in the country. Thus there was plenty in the area for a couple of railroad buffs to see and do. Meier and Shelley lived and breathed trains for a concentrated six months, during which they, in tandem with a few artists and other support personnel, took Railroad Tycoon from that August prototype to the finished, boxed game that shipped to stores in April of 1990, complete with a beefy 180-page manual written by Shelley. Leaving aside all of Railroad Tycoon‘s other merits, it was a rather breathtaking achievement just to have created a game of such ambition and complexity in such a short length of time.

But even in ways apart from its compressed development time Railroad Tycoon is far more successful than it has any right to be. It’s marked by a persistent, never entirely resolved tension — one might even say an identity crisis — between two very different visions of what a railroad game should be. To say that one vision was primarily that of Meier and the other that of Shelley is undoubtedly a vast oversimplification, but is nevertheless perhaps a good starting point for discussion.

One vision of Railroad Tycoon is what we might call the operational game, the building game, or the SimCity-like game, consisting of laying down stations, tracks, and switches, scheduling your trains, and watching over them as they run in real time. A certain kind of player can spend hours tinkering here, trying always to set up the most efficient possible routes, overriding switches on the fly to push priority cargoes through to their destinations for lucrative but intensely time-sensitive rewards. None of this is without risk: if you don’t do things correctly, trains can hurtle into one another, tumble off of washed-out bridges, or just wind up costing you more money than they earn. It’s therein, of course, where the challenge lies. This is Meier’s vision of Railroad Tycoon, still rooted in the model-railroad simulation he first showed Shelley back in early 1989.

The other vision of Railroad Tycoon is the game of high-level economic strategy, which first began to assert itself in that design document Bruce Shelley wrote up in mid-1989. In addition to needing to set up profitable routes and keep an eye on your expenses, you also need to judge when to sell bonds to fund expansion and when to buy them back to save the interest payments, when to buy and sell your own stock and that of other railroads to maximize your cash reserves. Most of all, you need to keep a close eye on the competition, who, if you’ve turned the “cutthroat competition” setting on, will try to buy your railroad out from under you by making runs on your stock — that is, when they aren’t building track into your stations, setting up winner-take-all “rate wars.”

This vision of Railroad Tycoon owes much to a board game called 1830: Railways and Robber Barons which Shelley had shepherded through production during his time at Avalon Hill. Although that game was officially designed by Francis Tresham, Shelley had done much to help turn it into the classic many board-game connoisseurs still regard it as today. After Shelley had arrived at MicroProse with his copy of 1830 in tow, it had become a great favorite during the company’s occasional board-game nights. While 1830 traded on the iconography of the Age of Steam, it was really a game of stock-market manipulation; the railroads in the game could have been swapped out for just about any moneymaking industry.

Put very crudely, then, Railroad Tycoon can be seen as 1830 with a SimCity-like railroad simulation grafted on in place of the board game’s pure abstractions. Bill Stealey claims that Eric Dott, the president of Avalon Hill, actually called him after Railroad Tycoon‘s release to complain that “you’re doing my board game as a computer game.” Stealey managed to smooth the issue over; “well, don’t let it happen again” were Dott’s parting words. (This would become a problem when Meier and Shelley promptly did do it again, creating a computer game called Civilization that shared a name as well as other marked similarities with the Avalon Hill board game Civilization.)

Immense though its influence was, some of the elements of 1830 came to Railroad Tycoon shockingly late. Meier insists, for instance, that the three computerized robber barons you compete against were coded up in a mad frenzy over the last two weeks before the game had to ship. Again, it’s remarkable that Railroad Tycoon works at all, much less works as well as it does.

The problem of reconciling the two halves of Railroad Tycoon might have seemed intractable to many a design team. Consider the question of time. The operational game would seemingly need to run on a scale of days and hours, as trains chug around the tracks picking up and delivering constant streams of cargo. Yet the high-level economic game needs to run on a scale of months and years. A full game of Railroad Tycoon lasts a full century, over the course of which Big Changes happen on a scale about a million miles removed from the progress of individual trains down the tracks: the economy booms and crashes and booms again; coal and oil deposits are discovered and exploited and exhausted; cities grow; new industries develop; the Age of Steam gives way to the Age of Diesel; competitors rise and fall and rise again. “You can’t have a game that lasts a hundred years and be running individual trains,” thought Meier and Shelley initially. If they tried to run the whole thing at the natural scale of the operational game, they’d wind up with a game that took a year or two of real-world time to play and left the player so lost in the weeds of day-to-day railroad operations that the bigger economic picture would get lost entirely.

Meier’s audacious solution was to do the opposite, to run the game as a whole at the macro scale of the economic game. This means that, at the beginning of the game when locomotives are weak and slow, it might take six months for a train to go from Baltimore to Washington, D.C. What ought to be one day of train traffic takes two years in the game’s reckoning of time. As a simulation, it’s ridiculous, but if we’re willing to see each train driving on the map as an abstraction representing many individual trains — or, for that matter, if we’re willing to not think about it at all too closely — it works perfectly well. Meier understood that a game doesn’t need to be a literal simulation of its subject to evoke the spirit of its subject — that experiential gaming encompasses more than simulations. Railroad Tycoon is, to use the words of game designer Michael Bate, an “aesthetic simulation” of railroad history.
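
A bit of back-of-the-envelope arithmetic shows just how much work that abstraction is doing. The figures in this little Python sketch are my own assumptions rather than anything taken from the game’s code:

# Six game-months for the Baltimore-to-Washington run described above,
# versus roughly two hours for a real early locomotive covering ~40 miles
# at the Planet's ~20 miles per hour.
GAME_TRIP_HOURS = 6 * 30 * 24
REAL_TRIP_HOURS = 2

print(f"One map train stands in for ~{GAME_TRIP_HOURS // REAL_TRIP_HOURS:,} real trips")

Seen this way, the six-month trip stops being an absurdity: each train on the map is best read as a stand-in for a couple of thousand real-world train movements.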

Different players inevitably favor different sides of Railroad Tycoon‘s personality. When I played the game again for the first time in a very long time a year or so ago, I did so with my wife Dorte. Wanting to take things easy our first time out, we played without cutthroat competition turned on, in which mode the other railroads just do their own thing without actively trying to screw with your own efforts. Dorte loved designing track layouts and setting up chains of cargo deliveries for maximum efficiency; the process struck her, an inveterate puzzler, as the most delightful of puzzles. After we finished that game and I suggested we play again with cutthroat competition turned on, explaining how it would lead to a much more, well, cutthroat economic war, she said that the idea had no appeal whatsoever for her. Thus was I forced to continue my explorations of Railroad Tycoon on my own. The game designer Soren Johnson, by contrast, has told in his podcast Designer Notes how uninterested he was in the operational game, preferring to just spend some extra money to double-track everything and as much as possible forget it existed. It was rather the grand strategic picture that interested him. As for me, wishy-washy character that I am, I’m somewhere in the middle of these two extremes.

One of the overarching themes of Sid Meier’s history as a game designer is a spirit of generosity, a willingness to let his players play their way. Railroad Tycoon provides a wonderful example in the lengths to which it goes to accommodate the Dortes, the Sorens, and the Jimmys. If you want to concentrate on the operational game, you can turn off cutthroat competition, turn on “dispatcher operations,” set the overall difficulty to its lowest level so that money is relatively plentiful, and have at it. If even that winds up entailing more economics than you’d like to concern yourself with, one of Railroad Tycoon‘s worst-kept secrets is an “embezzlement key” that can provide limitless amounts of cash, allowing you essentially to play it as the model-railroad simulation that it was at its genesis. If, on the other hand, you’re interested in Railroad Tycoon primarily as a game of grand economic strategy, you can turn on cutthroat competition, turn off dispatcher operations, crank the difficulty level up, and have a full-on business war that would warm the cockles of Jack Tramiel’s heart. If you’re a balanced (or wishy-washy) fellow like me, you can turn on cutthroat competition and dispatcher operations and enjoy the full monty. Meier and Shelley added something called priority shipments to the game — one-time, extremely lucrative deliveries from one station to another — to give players like me a reason to engage with the operational game even after their tracks and routes are largely set. Priority deliveries let you earn a nice bonus by manually flipping signals and shepherding a train along — but, again, only if you enjoy that sort of thing; a budding George Soros can earn as much or more by playing the stock market just right.

One story from Railroad Tycoon’s development says much about Sid Meier’s generous spirit. Through very nearly the entirety of the game’s development, Meier and Shelley had planned to limit the amount of time you could play at the lower difficulty levels as a way of rewarding players who were willing to tackle the challenge of the higher levels. Such a restriction meant not only that players playing at the lower difficulty levels had less time to build their railroad network, but that they lost the chance to play with the most advanced locomotives, which only become available late in the game. Almost literally at the very last possible instant, Meier decided to nix that scheme, to allow all players to play for the full 100 years. Surely fans of the operational game should have access to the cool later trains as well. After all, these were the very people who would be most excited by them. The change came so late that the manual describes the old scheme and the in-game text is also often confused about how long you’re actually going to be allowed to play. It was a small price to pay for a decision that no one ever regretted.

That said, Railroad Tycoon does have lots of rough edges like this confusion over how long you’re allowed to play, an obvious byproduct of its compressed development cycle. Meier and Shelley and their playtesters had nowhere near enough time to make the game air-tight; there are heaps of exploits big enough to drive a Mallet locomotive through (trust me, that’s a big one!). It didn’t take players long to learn that they could wall off competing railroads behind cages of otherwise unused track and run wild in virgin territory on their own; that they could trick their competitors into building in the most unfavorable region of the map by starting to build there themselves, then tearing up their track and starting over competition-free in better territory; that the best way to make a lot of money was to haul nothing but passengers and mail, ignoring all of the intricacies of hauling resources that turned into other resources that turned into still other resources; that they could do surprisingly well barely running any trains at all, just by playing the market, buying and selling their competitors’ stock; that they could play as a real-estate instead of a railroad tycoon, buying up a bunch of land during economic panics by laying down track they never intended to use, then selling it again for a profit during boom times by tearing up the track. Yes, all of these exploits and many more are possible — and yes, the line between exploits and ruthless strategy is a little blurred in many of these cases. But it’s a testament to the core appeal of the game that, after you get over that smug a-ha! moment of figuring out that they’re possible, you don’t really want to use them all that much. The journey is more important than the destination; something about Railroad Tycoon makes you want to play it fair and square. You don’t even mind overmuch that your computerized competitors get to play a completely different and, one senses, a far easier game than the one you’re playing. They’re able to build track in useful configurations that aren’t allowed to you, and they don’t even have to run their own trains; all that business about signals and congestion and locomotives gets abstracted away for them.

Despite it all, I’m tempted to say that in terms of pure design Railroad Tycoon is actually a better game than Civilization, the game Meier and Shelley would make next and the one which will, admittedly for some very good reasons, always remain the heart of Meier’s legacy as a designer. Yet it’s Railroad Tycoon that strikes me as the more intuitive, playable game, free of the tedious micromanagement that tends to dog Civilization in its latter stages. Likewise absent in Railroad Tycoon is the long anticlimax of so many games of Civilization, when you know you’ve won but still have to spend hours mopping up the map before you can get the computer to recognize it. Railroad Tycoon benefits enormously from its strict 100-year time limit, as it does from the restriction of your railroad, born from technical limitations, to 32 trains and 32 stations. “You don’t need more than that to make the game interesting,” said Meier, correctly. And, whereas the turn-based Civilization feels rather like a board game running on the computer, the pausable real time of Railroad Tycoon makes it feel like a true born-digital creation.

175 years of railroad history, from the Planet…

…to the Train à Grande Vitesse.

Of course, mechanics and interface are far from the sum total of most computer games, and it’s in the contextual layer that Civilization thrives as an experiential game, as an awe-inspiring attempt to capture the sum total of human thought and history in 640 K of memory. But, having said that, I must also say that Railroad Tycoon is itself no slouch in this department. It shows almost as beautifully as Civilization how the stuff of history can thoroughly inform a game that isn’t trying to be a strict simulation of said history. From the manual to the game itself, Railroad Tycoon oozes with a love of trains. To their credit, Meier and Shelley don’t restrict themselves to the American Age of Steam, but also offer maps of Britain and continental Europe on which to play, each with its own challenges in terms of terrain and economy. The four available maps each have a different starting date, between them covering railroad history from the distant past of 1825 to the at-the-time-of-the-game’s-development near-future of 2000. As you play, new locomotives become available, providing a great picture of the evolution of railroading, from Robert Stephenson’s original 20-horsepower Planet with its top speed of 20 miles per hour to the 8000-horsepower French Train à Grande Vitesse (“high-speed train”) with a top speed of 160 miles per hour. This is very much a trainspotter’s view of railroad history, making no attempt to address the downsides of the rush to bind nations up in webs of steel tracks, nor asking just why the historical personages found in the game came to be known as the robber barons. (For an introduction to the darker side of railroad history, I recommend Frank Norris’s 1901 novel The Octopus.) But Railroad Tycoon isn’t trying to do social commentary; it just revels in a love of trains, and that’s fine. It’s immensely likeable on those terms — another byproduct of the spirit of generosity with which it’s so shot through. Just hearing the introduction’s music makes me happy.


Upon its release, Railroad Tycoon hit with the force of a freight train. Following the implosion of the 8-bit market at the tail end of the 1980s, North American computer gaming had moved upscale to focus on the bigger, more expensive MS-DOS machines and the somewhat older demographic that could afford them. These changes had created a hunger for more complicated, ambitious strategy and simulation games. SimCity had begun to scratch that itch, but its non-competitive nature and that certain sense of sterility that clung to it left it ultimately feeling a little underwhelming for many players, just as it had for Meier and Shelley. Railroad Tycoon remedied both of those shortcomings with immense charm and panache. Soren Johnson has mentioned on his podcast how extraordinary the game felt upon its release: “There was just nothing like it at the time.” Computer Gaming World, the journal of record for the new breed of older and more affluent computer gamers, lavished praise on Railroad Tycoon, naming it their “Game of the Year” for 1990. Russell Sipe, the founder and editor-in-chief of the magazine, was himself a dedicated trainspotter, and took to the game with particular enthusiasm, writing an entire book about it which spent almost as much time lingering lovingly over railroad lore as it did telling you how to win the thing.

Meier and Shelley were so excited by what they had wrought that they charged full steam ahead into a Railroad Tycoon II. But they were soon stopped in their tracks by Bill Stealey, who demanded that they do something with Covert Action, into which, he insisted, MicroProse had poured too many resources to be able to simply abandon it. By the time that Meier and Shelley had done what they could in that quarter, the idea that would become Civilization had come to the fore. Neither designer would ever return to Railroad Tycoon during their remaining time at MicroProse, although some of the ideas they’d had for the sequel, like scenarios set in South America and Africa, would eventually make their way into a modestly enhanced 1993 version of the game called Railroad Tycoon Deluxe.

Coming as it did just before Civilization, the proverbial Big Moment of Sid Meier’s illustrious career, Railroad Tycoon’s historical legacy has been somewhat obscured by the immense shadow cast by its younger sibling; even Meier sometimes speaks of Railroad Tycoon today in terms of “paving the way for Civilization.” Yet in my view it’s every bit as fine a game, and when all is said and done its influence on later games has been very nearly as great. “At the beginning of the game you had essentially nothing, or two stations and a little piece of track,” says Meier, “and by the end of the game you could look at this massive spiderweb of trains and say, ‘I did that.’” Plenty of later games would be designed to scratch precisely the same itch. Indeed, Railroad Tycoon spawned a whole sub-genre of economic strategy games, the so-called “Tycoon” sub-genre — more often than not that word seems to be included in the games’ names — that persists to this day. Sure, the sub-genre has yielded its share of paint-by-numbers junk, but it’s also yielded its share of classics to stand alongside the original Railroad Tycoon. Certainly it’s hard to imagine such worthy games as Transport Tycoon or RollerCoaster Tycoon — not to mention the post-MicroProse Railroad Tycoon II and 3 — existing without the example provided by Sid Meier and Bruce Shelley.

But you don’t need to look to gaming history for a reason to play the original Railroad Tycoon. Arguably the finest strategy game yet made for a computer at the time of its release, it surely remains high up in that ranking even today.

(Sources: the books Game Design Theory and Practice by Richard Rouse III, The Official Guide to Sid Meier’s Railroad Tycoon by Russell Sipe, and Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay; ACE of May 1990; Compute!’s Gazette of May 1989; Computer Gaming World of May 1990, July/August 1990, and September 1990; Soren Johnson’s interviews with Bruce Shelley and Sid Meier. My huge thanks go to Soren for providing me with the raw audio of his Sid Meier interview months before it went up on his site, thus giving me a big leg up on my research.

Railroad Tycoon Deluxe has been available for years for free from 2K Games’s website as a promotion for Meier’s more recent train game Railroads!. It makes a fine choice for playing today. But for anyone wishing to experience the game in its original form, I’ve taken the liberty of putting together a download of the original game, complete with what should be a working DOSBox configuration and some quick instructions on how to get it running.)
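
(For anyone who would rather assemble such a setup by hand, the kind of DOSBox configuration I have in mind looks something like the sketch below. It is only a sketch: the cycles figure is a guess at a period-appropriate machine, and the mount path and executable name are placeholders to be swapped for whatever your own copy of the game actually contains.

    # Sketch of a dosbox.conf for a 1990-vintage MS-DOS game.
    [cpu]
    core=auto
    # Fixed cycles approximate the slower hardware the game was tuned for;
    # leaving this at "auto" or "max" can make an early game run absurdly fast.
    cycles=3000

    [autoexec]
    # Placeholder path and executable name; adjust both for your installation.
    mount c /path/to/railroad-tycoon
    c:
    RR.EXE

The cycles setting is the one that matters most: like many games of its era, Railroad Tycoon may run too quickly if the emulated processor is allowed to run flat out, so it pays to rein it in to something like period speed.)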
