

The 640 K Barrier

There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.

The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.

— with my apologies to The Right Stuff1

The idea that the original IBM PC, the machine that made personal computing safe for corporate America, was a hastily slapped-together stopgap has been vastly overstated by popular technology pundits over the decades since its debut back in August of 1981. Whatever the realities of budgets and scheduling with which its makers had to contend, there was a coherent philosophy behind most of the choices they made that went well beyond “throw this thing together as quickly as possible and get it out there before all these smaller companies corner the market for themselves.” As a design, the IBM PC favored robustness, longevity, and expandability, all qualities IBM had learned the value of through their many years of experience providing businesses and governments with big-iron solutions to their most important data-processing needs. To appreciate the wisdom of IBM’s approach, we need only consider that today, long after the likes of the Commodore Amiga and the original Apple Macintosh architecture, whose owners so loved to mock IBM’s unimaginative beige boxes, have passed into history, most of our laptop and desktop computers — including modern Macs — can trace the origins of their hardware back to what that little team of unlikely business-suited visionaries accomplished in an IBM branch office in Boca Raton, Florida.

But of course no visionary has 20-20 vision. For all the strengths of the IBM PC, there was one area where all the jeering by owners of sexier machines felt particularly well-earned. Here lay a crippling weakness, born not so much of the hardware found in that first IBM PC as of the operating system the marketplace chose to run on it, that would continue to vex programmers and ordinary users for two decades, not finally fading away until Microsoft’s release of Windows XP in 2001 put to bed the last legacies of MS-DOS in mainstream computing. MS-DOS, dubbed the “quick and dirty” operating system during the early days of its development, is likely the piece of software in computing history with the most lopsided contrast between the total number of hours put into its development and the total number of hours it spent in use, on millions and millions of computers all over the world. The 640 K barrier, the demon all those users spent so much time and energy battling for so many years, was just one of the more prominent consequences of corporate America’s adoption of such a blunt instrument as MS-DOS as its standard. Today we’ll unpack the problem that was memory management under MS-DOS, and we’ll also examine the problem’s multifarious solutions, all of them to one degree or another ugly and imperfect.


 

The original IBM PC was built around an Intel 8088 microprocessor, a cost-reduced and somewhat crippled version of an earlier chip called the 8086. (IBM’s decision to use the 8088 instead of the 8086 would have huge importance for the expansion buses of this and future machines, but the differences between the two chips aren’t important for our purposes today.) Despite functioning as a 16-bit chip in most ways, the 8088 had a 20-bit address space, meaning it could address a maximum of 1 MB of memory. Let’s consider why this limitation should exist.

Memory, whether in your brain or in your computer, is of no use to you if you can’t keep track of where you’ve put things so that you can retrieve them again later. A computer’s memory is therefore indexed by bytes, with every single byte having its own unique address. These addresses, numbered from 0 to the upper limit of the processor’s address space, allow the computer to keep track of what is stored where. Twenty bits are enough to represent 1,048,576 unique addresses, numbered 0 through 1,048,575, which works out to exactly 1 MB. Thus this is the maximum amount of memory which the 8088, with its 20-bit address bus, can handle. Such a limitation hardly felt like a deal breaker to the engineers who created the IBM PC. Indeed, it’s difficult to overemphasize what a huge figure 1 MB really was when they released the machine in 1981, in which year the top-of-the-line Apple II had just 48 K of memory and plenty of other competing machines shipped with no more than 16 K.
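The numbers here are easy to check for yourself. The little C snippet below is mine rather than anything from IBM or Intel, but it reproduces the key figures: twenty address bits give 2^20 possible addresses, which is 1 MB, and A0000 hexadecimal, the address where our demon lives, is exactly 640 K.

```c
#include <stdio.h>

int main(void)
{
    long addresses = 1L << 20;   /* 20 address lines -> 2^20 distinct addresses */

    printf("Addressable bytes: %ld\n", addresses);        /* 1,048,576 = 1 MB  */
    printf("Highest address:   %ld\n", addresses - 1);    /* 1,048,575         */
    printf("A0000 hex:         %ld\n", 0xA0000L);         /* 655,360 = 640 K   */
    return 0;
}
```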

A processor needs to address other sorts of memory besides the pool of general-purpose RAM which is available for running applications. There’s also ROM memory — read-only memory, burned inviolably into chips — that contains essential low-level code needed for the computer to boot itself up, along with, in the case of the original IBM PC, an always-available implementation of the BASIC programming language. (The rarely used BASIC in ROM would be phased out of subsequent models.) And some areas of RAM as well are set aside from the general pool for special purposes, like the fully 128 K of addresses given to video cards to keep track of the onscreen display in the original IBM PC. All of these special types of memory must be accessed by the CPU, must be given their own unique addresses to facilitate that, and must thus be subtracted from the address space available to the general pool.

IBM’s engineers were quite generous in drawing the boundary between their general memory pool and the area of addresses allocated to special purposes. Focused on expandability and longevity as they were, they reserved big chunks of “special” memory for purposes that hadn’t even been imagined yet. In all, they reserved the upper three-eighths of the available addresses for specialized purposes actual or potential, leaving the lower five-eighths — 640 K — to the general pool. In time, this first 640 K of memory would become known as “conventional memory,” the remaining 384 K — some of which would be ROM rather than RAM — as “high memory.” The official memory map which IBM published upon the debut of the IBM PC looked like this:
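(What follows is a rough textual approximation of that map’s broad strokes rather than a reproduction of IBM’s diagram; the exact carve-up of the reserved area above 640 K varied from machine to machine and from year to year.)

```
00000 - 9FFFF    640 K   conventional memory (the general pool)
A0000 - BFFFF    128 K   reserved for video cards
C0000 - EFFFF    192 K   reserved for expansion ROMs and future hardware
F0000 - FFFFF     64 K   system ROM (the BIOS and the built-in BASIC)
```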

It’s important to understand when looking at a memory map like this one that the existence of a logical address therein doesn’t necessarily mean that any physical memory is connected to that address in any given real machine. The first IBM PC, for instance, could be purchased with as little as 16 K of conventional memory installed, and even a top-of-the-line machine had just 256 K, leaving most of the conventional-memory space vacant. Similarly, early video cards used just 32 K or 64 K of the 128 K of address space offered to them in high memory. The 640 K barrier was thus only a theoretical limitation early on, one few early users or programmers ever even noticed.

That blissful state of affairs, however, wouldn’t last very long. As IBM’s creations — joined, soon enough, by lots of clones — became the standard for American business, more and more advanced applications appeared, craving more and more memory alongside more and more processing power. Already by 1984 the 640 K barrier had gone from a theoretical to a very real limitation, and customers were beginning to demand that IBM do something about it. In response, IBM that year released the PC/AT, built around Intel’s new 80286 microprocessor, which boasted a 24-bit address space good for 16 MB of memory. To unlock all that potential extra memory, IBM made the commonsense decision to extend the memory map above the specialized high-memory area that ended at 1 MB, making all addresses beyond 1 MB a single pool of “extended memory” available for general use.

Problem solved, right? Well, no, not really — else this would be a much shorter article. Due more to software than hardware, all of this potential extended memory proved not to be of much use for the vast majority of people who bought PC/ATs. To understand why this should be, we need to examine the deadly embrace between the new processor and the old operating system people were still running on it.

The 80286 was designed to be much more than just a faster version of the old 8086/8088. Developing the chip before IBM PCs running MS-DOS had come to dominate business computing, Intel hadn’t allowed the need to stay compatible with that configuration to keep them from designing a next-generation chip that would help to take computing to where they saw it as wanting to go. Intel believed that microcomputers were at the stage at which the big institutional machines had been a couple of decades earlier, just about ready to break free of what computer scientist Brian L. Stuart calls the “Triangle of Ones”: one user running one program at a time on one machine. At the very least, Intel believed, the second leg of the Triangle must soon fall; everyone recognized that multitasking — running several programs at a time and switching freely between them — was a much more efficient way to do complex work than laboriously shutting down and starting up application after application. But unfortunately for MS-DOS, the addition of multitasking complicates the life of an operating system to an absolutely staggering degree.

Operating systems are of course complex subjects worthy of years or a lifetime of study. We might, however, collapse their complexities down to a few fundamental functions: to provide an interface for the user to work with the computer and manage her programs and files; to manage the various tasks running on the computer and allocate resources among them; and to act as a buffer or interface between applications and the underlying hardware of the computer. That, anyway, is what we expect at a minimum of our operating systems today. But for a computer ensconced within the Triangle of Ones, the second and third functions were largely moot: with only one program allowed to run at a time, resource-management concerns were nonexistent, and, without the need for a program to be concerned about clashing with other programs running at the same time, bare-metal programming — manipulating the hardware directly, without passing requests through any intervening layer of operating-system calls — was often considered not only acceptable but the expected approach. In this spirit, MS-DOS provided just 27 function calls to programmers, the vast majority of them dealing only with disk and file management. (Compare that, my fellow programmers, with the modern Windows or OS X APIs!) For everything else, banging on the bare metal was fine.
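To make the contrast concrete, here is a sketch of both styles side by side, the polite operating-system call and the bare-metal poke, written in the idiom of the old DOS C compilers. The specifics (Turbo C’s <dos.h> helpers, the B800h text-screen segment) are simply the standard machinery of the era; treat the snippet as illustrative rather than as anything official.

```c
#include <dos.h>

int main(void)
{
    union REGS r;
    unsigned char far *screen = (unsigned char far *)MK_FP(0xB800, 0);

    /* The polite way: ask MS-DOS to print a character for us
       (interrupt 21h, function 02h -- one of its couple dozen calls). */
    r.h.ah = 0x02;
    r.h.dl = 'A';
    int86(0x21, &r, &r);

    /* The bare-metal way, and the expected one for anything that had to be
       fast: poke a character and its color attribute straight into the
       video card's memory, which lives up in the high-memory area. */
    screen[0] = 'B';      /* character in the top-left cell of the screen */
    screen[1] = 0x07;     /* attribute: light grey text on black          */

    return 0;
}
```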

We can’t even begin here to address all of the complications that are introduced when we add multitasking into the equation, asking the operating system in the process to fully embrace all three of the core functions listed above. Memory management alone, the one aspect we will look deeper into today, becomes complicated enough. A program which is sharing a machine with other programs can no longer have free run of the memory map, placing whatever it wants to wherever it wants to; to do so risks overwriting the code or data of another program running on the system. Instead the operating system must demand that individual programs formally request the memory they’d like to use, and then must come up with a way to keep a program, whether due to bugs or malice, from running roughshod over areas of memory that it hasn’t been granted.

Or perhaps not. The Commodore Amiga, the platform which pioneered multitasking on personal computers in 1985, didn’t so much solve the latter part of this problem as punt it away. An application program is expected to request from the Amiga’s operating system any memory that it requires. The operating system then returns a pointer to a block of memory of the requested size, and trusts the application not to write to memory outside of these bounds. Yet nothing besides the programmer’s skill and good nature absolutely prevents such unauthorized memory access from happening. Every application on the Amiga, in other words, can write to any address in the machine’s memory, whether that address be properly allocated to it or not. Screen memory, free memory, another program’s data, another program’s code — all are fair game to the errant program. Such unauthorized memory access will almost always eventually result in a total system crash. A non-malicious programmer who wants her program to be a good citizen would of course never intentionally write to memory she hasn’t properly requested, but bugs of this nature are notoriously easy to create and notoriously hard to track down, and on the Amiga a single instance of one can bring down not only the offending program but the entire operating system. With all due respect to the Amiga’s importance as the first multitasking personal computer, this is obviously not the ideal way to implement it.
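Here is a sketch of that honor system in action, using the classic AmigaOS exec.library allocation calls. I’m writing it from memory of the old API, so treat the details as illustrative rather than canonical.

```c
#include <exec/memory.h>
#include <proto/exec.h>

int main(void)
{
    /* Politely ask the operating system for 256 bytes, zeroed. */
    unsigned char *buf = AllocMem(256, MEMF_CLEAR);
    if (!buf)
        return 20;

    buf[0]   = 1;   /* fine: well inside the block we were granted          */
    buf[500] = 1;   /* also "works": nothing stops the write, but it lands
                       in memory that belongs to someone else -- exactly the
                       sort of bug that eventually takes down not just this
                       program but the whole machine                         */

    FreeMem(buf, 256);
    return 0;
}
```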

A far more sustainable approach is to take the extra step of tracking and protecting the memory that has been allocated to each program. Memory protection is usually accomplished using what’s known as virtual memory: when a program requests memory, it’s returned not a true address within the system’s memory pool but rather a virtual address that’s translated back into the real address to which it corresponds every time the program accesses its data. Each program is thus effectively sandboxed from everything else, allowed to read from and write to only its own data. Only the lowest levels of the operating system have global access to the memory pool as a whole.
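The idea is easier to see in miniature. The toy routine below is purely conceptual; the 80286 actually performed this translation in hardware, and with segment descriptors rather than the fixed-size pages sketched here. But the principle is the same: every address a program uses passes through a table the operating system controls, and anything outside the program’s own allocation is refused.

```c
#include <stdint.h>

#define PAGE_SIZE 4096
#define NUM_PAGES 16

/* For each virtual page: the physical page it maps to, or -1 if this
   program has no business touching it. The operating system owns this table. */
static int page_table[NUM_PAGES] = { 3, 7, -1, -1 /* ... */ };

/* Translate a virtual address to a physical one; return -1 on a
   protection fault (an address the program was never granted). */
int translate(uint32_t virtual_addr, uint32_t *physical_addr)
{
    uint32_t page   = virtual_addr / PAGE_SIZE;
    uint32_t offset = virtual_addr % PAGE_SIZE;

    if (page >= NUM_PAGES || page_table[page] < 0)
        return -1;

    *physical_addr = (uint32_t)page_table[page] * PAGE_SIZE + offset;
    return 0;
}
```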

Implementing such memory protection in software alone, however, would have been an untenable drain on the limited processing resources of 1980s hardware — a fact which does much to explain its absence from the Amiga. Intel therefore decided to give software a leg up via hardware. They built into the 80286 a memory-management unit that could automatically translate from virtual to real memory addresses and vice versa, making this constantly ongoing process fairly transparent even to the operating system.

Nevertheless, the operating system must know about this capability, must in fact be written very differently if it’s to run on a CPU with memory protection built into its circuitry. Intel recognized that it would take time for such operating systems to be created for the new chip, and recognized that compatibility with the earlier 8086/8088 chips would be a very good thing to have in the meantime. They therefore built two possible operating modes into the 80286. In “protected mode” — the mode they hoped would eventually come to be used almost universally — the chip’s full potential would be realized, including memory protection and the ability to address up to 16 MB of memory. In “real mode,” the 80286 would function essentially like a turbocharged 8086/8088, with no memory-protection capabilities and with the old limitation on addressable memory of 1 MB still in place. Assuming that in the early days at least the new chip would need to run on operating systems with no knowledge of its full capabilities, Intel made the 80286 default to real mode on startup. An operating system which did know about the 80286 and wanted to bring out its full potential could switch it to protected mode at boot-up and be off to the races.

It’s at the intersection between the 80286 and the operating system that Intel’s grand plans for the future of their new chip went awry. An overwhelming percentage of the early 80286s were used in IBM PC/ATs and clones, and an overwhelming percentage of those machines were running MS-DOS. Microsoft’s erstwhile “quick and dirty” operating system knew nothing of the 80286’s full capabilities. Worse, giving it knowledge of those capabilities would have entailed a complete rewrite that would break compatibility with all existing MS-DOS software. Yet the whole reason MS-DOS was popular in the first place — it certainly wasn’t because of a generous feature set, a friendly interface, or any aesthetic appeal — was that very same huge base of business software. Getting users to make the leap to some hypothetical new operating system in the absence of software to run on it would be as difficult as getting developers to write programs for an operating system with no users. It was a chicken-or-the-egg situation, and neither chicken nor egg was about to stick its neck out anytime soon.

IBM was soon shipping thousands upon thousands of PC/ATs every month, and the clone makers were soon shipping even more 80286-based machines of their own. Yet at least 95 percent of those machines were idling along at only a fraction of their potential, thanks to the already creakily archaic MS-DOS. For all these users, the old 640 K barrier remained as high as ever. They could stuff their machines full of extended memory if they liked, but they still couldn’t access it. And of course the multitasking that the 80286 was supposed to have enabled remained as foreign a concept to MS-DOS as a GPS unit to a Model T. The only solution IBM offered those who complained about the situation was to run another operating system. And indeed, there were a number of alternatives to MS-DOS available for the PC/AT and other 80286-based machines, including several variants of the old institutional-computing favorite Unix — one of them even from Microsoft — and new creations like Digital Research’s Concurrent DOS, which struggled with mixed results to wedge in some degree of MS-DOS compatibility. Still, the only surefire way to take full advantage of MS-DOS’s huge software base was to run the real — in more ways than one now! — MS-DOS, and this is what the vast majority of people with 80286-equipped machines wound up doing.

Meanwhile the very people making the software which kept MS-DOS the only viable choice for most users were feeling the pinch of being confined to 640 K more painfully almost by the month. Finally Lotus Corporation — makers of the Lotus 1-2-3 spreadsheet package that ruled corporate America, the greatest single business-software success story of their era — decided to use their clout to do something about it. They convinced Intel to join them in devising a scheme for breaking the 640 K barrier without abandoning MS-DOS. What they came up with was one mother of an ugly kludge — a description the scheme has in common with virtually all efforts to break through the 640 K barrier.

Looking through the sparsely populated high-memory area which the designers of the original IBM PC had so generously carved out, Lotus and Intel realized it should be possible on almost any extant machine to identify a contiguous 64 K chunk of those addresses which wasn’t being used for anything. This chunk, they decided, would be the gateway to potentially many more megabytes installed elsewhere in the machine. Using a combination of software and hardware, they implemented what’s known as a bank-switching scheme. The 64 K chunk of high-memory addresses was divided into four segments of 16 K, each of which could serve as a lens focused on a 16 K chunk of that additional memory, which lay above and beyond the 1 MB the processor could normally address. When the processor accessed the addresses in high memory, the data it actually touched was whatever section of the additional memory the corresponding lens was currently pointing to. The four lenses could be moved around at will, giving access, albeit in a roundabout way, to however much extra memory the user had installed. The additional memory unlocked by the scheme was dubbed “expanded memory.” The name’s unfortunate similarity to “extended memory” would cause much confusion over the years to come; from here on, we’ll call it by its common acronym of “EMS.”

All those gobs of extra memory wouldn’t quite come for free: applications would have to be altered to check for the existence of EMS memory and make use of it, and there would remain a distinct difference between conventional memory and EMS memory with which programmers would always have to reckon. Likewise, the overhead of constantly moving those little lenses around made EMS memory considerably slower to access than conventional memory. On the brighter side, though, EMS worked under MS-DOS with only the addition of a single device driver during startup. And, since the hardware mechanism for moving the lenses around was completely external to the CPU, it would even work on machines that weren’t equipped with the new 80286.
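For the curious, here is roughly what that device driver looked like from a programmer’s chair. The EMS driver installed itself on interrupt 67h; the function numbers below follow the Lotus/Intel/Microsoft specification as I remember it, written in the idiom of the old DOS C compilers (Turbo C’s <dos.h>), so treat the snippet as a sketch rather than gospel. A real program would also verify that the driver was actually present before making any of these calls.

```c
#include <dos.h>
#include <stdio.h>

int main(void)
{
    union REGS r;
    unsigned page_frame, handle;

    r.h.ah = 0x41;                     /* "get page frame address"                */
    int86(0x67, &r, &r);               /* the EMS driver listens on interrupt 67h */
    if (r.h.ah != 0) { puts("EMS error."); return 1; }
    page_frame = r.x.bx;               /* segment of the 64 K window in high memory */
    printf("Page frame at segment %04X\n", page_frame);

    r.h.ah = 0x43;                     /* allocate expanded-memory pages...       */
    r.x.bx = 4;                        /* ...four 16 K pages, 64 K in all         */
    int86(0x67, &r, &r);
    if (r.h.ah != 0) { puts("Allocation failed."); return 1; }
    handle = r.x.dx;                   /* our handle to that memory               */

    r.h.ah = 0x44;                     /* point one of the four "lenses"...       */
    r.h.al = 0;                        /* ...physical page 0 of the window...     */
    r.x.bx = 2;                        /* ...at logical page 2 of our allocation  */
    r.x.dx = handle;
    int86(0x67, &r, &r);

    /* The program can now read and write that 16 K of expanded memory through
       ordinary addresses in the page frame, just as if it were conventional
       memory -- until the lens is pointed somewhere else. */

    r.h.ah = 0x45;                     /* give the pages back when finished       */
    r.x.dx = handle;
    int86(0x67, &r, &r);
    return 0;
}
```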

This diagram shows the different types of memory available on PCs of the mid-1980s. In blue, we see the original 1 MB memory map of the IBM PC. In green, we see a machine equipped with additional extended memory. And in orange we see a machine equipped with additional expanded memory.

Shortly before the scheme made its official debut at a COMDEX trade show in May of 1985, Lotus and Intel convinced a crucial third partner to come aboard: Microsoft. “It’s garbage! It’s a kludge!” said Bill Gates. “But we’re going to do it.” With the combined weight of Lotus, Intel, and Microsoft behind it, EMS took hold as the most practical way of breaking the 640 K barrier. Imperfect and kludgy though it was, software developers hurried to add support for EMS memory to whatever programs of theirs could practically make use of it, while hardware manufacturers rushed EMS memory boards onto the market. EMS may have been ugly, but it was here today and it worked.

At the same time that EMS was taking off, however, extended memory wasn’t going away. Some hardware makers — most notably IBM themselves — didn’t want any part of EMS’s ugliness. Software makers therefore continued to probe at the limits of machines equipped with extended memory, still looking for a way to get at it from within the confines of MS-DOS. What if they momentarily switched the 80286 into protected mode, just for as long as they needed to manipulate data in extended memory, then went back into real mode? It seemed like a reasonable idea — except that Intel, never anticipating that anyone would want to switch modes on the fly like this, had neglected to provide a way to switch an 80286 in protected mode back into real mode. So, proponents of extended memory had to come up with a kludge even uglier than the one that allowed EMS memory to function. They could force the 80286 back into real mode, they realized, by resetting it entirely, just as if the user had rebooted her computer. The 80286 would go through its self-check again — a process that admittedly absorbed precious milliseconds — and then pick back up where it left off. It was, as Microsoft’s Gordon Letwin memorably put it, like “turning off the car to change gears.” It was staggeringly kludgy, it was horribly inefficient, but it worked in its fashion. Given the inefficiencies involved, the scheme was mostly used to implement virtual disks stored in the extended memory, which wouldn’t be subject to the constant access of an application’s data space.

In 1986, the 32-bit 80386, Intel’s latest and greatest chip, made its public bow at the heart of the Compaq Deskpro 386 rather than an IBM machine, a landmark moment signaling the slow but steady shift of business computing’s power center from IBM to Microsoft and the clone makers using their operating system. While working on the new chip, Intel had had time to see how the 80286 was actually being used in the wild, and had faced the reality that MS-DOS was likely destined to be cobbled onto for years to come rather than replaced in its entirety with something better. They therefore made a simple but vitally important change to the 80386 amidst its more obvious improvements. In addition to being able to address an inconceivable total of 4 GB of memory in protected mode thanks to its 32-bit address space, the 80386 could be switched between protected mode and real mode on the fly if one desired, without needing to be constantly reset.

In freeing programmers from that massive inefficiency, the 80386 cracked open the door that much further to making practical use of extended memory in MS-DOS. In 1988, the old EMS consortium of Lotus, Intel, and Microsoft came together once again, this time with the addition to their ranks of the clone manufacturer AST; the absence of IBM is, once again, telling. Together they codified a standard approach to extended memory on 80386 and later processors, which corresponded essentially to the scheme I’ve already described in the context of the 80286, but with a simple command to the 80386 to switch back to real mode replacing the resets. They called it the eXtended Memory Specification; memory accessed in this way soon became known universally as “XMS” memory. Under XMS as under EMS, a new device driver would be loaded into MS-DOS. Ordinary real-mode programs could then call this driver to access extended memory; the driver would do the needful switching to protected mode, copy blocks of data from extended memory into conventional memory or vice versa, then switch the processor back to real mode when it was time to return control to the program. It was still inelegant, still a little inefficient, and still didn’t use the capabilities of Intel’s latest processors in anything like the way Intel’s engineers had intended them to be used; true multitasking still remained a pipe dream somewhere off in a shadowy future. Owners of sexier machines like the Macintosh and Amiga, in other words, still had plenty of reason to mock and scoff. In most circumstances, working with XMS memory was actually slower than working with EMS memory. The primary advantage of XMS was that it let programs work with much bigger chunks of non-conventional memory at one time than the four 16 K chunks that EMS allowed. Whether any given program chose EMS or XMS came to depend on which set of advantages and disadvantages best suited its purpose.
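At the nuts-and-bolts level the XMS driver worked a little differently from EMS: rather than claiming a dedicated interrupt, HIMEM.SYS (or a compatible driver) was located through the DOS multiplex interrupt, 2Fh, and everything afterward went through a far call into the driver itself. Here is a minimal detection sketch, again in old Turbo C idiom and again offered as an illustration of the mechanism rather than as authoritative code.

```c
#include <dos.h>
#include <stdio.h>

/* Far entry point handed to us by the XMS driver. */
void (far *xms_entry)(void);

int main(void)
{
    union REGS r;
    struct SREGS s;

    r.x.ax = 0x4300;                 /* "is an XMS driver installed?"        */
    int86(0x2F, &r, &r);
    if (r.h.al != 0x80) {
        puts("No XMS driver present.");
        return 1;
    }

    r.x.ax = 0x4310;                 /* ask for the driver's far entry point */
    int86x(0x2F, &r, &r, &s);
    xms_entry = (void (far *)(void))MK_FP(s.es, r.x.bx);

    /* Every further XMS service -- querying free extended memory, allocating
       a block, moving data between extended and conventional memory -- is a
       far call through xms_entry with a function number in AH, which is the
       device-driver dance described in the paragraph above. */
    puts("XMS driver found.");
    return 0;
}
```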

The arrival of XMS along with the ongoing use of EMS memory meant that MS-DOS now had two competing memory-management solutions. Buyers now had to figure out not only whether they had enough extra memory to run a program but whether they had the right kind of extra memory. Ever accommodating, hardware manufacturers began shipping memory boards that could be configured as either EMS or XMS memory — whatever the application you were running at the moment happened to require.

The next stage in the slow crawl toward parity with other computing platforms in the realm of memory management would be the development of so-called “DOS extenders,” software to allow applications themselves to run in protected mode, thus giving them direct access to extended memory without having to pass their requests through an inefficient device driver. An application built using a DOS extender would only need to switch the processor to real mode when it needed to communicate with the operating system. The development of DOS extenders was driven by Microsoft’s efforts to turn Windows, which like seemingly everything else in business computing ran on top of MS-DOS, into a viable alternative to the command line and a viable challenger to the Macintosh. That story is thus best reserved for a future article, when we look more closely at Windows itself. As it is, the story that I’ve told so far today moves us nicely into the era of computer-gaming history we’ve reached on the blog in general.

In said era, the MS-DOS machines that had heretofore been reserved for business applications were coming into homes, where they were often used to play a new generation of games taking advantage of the VGA graphics, sound cards, and mice sported by the latest systems. Less positively, all of the people wanting to play these new games had to deal with the ramifications of a 640 K barrier that could still be skirted only imperfectly. As we’ve seen, both EMS and XMS imposed to one degree or another a performance penalty when accessing non-conventional memory. What with games being the most performance-sensitive applications of all, that made that first 640 K of lightning-fast conventional memory most precious of all for them.

In the first couple of years of MS-DOS’s gaming dominance, developers dealt with all of the issues that came attached to using memory beyond 640 K by the simple expedient of not using any memory beyond 640 K. But that solution was compatible neither with developers’ growing ambitions for their games nor with the gaming public’s growing expectations of them.

The first harbinger of what was to come was Origin Systems’s September 1990 release Wing Commander, which in its day was renowned — and more than a little feared — for pushing the contemporary state of the art in hardware to its limits. Even Wing Commander didn’t go so far as to absolutely require memory beyond 640 K, but it did use it to make the player’s audiovisual experience snazzier if it was present. Setting a precedent future games would largely follow, it was quite inflexible in its approach, demanding EMS — as opposed to XMS — memory. In the future, gamers would have to become all too familiar with the differences between the two standards, and how to configure their machines to use one or the other. Setting another precedent, Wing Commander‘s “installation guide” included a section on “memory usage” that was required reading in order to get things working properly. In the future, such sections would only grow in length and complexity, and would need to be pored over by long-suffering gamers with far more concentrated attention than anything in the manual having anything to do with how to actually play the games they purchased.

In Accolade’s embarrassing Leisure Suit Larry knockoff Les Manley in: Lost in LA, the title character explains EMS and XMS memory to some nubile companions. The ironic thing was that anyone who wished to play the latest games on an MS-DOS machine really did need to know this stuff, or at least have a friend who did.

Thus began the period of almost a decade, remembered with chagrin but also often with an odd sort of nostalgia by old-timers today, in which gamers spent hours monkeying about with MS-DOS’s “config.sys” and “autoexec.bat” files and swapping in and out various third-party utilities in the hope of squeezing out that last few kilobytes of conventional memory that Game X needed to run. The techniques they came to employ were legion.

In the process of developing Windows, Microsoft had discovered that the kernel of MS-DOS itself, a fairly tiny program thanks to its sheer age, could be stashed into the first 64 K of memory beyond 1 MB and still accessed like conventional memory on an 80286 or later processor in real mode thanks to what was essentially an undocumented technical glitch in the design of those processors. Gamers thus learned to include the line “DOS=HIGH” in their configuration files, freeing up a precious block of conventional memory. Likewise, there was enough unused space scattered around in the 384 K of high memory on most machines to stash many or all of MS-DOS’s device drivers there instead of in conventional memory. Thus “DOS=HIGH” soon became “DOS=HIGH,UMB,” the second parameter telling the computer to make use of these so-called “upper-memory blocks” and thereby save that many kilobytes more.
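To make that concrete, here is the sort of thing a typical CONFIG.SYS came to look like once MS-DOS 5.0 and its bundled memory drivers arrived. The file paths are hypothetical, and the exact incantations varied endlessly from machine to machine; consider this a representative sketch rather than a recipe.

```
REM CONFIG.SYS -- illustrative only; paths and options varied by machine
REM HIMEM.SYS provides XMS and makes DOS=HIGH possible
DEVICE=C:\DOS\HIMEM.SYS
REM EMM386.EXE (386 and up) emulates EMS and supplies upper-memory blocks
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
REM DEVICEHIGH stashes drivers in upper memory instead of conventional memory
DEVICEHIGH=C:\DRIVERS\MOUSE.SYS
FILES=30
BUFFERS=20

REM AUTOEXEC.BAT -- LOADHIGH (or LH) does the same for memory-resident programs
LOADHIGH C:\DOS\SMARTDRV.EXE
```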

These were the most basic techniques, the starting points. Suffice to say that things got a lot more complicated from there, turning into a baffling tangle of tweaks, some saving mere bytes rather than kilobytes of conventional memory, but all of them important if one was to hope to run games that by 1993 would be demanding 604 K of 640 K for their own use. That owners of machines which by that point typically contained memories in the multi-megabytes should have to squabble with the operating system over mere handfuls of bytes was made no less vexing by being so comically absurd. And every new game seemed to up the ante, seemed to demand that much more conventional memory. Those with a sunnier disposition or a more technical bent of mind took the struggle to get each successive purchase running as the game before the game, as it were. Everyone else gnashed their teeth and wondered for the umpteenth time if they might not have been better off buying a console where games Just Worked. The only thing that made it all worthwhile was the mixture of relief, pride, and satisfaction that ensued when you finally got it all put together just right and the title screen came up and the intro music sprang to life — if, that is, you’d managed to configure your sound card properly in the midst of all your other travails. Such was the life of the MS-DOS gamer.

Before leaving the issue of the 640 K barrier behind in exactly the way that all those afflicted by it for so many years were so conspicuously unable to do, we have to address Bill Gates’s famous claim, allegedly made at a trade show in 1981, that “640 K ought to be enough for anybody.” The quote has been bandied about for years as computer-industry legend, seeming to confirm as it does the stereotype of Bill Gates as the unimaginative dirty trickster of his industry, as opposed to Steve Jobs the guileless visionary (the truth is, needless to say, far more complicated). Sadly for the stereotypers, however, the story of the quote is similar to all too many legends in the sense that it almost certainly never happened. Gates himself, for one, vehemently denies ever having said any such thing. Fred Shapiro, for another, editor of The Yale Book of Quotations, conducted an exhaustive search for a reputable source for the quote in 2008, going so far as to issue a public plea in The New York Times for anyone possessing knowledge of such a source to contact him. More than a hundred people did so, but none of them could offer up the smoking gun Shapiro sought, and he was left more certain than ever that the comment was “apocryphal.” So, there you have it. Blame Bill Gates all you want for the creaky operating system that was the real root cause of all of the difficulties I’ve spent this article detailing, but don’t ever imagine he was stupid enough to say that. “No one involved in computers would ever say that a certain amount of memory is enough for all time,” said Gates in 2008. Anyone doubting the wisdom of that assertion need only glance at the history of the IBM PC.

(Sources: the books Upgrading and Repairing PCs, 3rd edition by Scott Mueller and Principles of Operating Systems by Brian L. Stuart; Computer Gaming World of June 1993; Byte of January 1982, November 1984, and March 1992; Byte‘s IBM PC special issues of Fall 1985 and Fall 1986; PC Magazine of May 14 1985, January 14 1986, May 30 1989, June 13 1989, and June 27 1989; the episode of the Computer Chronicles television show entitled “High Memory Management”; the online article “The ‘640K’ quote won’t go away — but did Gates really say it?” on Computerworld.)


  1. Yes, that is quite possibly the nerdiest thing I’ve ever written. 

 
 


IBM’s New Flavor

The PS/2 lineup

IBM’s greatest triumph was inextricably linked with what by 1986 was turning into their biggest problem. Following its introduction five years before, the IBM PC had remade the face of corporate computing in its image, legitimizing personal computing in the eyes of the Fortune 500 and all those smaller companies who dreamed of someday joining their ranks. The ecosystem that surrounded the IBM PC and its successors was now worth countless billions, the greatest story of American business success of them all to play out during Ronald Reagan’s storied Morning in America.

The problem, at least as IBM and many of their worried stockholders perceived it, was that they now seemed on the verge of losing control of the very standard they had created. A combination of the decisions that had allowed the original IBM PC to become a standard in the first place — its simple, workmanlike design that utilized only off-the-shelf components; the scrupulously thorough documentation of said design; the decision to outsource the machine’s operating system to Microsoft, a third party all too willing to license the same operating system to other parties as well — had led to a thriving market in so-called “clone” machines whose combined revenues now far exceeded IBM’s personal-computer sales. IBM believed that the clonesters were lifting billions out of their pockets every year, even as they saw their own sales, which had broken record after record in the first few years following the IBM PC’s launch, beginning to show signs of stagnation.

Compaq of Houston, Texas, the most aggressive and innovative of the clonesters, had first begun to collect for themselves a reputation to rival IBM’s own with their very first product back in 1983, a portable — or, perhaps better said, “luggable” — all-in-one IBM-compatible. The Compaq Portable had forced IBM for the first time to play catch-up with a personal-computing rival, rushing to market a luggable of their own. To make matters worse, the IBM version of portable computing had proved far less practical than the Compaq, as many a reviewer wasn’t shy about pointing out.

Now, in 1986, Compaq threatened to wrangle away from IBM the mantle of technological leadership via a machine that represented a more fundamental advance than a new form factor. After hearing that IBM didn’t have any immediate plans to release a machine built around the Intel 80386, a new 32-bit processor that was sending waves of excitement rippling through the industry, Compaq decided to push ahead with a 386-based machine of their own — right now, this very year. The public launch of the Compaq Deskpro 386 on September 9, 1986 — almost exactly five years after the debut of the original IBM PC — was another watershed moment, the first time one of the clonesters had released a machine more powerful than anything in IBM’s stable. Compaq’s CEO Rod Canion, never a shrinking violet under any circumstances, outdid himself at the launch, declaring the Deskpro 386 “the third generation of the personal-computer revolution” after the Apple II and the IBM PC, thus implicitly placing his own Compaq on a par with those two storied companies.

The clone market was getting so big that there seemed a danger that the clones wouldn’t be dismissed under that selfsame moniker much longer. People in the business world were beginning to replace the phrase “IBM clone” with phrases like “the MS-DOS standard” or “the Intel standard,” giving no credit to the company that had really created that standard. As was well attested by their checkered history of antitrust investigations and allegations of unfair competitive practices, IBM had never been known as a bastion of corporate generosity. It may not be exaggerating the case to say that they felt themselves to have a moral right to the PC standard they’d created, a right that encompassed not just an acknowledgement that said standard was still the IBM standard but also the ability to continue to steer every aspect of the further development of that standard. And by all rights the right should also encompass — and this was the sticking point that really irked — their fair share of all those billions that all those other companies were making from IBM’s standard.

In addition to furnishing what they saw as ample evidence of a need for them to reassert control of their industry, this period found IBM at another, more purely technical crossroads. The imminent move from 16-bit to 32-bit computing represented by the new 80386 would have to bring with it some elaborations on IBM’s tried-and-true architecture — elaborations that would undoubtedly define the face of mainstream business computing into the 1990s. IBM saw in those elaborations a way to remedy the ongoing problem of the clonesters as well. Unknown to everyone outside the company, they were about to initiate the so-called “bus wars,” a premeditated strike aimed directly at what they saw as parasites like Compaq.

The bus in this context referred not to a mode of public transportation but rather to the system of expansion slots that allowed the innermost core of an IBM-compatible computer — little more than the processor and memory — to communicate with just about everything else that made up a full-fledged PC: floppy and hard disk drives, monitors, modems, printers, ad infinitum, from the most generalized components found in just about every office to the most specialized for the most esoteric of tasks. The original IBM PC, built around a hybrid 8-bit and 16-bit chip called the Intel 8088, had used an 8-bit bus, meaning the electronic “channel” it used to talk to all these myriad devices was just 8 bits wide. In 1984, IBM had released the PC/AT, built around the newer fully 16-bit Intel 80286, and in that machine had expanded the original bus to support 16-bit devices while remaining backward compatible with the older 8-bit standard. The result retroactively came to be known as the Industry Standard Architecture, or ISA.

Now, with the 32-bit 80386 a reality, it was time to think about revisiting the bus again, to make it support 32-bit communications. To fail to do so would be to cripple the 386, forcing it to act like a 16-bit chip every time it wanted to communicate with a peripheral; impressive as they were in many ways, the Compaq Deskpro 386 and other early 386 clones saw their performance limited by exactly this problem. Most people expected IBM to do for the 386 what they had previously done for the 286, delivering a new bus which would support 32-bit peripherals but remain compatible with older 16-bit and even 8-bit devices. Instead they delivered something they called the Micro Channel Architecture, or MCA, a complete break with the past which supported only 32-bit peripherals.


So much controversy over something barely noticeable. The four Micro Channel slots sit at the left rear of this PS/2 Model 50. Many of the components that would have been housed in expansion cards in earlier IBM systems, such as the video card and hard-drive controller, were moved onto the motherboard with the PS/2 line.

MCA debuted as a key component in a new line of personal computers in April of 1987, the most ambitious such line IBM had ever introduced or ever would. The Personal System/2 lineup — better known as the PS/2 — was envisioned as exactly the next generation in personal computing that an ebullient Rod Canion had perhaps overenthusiastically declared the Compaq Deskpro 386 to represent barely six months before. IBM was determined to once again remake the computer industry in their image — and to get it right this time, avoiding the perceived mistakes that had led to the rise of the clonesters. The PS/2 lineup did encompass lower-end machines using the old 16-bit PC/AT bus, but the real point of the effort lay with the higher-end models, IBM’s first to use the 80386 and their first to use the new MCA bus architecture to take advantage of all of the 32 bits of throughput offered by that chip. IBM offered various technical justifications for the failure of MCA to support their older bus standards, but they always rang false. As the more astute industry observers quickly realized, MCA had more to do with business and marketing than it did with technology in the abstract.

IBM was attempting a delicate trick with MCA. They wanted to be able to continue to reap the enormous benefits of the business-computing standard they had birthed, with its huge constellation of compatible software that by now even more so than IBM’s reputation made an MS-DOS machine the only kind to be seriously considered by the vast majority of corporate purchasing departments. At the same time, though, they wanted to cut off the oxygen to the clonesters who were also benefiting so conspicuously from that same universal acceptance, and to reassert their role as the ultimate authorities on the direction business computing would take in the future. They believed they could accomplish all of that, in the long term at least, by threading the needle of compatibility — keeping the 386-based PS/2 lineup software-compatible with the older machines while deliberately breaking the hardware compatibility so relied on by the clonesters. In doing so, they would take the hardware to a place the clonesters couldn’t follow, thus securing for themselves all those billions the clonesters had heretofore been stealing out of their pockets.

Unlike the original IBM bus architecture, MCA was locked up inside an ironclad cage of patents, making it legally uncloneable unless one could somehow negotiate a license to do so through IBM. The patents even extended to add-on cards and other peripherals that might be compatible with MCA, meaning that absolutely anyone who wanted to make a hardware add-on for an MCA machine would have to negotiate a license and pay for the privilege. The result, IBM believed, would be not only a lucrative new revenue stream but also complete control of business computing’s further evolution. Yes, the clonesters would be able to survive for a few more years making machines using the older 16-bit bus architecture. In the longer term, however, as personal computing inevitably transitioned into a realm of 32 bits, they would survive purely at IBM’s whim, their fate predicated on IBM’s willingness to grant them a patent license for MCA and their own willingness to pay dearly for it.

The clonesters rightly and immediately saw MCA as nothing less than an existential threat, and were thrown into a tizzy trying to figure out how to respond to it. It was the ever-quotable Rod Canion who came up with the best line of attack, drawing an analogy between MCA and the recent soft-drink marketing disaster of New Coke. (What with Pepsi alumnus John Sculley in charge over at Apple, computers and soft drinks seemed to be running oddly in parallel during this era.) Clever, pithy, and blessedly non-technical, Canion’s comparison spread like wildfire through the business press, regurgitated ad nauseam by journalists who often had little to no idea what this MCA thing that it referenced actually was. IBM never quite managed to formulate a response that didn’t sound nefariously evasive.

With the “New Coke” meme setting the tone, just about everything about the PS/2 line turned into an unexpected uphill struggle for IBM. While plenty of early reviewers dutifully toed the line, doubtless mindful that if no one ever got fired for buying IBM no one was likely to get fired for giving them a positive review either, a surprising number of the reviews were distinctly lukewarm. The complaints started and often ended with the prices. Even the low-end 16-bit PS/2 models started at a suggested list price of $2295 without monitor, while the high-end models topped out at almost $7000. Insider reports had it that IBM was enjoying profit margins of 40 percent or more, leading to rampant speculation on what the cost of entry into business-friendly personal computing might become if they really should manage to stamp out the clonesters.

The high-end models in particular struck many as a pointless waste of money given that IBM didn’t have an operating system ready to take advantage of their capabilities. The machines were all still saddled with MS-DOS, clunky and archaic and barely worthy of the name “operating system” even in the terms of 1987. In one of the more striking examples of hardware running away from software in computing history, the higher-end models shipped with 1 MB of memory, but couldn’t actually use more than 640 K of it thanks to MS-DOS’s built-in limitations. IBM promised a new, next-generation operating system called OS/2 to unlock the real potential of these next-generation machines. But OS/2, a project they had once again chosen to turn over to Microsoft, was still an unknown number of months away, with the so-called “Presentation Manager” that would add to it a Macintosh-style GUI due yet further months after that.1 And, as a final little bit of buyer discouragement, IBM planned to charge the people who had already spent many thousands on their PS/2 hardware another $800 or so for the privilege of using the eventual OS/2 to take advantage of it.

The PS/2 launch prompted constant comparisons with the original IBM PC launch of five and a half years before, and constantly came up wanting. IBM’s publicity campaign was lavish — as it ought to have been, given those profit margins — but unfocused and uninspired. Its centerpiece was a series of commercials involving much of the cast from M*A*S*H, playing their old sitcom characters inexplicably transported from the Korean War to a modern office. With M*A*S*H still a beloved cultural touchstone only a few years removed from its record-shattering final episode, the spots had plenty of sheer star power, but lacked even a modicum of the charm or creativity that had characterized the award-winning “Charlie Chaplin” advertisements for the original IBM PC.

Likewise, it was hard not to compare the unexpected spirit of openness that had suffused the 1981 IBM PC with the domination and control IBM so plainly intended to assert with the 1987 PS/2 launch. Apple’s iconic old “Big Brother” Macintosh advertisement, a soaring triumph of rhetoric over substance back in its day, would have fit the PS/2 line much better than it had the state of business computing back in 1984. Many chose to blame the change in tone on the loss of Don Estridge, the leader of the small team that had built the original IBM PC. An unusually charismatic personality and independent thinker for the famously conservative and bureaucratic IBM — enough so that he had been courted by Steve Jobs to fill the CEO role John Sculley ended up taking at Apple — Estridge had been killed in a plane crash in 1985. His stewardship over IBM’s microcomputer division had been succeeded by that of William Lowe, a much more traditional rank-and-file, buttoned-down IBM man. Whether due to this reason or some other, the shift in tone and direction from 1981 to 1987 was striking.

In the months following the PS/2 line’s release, the media narrative drifted from one of uncertain excitement to reports of the new machines’ disappointing reception in many quarters. IBM sold around 200,000 MCA-equipped PS/2s in the first six months, mostly to the biggest of big business; United Airlines alone, for example, bought 40,000 of them as part of a complete revamping of their reservations system. But far too many even within the Fortune 500 proved stubbornly, unexpectedly resistant to IBM’s unsubtle prodding to jump onto the PS/2 train. Many chose to invest in the clonesters’ cheaper 80386 offerings instead; the 16-bit bus used by those machines, while far from ideal from a purely technical standpoint, did at least have the advantage of compatibility with existing peripherals. Seventeen months after MCA’s debut, 66 percent of all business computers being sold each month were still using the old bus architecture, versus just 20 percent that used MCA. (The remainder was largely accounted for by the Macintosh.) Survey after survey reported IBM to be losing market share rather than gaining it since the arrival of the PS/2. By this point OS/2 and its “Presentation Manager” GUI were finally available, but, hampered by that high price tag, the new operating system’s uptake had also been limited at best.

And then, just when it seemed the news couldn’t get much worse for IBM, much of the industry went into unthinkable open revolt against their ongoing hegemony. On September 13, 1988, a group of the clonesters, driven as usual by Compaq and with the tacit support of Intel and Microsoft, announced the creation of a new 32-bit bus standard, to be called the Extended Industry Standard Architecture, or EISA. Unlike MCA, EISA would be compatible with older 16-bit and 8-bit peripherals. And it would manage to be so without performing notably worse than MCA, thus giving the lie to IBM’s claims that their decision to abandon bus compatibility had been motivated by technical rather than business concerns. The press promptly dubbed the budding consortium, which included virtually every manufacturer of IBM-compatible computers not named IBM, the “Gang of Nine” after the allegedly traitorous Gang of Four of the Chinese Cultural Revolution. Machines using the new EISA bus entered production within a year.


This shot of an EISA card illustrates the unique two-layer connection devised by the Gang of Nine to extend the old ISA standard without requiring ridiculously long, unwieldy cards and sockets. The shorter pins correspond to the older 16-bit standard; the longer extend it to 32 bits.

In the end, EISA would prove of limited technical importance in the evolution of the Intel architecture. The new standard didn’t have much more luck than had MCA in establishing itself as the market’s default. Instead, by the time a 32-bit bus became a truly commonplace need among ordinary computer users, EISA and MCA alike were replaced by a still newer and better standard than either called the Peripheral Component Interconnect, or PCI. The bus wars of the late 1980s and very early 1990s can thus all too easily be seen as just another of the industry’s tempests in a teapot, an obscure squabble over technical esoterica of interest only to hardcore hackers.

Look a little harder at EISA, however, and we see a watershed moment in the history of the personal computer that dwarfs even the arrival of the Compaq Portable or the Deskpro 386. The Gang of Nine’s announcement brought with it a torrent of press coverage that for the first time openly questioned IBM’s continuing dominance of business-oriented computing. CNN’s Moneyline, the most-watched business report on cable television, dredged up Canion’s evergreen New Coke analogy yet again, going so far as to open its reports on the Gang of Nine’s announcement with a shot of soda bottles moving down a production line. IBM was “faced with overwhelming resistance to the flavor of ‘New Compute,'” declared the breathless report that followed; September 13, 1988, “was a day that left Big Blue looking black and blue.” An only slightly more sober Wall Street Journal article had it that the Gang of Nine “was joining forces in an audacious attempt to wrest away from IBM the power of setting the standard for how personal computers are designed, and they seem to have a chance of succeeding.” The article threw all its metaphors in a blender for the big conclusion: “For IBM, the Gang’s announcement yesterday is at best a dust storm of confusion, and, at worst, a dagger to the heart of its PC strategy.” When the Wall Street Journal threatens to turn against your big business, you know you have problems.

And, indeed, September 13, 1988, wound up representing everything the pundits and journalists said it might and more. Simply put, this was the instant that IBM finally and definitively lost control of the business-computing industry, the moment when the architecture they had created back in 1981 left the nest to go its own way. After this instant, no one would ever defer to IBM again. In January of 1989, Arlan Levitan, a columnist for the big consumer-computing magazine Compute! — like most such magazines, not particularly known for the boldness of its editorial stances — signaled the shifting conventional wisdom. His editors empowered him to launch a satirical broadside at IBM, the PS/2, MCA, and even all those who had bought into the hype, a group that very much included their own magazine.

During a Monday morning press breakfast hosted by IBM, over a thousand representatives of the computing press were shocked to hear newly hired Entry Systems Division president P.W. Herman declare that the firm’s PS/2 computer systems and its associated products were part of an elaborate psychological study undertaken at the behest of the National Institute of Mental Health. “I sure am glad the American people haven’t lost their sense of humor. It’s good to know that in these times everybody still appreciates a good joke.” According to Herman, the study was intended to quantify the limits of the operational parameters associated with Abraham Lincoln’s most famous aphorism. Said Herman, “I guess you really can’t fool all of the people all of the time. I’ll tell ya, though — the Micro Channel Architecture even had me going for a while.” All PS/2 owners will receive a letter signed by Herman and thanking them for their personal contribution toward furthering the present-day understanding of aberrant behavior. Corporate executives who committed their firms to IBM’s $800 OS/2 operating system will receive free remedial therapy in DOS reeducation centers. Those who took advantage of IBM’s trade-in policy, whereby users gave up their XTs or ATs for a PS/2, will receive their weight in PCjr computers. According to internal IBM sources, all costs associated with manufacturing and promoting PS/2s will cumulatively qualify as a tax-deductible research grant.

In terms of hardware if not software — Microsoft’s long, often damaging domination was just beginning in the latter realm — the industry was now a meritocracy, bound together only by a set of mutually if often only tacitly agreed-upon standards. That could only mean hard times for IBM, who were hardly used to competing on such a level playing field. In 1993, they posted a loss of a staggering $8 billion, the largest to that point in American business history, prompting a long, painful process of reinvention as a smaller, nimbler, dare I say it even humbler company. In 2004, in another watershed moment symbolic of many things, IBM stopped making PCs altogether, selling what was left of their personal-computer division to the Chinese computer manufacturer Lenovo in order to focus on consulting services.

The PS/2 story has rightfully gone down in business history as a classic tale of overweening arrogance that received its justified comeuppance. In attempting so aggressively to seize complete control of business computing — all of it — IBM pissed away the enviable dominance they already enjoyed. In attempting to build an empire that stood utterly alone and unchallenged, they burned the one they already had.

Yet there is another side to the PS/2 story that also deserves its due. Existing in those seemingly misbegotten machines alongside MCA and the cynicism it represented was a more positive, one might even say technically idealistic determination to advance the state of the art for this architecture that had long since become the mainstream face of computing, dwarfing in terms of the sheer money it generated any other platform.

And make no mistake: the world of the IBM compatibles was in sore need of advancement on multiple fronts. While machines like the Apple Macintosh and Commodore Amiga had opened whole new paradigms of computing — the former with its friendly GUI interface and crisp almost print-quality display, the latter with its multitasking operating system and implementation of the ideal of multimedia computing long before “multimedia” became a buzzword — the world of the clones had remained as bland as ever, a land of green or amber text-only displays, unpleasant beeps and squawks, and inscrutable command lines. For all the apparently proud users and sellers who took all this ugliness as a sign of serious businesslike intent, there were others who recognized that IBM and the clonesters had long since ceded the high ground of real, fundamental innovation in computing to rival platforms. Thankfully, some inside IBM were included in the latter group, and the results could be seen in the PS/2 machines.

Given how far the IBM-compatible world had fallen behind, it’s not surprising that many or most of the alleged innovations of the PS/2 were really a case of playing catch-up. For example, IBM finally produced their first-ever mouse for the line. They also switched over from the old, fragile 5.25-inch floppy-disk format to the newer, more robust and higher-capacity 3.5-inch format already being used by machines like the Macintosh and Amiga.

But undoubtedly the most welcome and significant of all the PS/2’s new technical developments were some desperately needed display improvements. The Video Graphics Array, or VGA, was included with the higher-end PS/2 models; lower-end models shipped with something called the Multi-Color Graphics Array (MCGA), with many but not quite all of the capabilities of VGA. After allowing their machines’ graphics capabilities to languish for years, IBM through VGA and to some extent MCGA finally brought them up to a level that compared very favorably with the Amiga. VGA and MCGA defined a palette of fully 262,144 colors, a huge leap over the 64 offered by the Enhanced Graphics Adapter (EGA), IBM’s previous best display option for their mainstream machines. The Amiga, by contrast, offered just 4096 colors, although its blitter and other custom hardware still gave it some notable advantages in the realm of fast animation.
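For those who like to see where such numbers come from: the VGA DAC stores six bits apiece for the red, green, and blue components of each palette entry, while EGA allowed only two bits per component. That is the entire arithmetic behind the figures above, as this throwaway C snippet (my own illustration, not anything drawn from IBM's documentation) confirms:

/* A quick back-of-the-envelope check: VGA's DAC holds 6 bits per
   red/green/blue channel, EGA's palette only 2 bits per channel, which is
   exactly where the 262,144 and 64 figures come from. */
#include <stdio.h>

int main(void)
{
    long vga_levels = 1L << 6;   /* 64 intensity levels per channel */
    long ega_levels = 1L << 2;   /*  4 intensity levels per channel */

    printf("VGA palette: %ld colors\n", vga_levels * vga_levels * vga_levels);
    printf("EGA palette: %ld colors\n", ega_levels * ega_levels * ega_levels);
    return 0;
}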

All of these new developments marked IBM’s last great gifts to the standard they had birthed — gifts destined to long outlive the PS/2 line itself. The mouse connection IBM developed, for instance, remained a standard well beyond the millennium, with so-called “PS/2” connectors remaining common jargon, used by younger tech-heads and system builders who likely had only the vaguest idea from whence the usage derived. The VGA standard proved even longer-lived. It still survives today as the lowest-common-denominator baseline for computer displays, while ports matching the specification defined by IBM all those years ago remain on the back of every monitor and television set.

Ironically given IBM’s laser focus on using the PS/2 line to secure their dominance of business computing, its technical innovations ultimately proved most important in making the architecture viable as a proposition for the home, paving the way for the Microsoft-dominated second home-computer revolution of the 1990s. With good graphics falling into place at last thanks to VGA and the raw power of the 32-bit 80386, only two barriers remained to making PC-compatible machines realistic rivals to the likes of the Amiga as compelling home computers: decent sound to replace those atrocious beeps and squawks, and a decent price.

The first problem wouldn’t be a problem at all for very much longer. The first gaming-focused sound cards began to reach the market within a year of the PS/2 line’s debut, and by 1989 Creative Music Systems and Ad Lib both offered popular cards at street prices of $200 or less.

But the prices of home-oriented systems incorporating all of the PS/2 line’s innovations — MCA excepted — would, alas, take a little longer to fall. As late as July of 1989, when the VGA standard was already more than two years old, Computer Gaming World ran an article titled “Is VGA Worth It?” that seriously questioned whether it was indeed worth the still very considerable expense — VGA boards still cost $500 or more — to so equip a machine, especially given how few games supported VGA at that point. Nor did the 80386 find an immediate place in homes. As the 1980s turned into the 1990s, the newer chip was still a piece of pricey exotica in terms of the average consumer’s budget; the vast majority of the Intel-based PCs that were in consumers’ homes were still built around the 80286 or even the venerable old 8088.

Still, in the long run prices could only fall in such a hyper-competitive market. Given Commodore’s lackadaisical attitude toward improving the Amiga and Apple’s almost complete neglect of the consumer market in their eagerness to force the Macintosh into the offices of corporate America, the emerging standard of a 32-bit Intel-based PC with VGA graphics and a sound card came to the fore effectively unopposed. With the Internet having yet to emerge as home computing’s killer app to end all killer apps, it was games that drove this shift. In 1989, an Amiga was still the ultimate gaming computer. By 1991, it was an afterthought for American game publishers, the market being absolutely dominated by what was now starting to be called the “Wintel” standard. While game consoles and mobile devices have come and gone by the handful over the years since, in the realm of desktop- and laptop-based personal computing the heirs of the original IBM PC remain the overwhelming standard to this day. How ironic that this decades-long dominance was ensured by the PS/2, simultaneously the downfall of IBM and the savior of the inadvertently standard architecture IBM created.

(Sources: the books Big Blues: The Unmaking of IBM by Paul Carroll, Open: How Compaq Ended IBM’s PC Domination and Helped Invent Modern Computing by Rod Canion, and Hard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace and Jim Erickson; Byte of June 1987, July 1987, August 1987, and December 1987; Compute! of June 1988, January 1989, and March 1989; Computer Gaming World of July 1989 and September 1989; Wall Street Journal of September 14, 1988; the episodes of The Computer Chronicles titled “Intel 386 — The Fast Lane,” “IBM Personal System/2,” and “Bus Wars.”)


  1. The full story of OS/2 and the Presentation Manager and their relationship to Microsoft Windows and even Apple’s MacOS is a complex yet fascinating one, but also one best reserved for a future article where I can give it its proper due. 

 
 


Starflight

Fair warning: this article spoils the ending to Starflight, although it doesn’t spoil the things you need to know to get there.

Starflight, one of the grandest and most expansive games of the 1980s, was born in the cramped confines of a racquetball court. Rod McConnell, a businessman who had been kicking around Silicon Valley for some years, happened to have as his regular playing partner Joe Ybarra, a former Apple executive who in late 1982 had decamped to join Trip Hawkins’s fledgling new Electronic Arts as a game “producer.” Intrigued by Ybarra’s stories of “electronic artists” and an upcoming revolution in entertainment based on interactivity, McConnell wondered if he might somehow join the fun. He thus started discussing ideas with a programmer named Dave Boulton.

Boulton, who died in 2009, is the unsung hero of Starflight. His involvement with the project wouldn’t last very long, but his fingerprints are all over the finished game. He was, first and foremost, a zealot for the Forth programming language. He was one of the founding members of the Forth Interest Group, which was established just at the beginning of the PC era in 1977 and did stellar work to standardize the language and bring it to virtually every one of the bewildering variety of computers available by the early 1980s. More recently his hacking had led him to begin exploring the strange universe of Benoit Mandelbrot’s fractal sets fully eighteen months before Rescue on Fractalus! would make fractals a household name for gamers and programmers everywhere. Boulton enticed McConnell with an idea much bigger than Lucasfilm’s simple action game: an almost infinitely vast planet which, thanks to the miracle of fractals, the player could roam at will.
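To give a sense of what that fractal magic amounts to in practice, here is a minimal sketch in C of midpoint displacement, one of the classic fractal-terrain techniques of the general sort Boulton was exploring; it is my own generic illustration, not a reconstruction of his Forth code. Each pass halves the interval and nudges the new midpoint by a shrinking random amount, so an arbitrarily detailed landscape unfolds from nothing more than a seed and a couple of tuning parameters.

#include <stdio.h>
#include <stdlib.h>

#define SIZE 257   /* 2^8 + 1 sample points along one slice of terrain */

int main(void)
{
    static double height[SIZE];
    double roughness = 64.0;
    int step, i;

    srand(1234);                        /* one seed reproduces the whole landscape */
    height[0] = height[SIZE - 1] = 0.0; /* fixed endpoints */

    /* Repeatedly halve the interval, displacing each new midpoint by a
       random amount that shrinks as the features get smaller. */
    for (step = SIZE - 1; step > 1; step /= 2, roughness /= 2.0) {
        for (i = 0; i + step < SIZE; i += step) {
            double mid = (height[i] + height[i + step]) / 2.0;
            double jitter = ((double)rand() / RAND_MAX - 0.5) * roughness;
            height[i + step / 2] = mid + jitter;
        }
    }

    /* Print a sampling of the resulting elevations. */
    for (i = 0; i < SIZE; i += 32)
        printf("%4d: %7.2f\n", i, height[i]);
    return 0;
}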

McConnell founded a company named Ambient Design and hired a couple of young programmers to help Boulton. One, Alec Kercso, was just finishing a degree in Linguistics in San Diego, but was more interested in his hobby of hacking. The other, Bob Gonsalves, was another dedicated Forther who wrote a monthly column on the language for Antic magazine. He was hired on the basis of this as well as his intimate familiarity with the Atari 8-bit platform, which thanks to its audiovisual capabilities was the great favorite around EA circles during that first year or so, until the Commodore 64 came online in earnest. On the strength of McConnell’s friendship with Ybarra and little else — the whole group had among them neither game-development experience nor any real plan for what their game would be beyond a vast environment created with fractals — EA signed them as one of the first of their second wave of contracts, following the premiere of the initial six, reputation-establishing EA games. Ybarra would be their producer, their main point of contact with and advocate within EA. He would have his work cut out for him in the years to come.

The idea soon evolved to encompass not just a single planet but many. The game, to be called Starquest, would let you fly in your starship across an entire galaxy of star systems, each with planets of its own, each of which would in turn be its own unique world, with unique terrain, weather, life forms, and natural resources. For Boulton, the man who had gotten this ball rolling in earnest in the first place, it was suddenly getting to be too much. You just couldn’t do all that on an 8-bit computer, he said, not even with the magic combination of Forth and fractals. He walked away. He would go on to develop early software for the Commodore Amiga and to join another unheralded visionary, Jef Raskin, the originator of the Apple Macintosh project, in working on Raskin’s innovative but unsuccessful Canon Cat.

Left on their own with only Boulton’s prototype fractal code to guide them, Kercso and Gonsalves felt in over their heads. They needed to be able to show each planet as a rotating globe from space, complete with the fractal terrain that the player would be able to explore more intimately if she elected to land, but didn’t know how to map the terrain onto a sphere. McConnell soon found another programmer, Tim Lee, who did. Lee had already written firmware for Texas Instruments calculators and developed very complex policy-analysis applications for life-insurance companies. Yet another Forth fan, he’d just finished writing an actual game in the language, an IBM PC port of the Datasoft action game Genesis which Datasoft would never ship due to its incompatibility with the PCjr. With the graphics code he’d developed for that project, plus his own explorations of fractal programming, Lee was more than up to rendering those spinning terrain-mapped globes.
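The essence of what Lee worked out is easier to see in miniature than to describe. The sketch below is my own illustration of the standard approach rather than Lee's actual code: for every pixel inside the planet's disc, recover the point on the sphere that pixel represents, convert it to latitude and longitude, add the current spin to the longitude, and look the result up in a flat terrain map. Run repeatedly with an increasing rotation angle, the same loop yields a spinning, terrain-mapped globe.

#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define MAP_W  64
#define MAP_H  32
#define RADIUS 11

int main(void)
{
    char terrain[MAP_H][MAP_W];   /* stand-in terrain; the real thing would be fractal */
    double rotation = 0.6;        /* current spin angle in radians */
    int x, y, i, j;

    for (i = 0; i < MAP_H; i++)
        for (j = 0; j < MAP_W; j++)
            terrain[i][j] = ((i * 7 + j * 13) % 5 < 2) ? '#' : '.';

    for (y = -RADIUS; y <= RADIUS; y++) {
        for (x = -RADIUS; x <= RADIUS; x++) {
            double nx = (double)x / RADIUS, ny = (double)y / RADIUS;
            double d = nx * nx + ny * ny;
            if (d > 1.0) { putchar(' '); continue; }   /* outside the planet's disc */
            {
                double nz  = sqrt(1.0 - d);            /* depth on the visible hemisphere */
                double lat = asin(ny);                 /* -pi/2 .. pi/2 */
                double lon = atan2(nx, nz) + rotation; /* spin the globe by offsetting longitude */
                int u = (int)((lon / (2.0 * M_PI) + 0.5) * MAP_W);
                int v = (int)((lat / M_PI + 0.5) * (MAP_H - 1));
                putchar(terrain[v][((u % MAP_W) + MAP_W) % MAP_W]);
            }
        }
        putchar('\n');
    }
    return 0;
}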

One of Tim Lee’s spinning terrain-mapped planets. He was also responsible for most of the fundamental low-level architecture of the game.

Lee also brought with him his programming expertise on the IBM PC. This prompted the team to take a big step: to abandon their little 8-bitters and move to the bigger 16-bit MS-DOS machines. They had recognized that Boulton had been right: their ideas were just too big to fit into 8 bits. MS-DOS was just finishing up its trouncing of CP/M to become undisputed king of the business-computing world, but had managed little penetration into homes, which were still dominated by the likes of the Apple II and Commodore 64. On the one hand, the IBM was a terrible gaming platform: its CGA graphics could show only four colors at a time in palettes that seemed deliberately chosen to clash as horribly with one another as possible and give artists nightmares; its single speaker was capable of little more than equally unpleasant bleeps and farts; even standard gaming equipment like joysticks was effectively non-existent due to a perceived lack of demand. But on the other hand, the IBM was an amazing gaming platform, with several times the raw processing power of the 8-bitters and at least twice the memory. Like so much in life, it all depended on how you looked at it. Ambient Design decided they needed the platform’s advantages to contain a galaxy that would eventually encompass 270 star systems with 811 planets between them, and they’d just have to take the bitter with the sweet. Still, it’s unlikely that EA would have gone along with the idea had it not been for the imminent release of the PCjr, which was widely expected to do in home computing what its big brother had in business computing.

About this point the last and arguably biggest piece of the development puzzle arrived in the form of Greg Johnson, Kercso’s roommate. Not much of a programmer himself, Johnson had, like his roommate, also just finished a degree and wasn’t quite sure what to do next. He had listened avidly to Kercso’s reports on the game’s progress, and eventually started drawing pictures of imagined scenes on his Atari 800 just for fun. He was soon coming up with so many pictures and, more importantly, ideas that Kercso got him an interview and McConnell hired him. Just like that, Johnson became the much-needed lead designer. Until now the team had been focused entirely on the environment they were trying to create, giving little thought to what the player would be expected to actually do there. As Kercso would later put it, what had been an “open-ended game of exploration” now slowly began to evolve into “a complex story with interwoven plots and twists.” Johnson himself later said his job became to come up with what should happen, the others to come up with how it could happen. Or, as Lee put it, Johnson designed the scenario while the others designed “the game system that you could write the scenario for.” And indeed, he proved to be a boundless fount of creativity, coming up with seven unique and occasionally hilarious alien races for the player to fight, trade, and converse with during her travels.

Critical to those conversations was a designer we’ve met before on this blog, Paul Reiche III, who spent two important weeks helping Johnson and his colleagues to hash out a workable conversation engine which made use of the system of conversation “postures” from a game he had co-designed with Jon Freeman, Murder on the Zinderneuf. Reiche, an experienced designer of tabletop RPG rules and adventures as well as computer games, continued to offer Johnson, who had heretofore thought of himself as a better artist than writer or designer, advice and ideas throughout the game’s protracted development.

The system of conversation “postures” from Murder on the Zinderneuf.

Starflight’s implementation of conversation postures.

“Protracted” is perhaps putting it too mildly. The process just seemed to go on forever, so much so that it became something of a sick running joke inside EA. The project appeared on more than three years’ worth of weekly status reports, from the time that EA was mature enough to have weekly status reports until the game’s belated release in August of 1986. Over that time the arcades and home game consoles crashed and burned; the home-computer industry went through its own dramatic boom and bust and stabilization; countless platforms came and went, not least among them the PCjr; EA gave up on the dream of revolutionizing mainstream home entertainment and accepted the status of big fish in the relatively small pond of computer gaming; IBM achieved business-computing domination and then ceded it almost as quickly thanks to the cloners; the bookware craze came and went; Infocom rose to dizzying heights and crashed to earth just as quickly; the Soviet Union went from an Evil Empire to a partner in nuclear-arms control. And still, ever and anon, there was Starflight, the much more elegant name chosen for Starquest after the release of Sierra’s King’s Quest. McConnell’s company name changed as well before all was said and done, from Ambient Design to Binary Systems (get it?).

EA very nearly lost patience several times; McConnell credits his old friend Joe Ybarra with personally rescuing the project from cancellation on a number of occasions. With the contract structured to provide payments only after milestones that were few and increasingly far between, McConnell himself took personal loans and worked other jobs so as to be able to pay his team a pittance. Throughout it all he never lost faith, despite ample evidence that they didn’t, to be painfully blunt, entirely know what they were doing. The team members lived on savings or loans when their meager salaries ran out. Many months were consumed by fruitless wheel-spinning. As Lee later admitted, they were so entranced with this model universe that they “spent a lot of time trying to model things that didn’t add to the play of the game.” Forth was never the most readable language nor an ideal choice for a large group project, and as the project wandered off in this or that direction and back again the code got nightmarishly gnarly. This just made trying to modify or add to it take still longer. With McConnell only able to afford a tiny office and most of the team thus working remotely most of the time, just keeping everyone on the same page was difficult. Given the situation and the personalities involved, a certain amount of freelancing was inevitable. “There was no master plan detailing each and every task to be done,” said Kercso later. “We had an idea of what the major modules had to be and we added a lot of final design as we got into programming each of the modules”; then they did their best to mash it all together.

Starflight was a prototypical runaway, mismanaged, overambitious project, the likes of which the industry has seen many times since. The difference was, instead of being ignominiously cancelled or shoved out the door incomplete, Starflight somehow ended up amazing. Call it serendipity, or credit it to a team that just wouldn’t give up. Once the core group was assembled, nobody thought of quitting; everyone was determined to finish the game — and on its own original, insanely ambitious terms at that — or die trying. “I remember saying that I didn’t care if I died after it came out,” said Johnson later, but “please, God, let me live until then.”

The hopeless muddle of a combat engine.

Some of the confusion and occasional lack of direction is visible in the final game. Even the biggest Starflight fan would have trouble praising the arcade-style in-space combat engine, for instance, which manages to be both far too simplistic and far too baffling to actually control. There’s a disconnected feeling to certain elements, as of clever ideas that were never fully woven into the holistic design. You can gather flora and fauna from the planets you visit and return them to your base to study, for example, but you make so little money from doing so as opposed to mining minerals — and the controls for stunning and gathering your specimens are once again so awkward — that you’re left wondering what the point is. Ditto most of the intriguing alien artifacts you find, which you cart excitedly back to base only to find that they “reveal very little of interest” and are “totally useless to us.” And the game has what should be a fatal case of split personality, being half stately space opera and half silly romp filled with sci-fi alien caricatures.

And yet it really doesn’t matter. Starflight is that rare piece of work that actually justifies the critic’s cliché of being more than the sum of its parts. It’s not a tight design; appropriately given its theme, it sprawls everywhere, sometimes seemingly uselessly so. But even its blind alleys are fascinating to wander down once or twice. It’s the opposite of a minimalist masterpiece like M.U.L.E., whose every last note is carefully considered and exhaustively tested and blended carefully into the whole. And you know what? It’s every bit as awesome.

But for the benefit of those of you who haven’t played it, it’s really high time that I tell you what the game’s all about, isn’t it?

Your home starbase, where you outfit your ship, select and train your crew, buy and sell equipment and resources, etc. It was largely the work of Alec Kercso.

Starflight starts you off at your base on your home planet of Arth — no, that’s not a typo — with a rather shabbily equipped ship and a little bit of seed capital. If you’re smart, you’ll spend most of the latter training a crew, which will include, in the tradition of a certain classic television series that went where no man has gone before, a Captain, a Science Officer, a Navigator, an Engineer, a Communications Officer, and a Doctor. You’ll also need to save enough to add a cargo pod or three to your ship, so you can begin to earn money by landing on nearby planets and scooping up minerals for sale back at Arth. You need money to upgrade your ship with better engines, weapons, and defenses, to train your crew, and to buy something called endurium, if you can’t find or mine enough of it. Endurium is Starflight‘s equivalent to dilithium crystals, the semi-magical fuel that enables faster-than-light travel.

As you build up your ship and your bank account, you can travel ever farther from Arth, exploring an algorithmically generated galaxy so vast that, like the Fibonacci galaxies of Elite, even Starflight‘s creators hadn’t seen all of it before the game’s release. And so you fly and land where you will, searching for mineral-rich planets you can mine and, even better, habitable planets you can recommend for colonization; you receive a substantial finder’s fee in return for each recommendation. Alien races inhabit various sectors of the galaxy. Some you may be able to befriend, or at least achieve a level of mutual toleration with; others you’ll have to fight. Thus the need to fit out your ship with the best possible weapons and defenses.
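That phrase "algorithmically generated" is doing a lot of work, so a word on the underlying trick may be useful. The toy sketch below illustrates the general principle rather than Binary Systems's actual method: if every world's attributes are derived from a deterministic function of its coordinates, the program need not store the whole galaxy anywhere; it can conjure up the same planet, identically, every time you return to it, and only the changes the player makes need ever be written to disk.

#include <stdio.h>

/* A hypothetical mixing function; any reasonable deterministic hash will do. */
static unsigned long mix(unsigned long x)
{
    x ^= x >> 13;
    x *= 2654435761UL;   /* Knuth's multiplicative hashing constant */
    x ^= x >> 17;
    return x;
}

int main(void)
{
    int star, planet;

    /* The same (star, planet) pair always hashes to the same "world,"
       so none of the values below ever needs to be stored. */
    for (star = 0; star < 3; star++) {
        for (planet = 0; planet < 3; planet++) {
            unsigned long h = mix(star * 811UL + planet + 1UL);
            printf("star %d, planet %d: mineral richness %lu%%, surface temp %lu K\n",
                   star, planet, h % 100, 120 + (h / 100) % 600);
        }
    }
    return 0;
}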

Exploring the surface of a planet. This module was largely the work of Bob Gonsalves.

This, then, is Starflight the sandbox game. While it’s in no way derivative of Elite — Starflight’s creators couldn’t have even been aware of the older game until quite late in their own development cycle, since Elite didn’t reach American shores until late 1985 — Starflight does generate a similar compulsion to explore, an addictive need to see what all is out there. But everything about Starflight is richer and more complex, with the exception only of the combat system that was the heart of Elite but a mere afterthought in Starflight (if you had to spend much time in Starflight actually fighting, it would be a very, very bad game). With so much more computing horsepower at their disposal, Binary Systems was able to add layer after intriguing layer: the ability to land on planets, and once there to engage in an exploring and mining mini-game that is as absurdly addictive as it is superficially simplistic; the chance to converse with the aliens you meet instead of just shooting at them; the whole CRPG angle of training a crew and keeping them healthy; sensor- and Navigator-confounding nebulae and wormholes to negotiate. Whereas Elite sessions soon settle down into a comfortable routine of trade-jump-fight-dock, rinse and repeat forever, Starflight always seems to have something new to throw at you.

But the most important difference is the plot that Starflight layers over its sandbox. I realize everyone is different on this point, but personally I always have a little bit of trouble with purely open-ended games (see my review of Seven Cities of Gold for another example). When I play Elite I eventually start to get bored for lack of any real narrative or goal to shoot for beyond the almost impossible one of actually becoming Elite. Ian Bell and David Braben originally wanted to include a real plot, but there just wasn’t room to contain it inside a 32 K BBC Micro. Starflight, however, has the sort of plot-driven direction that Elite so painfully lacks.

So, having told you what you can do in Starflight, let me now tell you why you do it. Evidence has recently turned up on Arth that the planet’s inhabitants did not evolve there; that it was colonized at some point in the distant past, that the colonists regressed into barbarism due to war or other pressures, and that only now has civilization recovered. A cache of old documents has also revealed the secrets of endurium and faster-than-light travel. All of which is great, except that Arth has even bigger fish to fry. A strange wave is spreading across the galaxy, causing stars to flare — with disastrous results for any orbiting planets — as it strikes them. Thus your mission is not just to explore and get rich, but to discover the source of the wave and to stop it before it reaches Arth.

Starflight has an unusually elaborate plot for its day, but unlike in so many more recent games it never straitjackets you to it. The plot is more backstory than story. The game is essentially a big scavenger hunt, sending you off to reconstruct quite a complicated galactic history. Follow the trail long enough and you should turn up the clues and objects you need to end the threat to Arth and the galaxy by blowing up a certain Crystal Planet that’s the source of all the trouble. There’s not all that much that you actually need to do to beat the game when you know how. In fact, you can do it in less than two game days. It’s the clue- and object-scavenging that’s all the fun, the process of putting the pieces of the backstory together. When you discover Earth, for example — yes, those original colonizers of Arth came, inevitably, from Earth — it gives a thrill when you first look down on those familiar continents from orbit. Other pieces of the puzzle are almost equally thrilling when they come to light. If you’re playing cold, sans walkthrough — which is honestly the only way to play; you’ll otherwise just be left wondering what all the fuss is about — you’ll need to look everywhere for clues: to the occasional emails you receive from your overseers on Arth; to messages and artifacts you find on the planets; to the map and other materials included in the game package. And, most importantly, you need to talk at length to all those aliens, a goofy and amusing rogue’s gallery of sci-fi clichés. They’re the silly part of this odd mixture of stately epic and silly romp — but they’re so much fun we’ll take them just as they are, cognitive dissonance be damned.

The Elowan, a race of plant-like hippies who evince peace and love along with passive aggression.

The Thrynn, who have such weird issues with the Elowan that they’ll attack if you have one in your crew.

The unforgettably loathsome Spemin, who lack backbone — literally.

The Mechans, a group of robots who think you’re just what they’ve been waiting for all these millennia.

Now, this plot-as-scavenger-hunt approach to gameplay is hardly an innovation of Starflight. The Ultima games in particular had long been trolling these waters by the time it appeared. The breadcrumb-following approach to game design always gives rise to the possibility of getting yourself stuck because you’ve missed that one little thing in this absurdly vast virtual world on which all further progress depends. Yet there is a difference between Starflight and Ultima in this respect, a difference not so much in kind as in quality. Starflight is a much more friendly, generous game. Whereas Ultima seems to relish making you fail by hiding vital clues in the most outlandish places or behind the most unlikely parser keywords, there’s a sense that Starflight really wants you to succeed, wants you to solve the mystery and save the galaxy. There are multiple ways to learn much of what you need to know, multiple copies of some vital artifacts hidden in completely different places, multiple solutions to most of the logistic and diplomatic puzzles it sets before you. Yes, there’s a time limit, what with Arth’s sun destined eventually to flare, but even that is very generous, operating more as a way to lend the game excitement and narrative urgency than a way to crush you for failing some hardcore gamer test. Its generosity is not absolute: in my own recent playthrough I had to turn to a walkthrough to learn that you need to be obsequious when you talk to the Veloxi or they’ll never share an absolutely vital piece of information that I don’t think you can glean anywhere else (remember that, would-be future players!). Still, even these few choke points feel more like accidents than deliberate cruelties strewn in your path by cackling designers. Starflight really does feel like a welcome step toward a more forgiving, inclusive sort of gaming.

Spoilers begin!

No discussion of Starflight‘s plot can be complete without the shocker of an ending. When you finally arrive at the Crystal Planet and are preparing to destroy it, everything suddenly gets deeply weird via a message from an earlier visitor:

I can hardly believe it! Those weird lumps are actually intelligent life. The Ancients are endurium! And we have spent hundreds of years hunting them to burn for fuel in our ships. Their metabolism is so much slower than ours that they live in an entirely different time framework. I don’t think they even know we are sentient. I believe it was only because of a link thru the Crystal Planet that contact was made at all. This Crystal Planet was their last defense. I can hardly blame them. Carbon-based life must have seemed something like a virus to them.

Despite this discovery, the only option — other than to simply stop playing — is to blow up the Crystal Planet anyway, thus annihilating the home planet of a race much more ancient and presumably more sophisticated than your own. It’s a strange, discordant sort of ending to say the least. Some have made much of it indeed; see for instance an earnest article in Envirogamer that would make of Starflight an elaborate allegory for our environmental problems of today, with endurium representing fossil fuels and the stellar wave standing in for global warming. I’m not really buying that. Not only does the article strike me as anachronistic, an argument born of 2009 rather than the mid-1980s, but I somehow have trouble seeing Starflight as a platform for such deliberate messaging; it strikes me as a grand escapist space opera, full stop, without any rhetorical axe to grind.

But is Starflight‘s ending a deliberate subversion of genre convention, like the controversial ending to Infidel? Maybe. It’s not as if the game is without a certain melancholia in other places; for instance, you’ll occasionally meet a race of interstellar minstrels who roam the spacelanes singing sad songs about glories that used to be. Yet after you do the bloody deed on the Crystal Planet the game immediately leaps back to unabashed triumphalism, gifting you with medals and music and glory and a chance to roam the galaxy at your leisure with an extra 500,000 credits in your account, burning genocidal quantities of endurium as you do so with nary a moral qualm to dog your steps. What to make of this seeming obliviousness to the ramifications of what you’ve just done? You’ll have to decide for yourself whether it represents subtlety or just a muddle of mixed messages that got away from their creators. It’s just one more of the layers within layers that make Starflight so memorable for just about everyone who plays it.

Spoilers end!

In addition to its innovations in the softer arts of design and narrative, Starflight has one final, more concrete quality that sets it apart from what came before, one that can be very easy to overlook but is nevertheless of huge importance. It’s the first of these big open-world games that offers a truly persistent virtual world to explore. Due to the limitations of 8-bit floppy-disk drives, earlier games all fudge persistence in some way. The Wizardry and Bard’s Tale games, for instance, save only the state of your characters between sessions. Everything else resets to its original state as soon as you leave the game, or, indeed, just travel between dungeon levels or between the dungeon and the city. Amongst numerous other oddities, this means that you can actually “win” these games over and over with the same group of characters; the games literally forget you’ve done so almost immediately after showing you the victory screen. The 8-bit Ultimas do only a little better: the outdoor world map does persist along with the details of your characters, but cities and dungeons, again, reset themselves ad infinitum. You can go into a town, murder a dozen guards and rob every shop in town, then exit and return to find all restored to peace and tranquility again. Indeed, solving the early Ultimas is virtually dependent on you doing exactly this.

Starflight, however, is different. Its whole vast galaxy remembers what you have done, on both a macro- and micro-scale. If you discover a juicy new planet, name it, and recommend it for colonization, it goes by that name for the rest of the game. If you befriend or piss off a given alien race, they don’t forget you or what you’ve done. If you strip-mine a planet of all its minerals, they don’t reappear the next time you land on it. If you make notes in your “Captain’s Log” (a first in itself), they remain there until you delete them. If you blow up an alien race’s home planet thereby destroying their entire civilization, it stays blown up. This is a huge step forward for verisimilitude, one enabled by Binary Systems’s choice to throw caution to the wind and target the bigger, more capable MS-DOS machines.1

As Starflight neared release at last, it was very much an open question whether it would find an audience. By now the PCjr had come and gone — just as well given that the game’s memory requirements had ballooned past that machine’s standard 128 K to a full 256 K. No one had ever released such a major title exclusively on MS-DOS. Normally if that platform got games at all they were ports of titles that had already proved themselves on the more popular gaming machines, delivered months or years after their debuts elsewhere. Binary Systems and EA could only make sure Starflight supported the popular Tandy 1000’s enhanced graphics and hope for the best.

The best was far better than they had bargained for: initial sales far exceeded the most optimistic expectations, leaving EA scrambling to produce more copies to fill empty store shelves. It would eventually sell well over 100,000 copies on MS-DOS alone, a major hit by the standards of any platform. Starflight placed owners of other computers in the unaccustomed position of lusting after a game on MS-DOS of all places, a platform most had heretofore viewed with contempt. Appearing as it did even as owners of the new generation of 68000-based machines were reveling in their Macs, Amigas, and Atari STs, Starflight was an early sign of a sea change that would all but sweep those platforms into oblivion within five years or so. With it now clear that a market of eager MS-DOS gamers existed, the platform suddenly became a viable first-release choice for publishers and developers. Only years later would Starflight belatedly, and not without much pain given the unique Forthian nature of its underpinnings, be ported to the Amiga, ST, Macintosh, Sega Genesis, and even the little Commodore 64 — the latter of which would probably have been better bypassed. It would sell at least 200,000 more copies on those platforms, a nice instance of creativity and sheer hard work being amply rewarded for Rod McConnell’s idealistic little team of five.

Most of Binary Systems stayed together long enough to craft a fairly workmanlike sequel, Starflight 2: Trade Routes of the Cloud Nebula, released in 1989 for MS-DOS just as the first ports of the original game were reaching those other platforms. It was playable enough, but somehow lacked some of the magic of the original. Most then drifted away from the games industry, with only Greg Johnson continuing to work intermittently as a designer, most notably of the Toejam & Earl games for the Sega Genesis. Starflight had been such an all-consuming, exhausting labor of love that perhaps it was hard for the others to imagine starting all over again on another, inevitably less special project. Making Starflight had been the sort of experience that can come only once in a lifetime; anything else they did in games afterward would have been anticlimactic.

If we’re looking for something to which to attribute Starflight‘s success, both commercially and, more importantly, artistically, we’re fairly spoiled for choice. Alec Kercso credits the way that he and his colleagues were allowed to work “organically,” experimenting to see what worked and what didn’t. Credit also the odd idealism that clung to the team as a whole, which prompted them to never back away from their determination to make something bigger and qualitatively different than anything that had come before, no matter how long it took. Credit Joe Ybarra and the management of EA who, skeptical as they may sometimes have been, ultimately gave Binary Systems the time and space they needed to create a masterpiece. Credit Rod McConnell for giving his stable of talented youngsters the freedom to create whilst finding a way to keep the lights on through all those long years. And credit, almost paradoxically, the limited technology of the era. With their graphics capabilities sharply limited, the team was free to concentrate on building an interesting galaxy, full of interesting things to do, and to tweak it endlessly, without needing to employ dozens of artists and 3D modellers to represent every little idea; tellingly, the only artist on the team was Johnson, who was also the lead designer. And of course credit Johnson for giving the game a plot and an unforgettable, quirky personality all its own, without which all of its technical innovations would have added up to little.

There’s a stately dignity to Starflight even amidst all the goofy gags, a sense of something grand and fresh and new attempted and, even more remarkably, actually realized. Few games have ever captured that science-fictional sense of wonder quite this well. If you start playing it — and that’s very easy to do now; Starflight 1 and 2 are both available in one-click-installable form from GOG.com — you might just find yourself lost in its depths for a long, long time. This, my friends, is one of the great ones.

(Useful vintage articles on Starflight include an interview with Rod McConnell in the March 1987 Computer Gaming World and especially one with Tim Lee in the July/August 1987 Forth Dimensions. Alec Kercso wrote about the game in, of all places, Jonathan S. Harbour’s Microsoft Visual BASIC Programming with DirectX. Good recent articles include ones in The Escapist, Gamasutra, and Sega-16. Tim Lee gave part of the source code and many design documents to Ryan Lindeman. He once made even more source and documents available online, some of which can still be recovered via the Internet Archive’s Wayback Machine. Similarly available only via the Wayback Machine is Miguel Etto’s technical analysis of the game’s Forthian underpinnings.)


  1. This is not to say that all is smooth sailing. Starflight constantly saves updated versions of its data files to disk as you play. It then relies on you to “commit” all of these changes by cleanly exiting the game from the menu. If you ever exit without properly saving, or get killed, your game becomes unplayable until you reset it back to its original data — whereupon you have the joy of starting all over. My advice is to make backups of the files “STARA.COM” and “STARB.COM” after every successful session; then if you get killed or have some other problem you can just copy these back into the game’s directory to get back to a good state. Or, if you like, here’s a DOSBox startup script you can use to automatically keep a few generations of states. Just copy it into the “[autoexec]” section of the game’s “.conf” file, editing as needed to suit your directory names. 

 
 


Send in the Clones

In computer parlance, a clone is Company B’s copycat version of Company A’s computer that strains to be as software and hardware compatible with its inspiration as possible. For a platform to make an attractive target for cloning, it needs to meet a few criteria. The inspiration needs to be simple and/or well-documented enough that it’s practical for another company — and generally a smaller company at that, with far fewer resources at its disposal — to create a compatible knock-off in the first place. Then the inspiration needs to be successful enough that it’s spawned an attractive ecosystem that lots of people want to be a part of. And finally, there needs to be something preventing said people from joining said ecosystem by, you know, simply buying the machine that’s about to be cloned. Perhaps Company A, believing it has a lock on the market, keeps the price above what many otherwise interested people are willing or able to pay; perhaps Company A has simply neglected to do business in a certain part of the world filled with eager would-be buyers.

Clones have been with us almost from the moment that the trinity of 1977 kicked off the PC revolution in earnest. The TRS-80 was the big early winner of the trio thanks to its relatively low price and wide distribution through thousands of Radio Shack stores, outselling the Apple II in its first months by margins of at least twenty to one (as for the Commodore PET, it was the Bigfoot of the three, occasionally glimpsed in its natural habitat of trade-show booths but never available in a form you could actually put your hands on until well into 1978). The first vibrant, non-business-focused commercial software market in history sprung up around the little Trash 80. Cobbled together on an extreme budget out of generic parts that were literally just lying around at Radio Shack — the “monitor,” for instance, was just a cheap Radio Shack television re-purposed for the role — the TRS-80 was eminently cloneable. Doing so didn’t make a whole lot of sense in North America, where Radio Shack’s volume manufacturing and distribution system would be hard advantages to overcome. But Radio Shack had virtually no presence outside of North America, where there were nevertheless plenty of enthusiasts eager to join the revolution.

A shindig for EACA distributors in Hong Kong. Shortly after this photo was taken, Eric Chuang, third from right in front, would abscond with $10 million and that would be that for EACA.

The most prominent of the number of TRS-80 cloners that had sprung up by 1980 was a rather shady Hong Kong-based company called EACA, who made cheap clones for any region of the world with distributors willing to buy them. Their knock-offs popped up in Europe under the name “The Video Genie”; in Australasia as the “Dick Smith System 80,” distributed under the auspices of Dick Smith Electronics, the region’s closest equivalent to Radio Shack; even in North America as the “Personal Micro Computers PMC-80.” EACA ended in dramatic fashion in 1983 when founder Eric Chuang absconded to Taiwan with all of his company’s assets that he could liquify, $10 million worth, stuffed into his briefcase. He or his descendants are presumably still living the high life there today.

By the time of those events, the TRS-80’s heyday was already well past, its position as the most active and exciting PC platform long since having been assumed by the Apple II, which had begun a surge to the fore in the wake of the II Plus model of 1979. The Apple II was if anything an even more tempting target for cloners than the TRS-80. While Steve Wozniak’s hardware design is justly still remembered as a marvel of compact elegance, it was also built entirely from readily available parts, lacking the complex and difficult-to-duplicate custom chips of competitors like Atari and Commodore. Wozniak had also insisted that every last diode on the Apple II’s circuit board be meticulously documented for the benefit of hackers just like him. And Apple, then as now, maintained some of the highest profit margins in the industry, creating a huge opportunity for a lean-and-mean cloner to undercut them.

A Franklin Ace 1000 mixed and matched with a genuine Apple floppy drive.

Assorted poorly distributed Far Eastern knock-offs aside, the first really viable Apple II clone arrived in mid-1982 in the form of the Franklin Ace line. The most popular model, the Ace 1000, offered complete hardware and software compatibility for about 25 percent less than a II Plus, while also having more memory as well as luxuries like a numeric keypad and upper- and lowercase letter input. The Ace terrified Apple. With the Apple III having turned into a disaster, Apple remained a one-platform company, completely dependent on continuing Apple II sales — and continuing high Apple II profit margins — to fund not one but two hugely ambitious, hugely innovative, and hugely expensive new platform initiatives, Lisa and Macintosh. A viable market in Apple II workalikes that cut seriously into sales, or that forced price cuts, could bring everything down around their ears. Already six months before the Ace actually hit the market, as soon as they got word of Franklin’s plans, Apple’s lawyers were therefore looking for a way to challenge Franklin in court and drive their machine from the market.

As it turned out, the basis for a legal challenge wasn’t hard to find. Yes, the Apple II’s unexceptional hardware would seem to be fair game — but the machine’s systems software was not. Apple quickly confirmed that, like most of the TRS-80 cloners, Franklin had simply copied the contents of the II’s ROM chips; even bugs and the secret messages Apple’s programmers had hidden inside them were still there in Franklin’s versions. A triumphant Apple rushed to federal court to seek a preliminary injunction to keep the Ace off the market until the matter was decided through a trial. Much to their shocked dismay, the District Court for the Eastern District of Pennsylvania found the defense offered by Franklin’s legal team compelling enough to deny the injunction. The Ace came out right on schedule that summer of 1982, to good reviews and excellent sales.

Franklin’s defense sounds almost unbelievable today. They readily admitted that they had simply copied the contents of the ROM chips. They insisted, however, that the binary code contained on the chips, being a machine-generated sequence of 1s and 0s that existed only inside the chips and that couldn’t be reasonably read by a human, was not a form of creative expression and thus not eligible for copyright protection in the first place. In Franklin’s formulation, only the human-readable source code used to create the binary code stored on the ROM chips, which Franklin had no access to and no need for given that they had the binary code, was copyrightable. It was an audacious defense to say the least, one which if accepted would tear down the legal basis for the entire software industry. After all, how long would it take someone to leap to the conclusion that some hot new game, stored only in non-human-readable form on a floppy disk, was also ineligible for copyright protection? Astonishingly, when the case got back to the District Court for a proper trial the judge again sided with Franklin, stating that “there is some doubt as to the copyrightability of the programs described in this litigation,” in spite of an earlier case, Williams Electronics, Inc. v. Arctic International, Inc., which quite clearly had established binary code as copyrightable. Only in August of 1983 was the lower court’s ruling overturned by the Federal Court of Appeals in Philadelphia. A truculent Franklin threatened to appeal to the Supreme Court, but finally agreed to a settlement the following January, one that demanded they start using their own ROMs if they wanted to keep cloning Apple IIs.

Apple Computer, Inc., v. Franklin Computer Corp. still stands today as a landmark in technology jurisprudence. It firmly and finally established the copyrightable status of software regardless of its form of distribution. And it of course also had an immediate impact on would-be cloners, making their lives much more difficult than before. With everyone now perfectly clear on what was and wasn’t legal, attorney David Grais clarified the process cloners would need to follow to avoid lawsuits in an episode of Computer Chronicles:

You have to have one person prepare a specification of what the program [the systems software] is supposed to do, and have another person who’s never seen the [original] program write a program to do it. If you can persuade a judge that the second fellow didn’t copy from the [original] code, then I think you’ll be pretty safe.

After going through this process, Apple II cloners needed to end up with systems software that behaved absolutely identically to the original. Every system call needed to take the exact same amount of time that it did on a real Apple II; each of the original software’s various little quirks and bugs needed to be meticulously duplicated. Anything less would bring with it incompatibility, because there was absolutely nothing in those ROMs that some enterprising hacker hadn’t used in some crazy, undocumented, unexpected way. This was a tall hurdle indeed, one which neither Franklin nor any other Apple II cloner was ever able to completely clear. New Franklins duly debuted with the new, legal ROMs, and duly proved to be much less compatible and thus much less desirable than the older models. Franklin left the Apple-cloning business within a few years in favor of hand-held dictionaries and thesauri.

There is, however, still another platform to consider, one on which the cloners would be markedly more successful: the IBM PC. The open or (better said) modular architecture of the IBM PC was not, as so many popular histories have claimed, a sign of a panicked or slapdash design process. It was rather simply the way that IBM did business. Back in the 1960s the company had revolutionized the world of mainframe computing with the IBM System/360, not a single computer model but a whole extended family of hardware and software designed to plug and play together in whatever combination best suited a customer’s needs. It was this product line, the most successful in IBM’s history, that propelled them to the position of absolute dominance of big corporate computing that they still enjoyed in the 1980s, and that reduced formerly proud competitors to playing within the house IBM had built by becoming humble “Plug-Compatible Manufacturers” selling peripherals that IBM hadn’t deigned to provide — or, just as frequently, selling clones of IBM’s products for lower prices. Still, the combined profits of all the cloners remained always far less than those of IBM itself; it seemed that lots of businesses wanted the security that IBM’s stellar reputation guaranteed, and were willing to pay a bit extra for it. IBM may have thought the PC market would play out the same way. If so, they were in for a rude surprise.

The IBM PC was also envisioned as not so much a computer as the cornerstone of an ever-evolving, interoperable computing family that could live for years or decades. Within three years of the original machine’s launch, you could already choose from two CPUs, the original Intel 8088 or the new 80286; could install as little as 16 K of memory or as much as 640 K; could choose among four different display cards, from the text-only Monochrome Display Adapter to the complicated and expensive CAD-oriented Professional Graphics Controller; could choose from a huge variety of other peripherals: floppy and hard disks, tape backup units, modems, printer interfaces, etc. The unifying common denominator amongst all this was a common operating system, MS-DOS, which had quickly established itself as the only one of the four operating paradigms supported by the original IBM PC that anyone actually used. Here we do see a key difference between the System/360 and the IBM PC, one destined to cause IBM much chagrin: whereas the former ran an in-house-developed IBM operating system, the operating system of the latter belonged to Microsoft.

The IBM architecture was different from that of the Apple II in that its operating system resided on disk, to be booted into memory at system startup, rather than being housed in ROM. Still, every computer needs to have some code in ROM. On an IBM PC, this code was known as the “Basic Input/Output System,” or BIOS, a nomenclature borrowed from the CP/M-based machines that preceded it. The BIOS was responsible on startup for doing some self-checks and configuration and booting the operating system from disk. It also contained a set of very basic, very low-level routines to do things like read from and write to the disks, detect keyboard input, or display text on the screen; these would be called constantly by MS-DOS and, very commonly, by applications as well while the machine was in operation. The BIOS was the one piece of software for the IBM PC that IBM themselves had written and owned, and for obvious reasons they weren’t inclined to share it with anyone else. Two small companies, Corona Data Systems and Eagle Computer, would simply copy IBM’s BIOS a la Franklin. It took the larger company all of one day to file suit and force complete capitulation and market withdrawal when those machines came to their attention in early 1984.
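Before moving on, it is worth making concrete what "calling the BIOS" looked like from a program's point of view, since that interface is precisely what any would-be cloner had to reproduce. Here is a minimal sketch, assuming a vintage real-mode DOS compiler such as Turbo C, which exposed the machinery through the int86() function in its dos.h header; interrupt 10h is the BIOS video service, and function 0Eh within it is "teletype output," which writes a single character to the screen. Because countless applications leaned on these interrupt services directly rather than going through MS-DOS, a clone's BIOS had to answer them in exactly the same way IBM's did.

/* A minimal sketch, not from the article: invoking a BIOS service directly
   from a DOS program, assuming a vintage real-mode compiler such as Turbo C
   that provides int86() in <dos.h>. */
#include <dos.h>

static void bios_putchar(char c)
{
    union REGS regs;

    regs.h.ah = 0x0E;   /* video function 0Eh: teletype output */
    regs.h.al = c;      /* character to display */
    regs.h.bh = 0;      /* display page 0 */
    int86(0x10, &regs, &regs);   /* raise software interrupt 10h, the video BIOS */
}

int main(void)
{
    bios_putchar('O');
    bios_putchar('K');
    return 0;
}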

Long before those events, other wiser would-be cloners recognized that creating a workalike, “clean-room” version of IBM’s BIOS would be the key to executing a legal IBM clone. The IBM PC’s emphasis on modularity and future expansion meant that it was a bit more forgiving in this area than the likes of the more tightly integrated Apple II. Yet an IBM-compatible BIOS would still be a tricky business, fraught with technical and financial risk.

As the IBM PC was beginning to ship, a trio of Texas Instruments executives named Rod Canion, James Harris, and William Murto were kicking around ideas for getting out from under what they saw as a growing culture of non-innovation inside TI. Eager to start a business of their own, they considered everything from a Mexican restaurant to household gadgets like a beeper for finding lost keys. Eventually they started to ask what the people around them at TI wanted but weren’t getting in their professional lives. They soon had their answer: a usable portable computer that executives and engineers could cart around with them on the road, and that was cheap enough that their purchasing managers wouldn’t balk. Other companies had explored this realm before, most notably the short-lived Osborne Computer with the Osborne 1, but those products had fallen down badly in the usability sweepstakes; the Osborne 1, for example, had a 5-inch display screen the mere thought of which could prompt severe eye strain in those with any experience with the machine, disk drives that could store all of 91 K, and just 64 K of memory. Importantly, all of those older portables ran CP/M, until then the standard for business computing. Canion, Harris, and Murto guessed, correctly, that CP/M’s days were numbered in the wake of IBM’s adoption of MS-DOS. Not wanting to be tied to a dying operating system, they first considered making their own. Yet when they polled the big software publishers about their interest in developing for yet another new, incompatible machine, the results were not encouraging. There was only one thing for it: they must find a way to make their portable compatible with the IBM PC. If they could bring out such a machine before IBM did, the spoils could be enormous. Prominent tech venture capitalist Ben Rosen agreed, investing $2.5 million to help found Compaq Computer Corporation in February of 1982. What with solid funding and their own connections within the industry, Canion, Harris, and Murto thought they could easily design a hardware-compatible portable that was better than anything else available at the time. That just left the software side.

Given Bill Gates’s reputation as the Machiavelli of the computer industry, we perhaps shouldn’t be surprised that some journalists have credited him with anticipating the rise of PC clones from well before the release of the first IBM PC. That, however, is not the case. All indications are that Gates negotiated a deal that let Microsoft license MS-DOS to IBM rather than sell it to them outright simply in the expectation that the IBM PC would be a big success, enough so that an ongoing licensing fee would amount to far more than a lump-sum payout in the long run. Thus he was as surprised as anyone when Compaq and a few other early would-be cloners contacted him to negotiate MS-DOS license deals for their own machines. Of course, Gates being Gates, it took him all of about ten minutes to grasp the implications of what was being requested, and to start making deals that, not incidentally, actually paid considerably better than the one he’d already made with IBM.

The BIOS would be a tougher nut to crack, the beachhead on which this invasion of Big Blue’s turf would succeed or fail. Having quickly concluded that simply copying IBM’s ROMs wasn’t a wise option, Compaq hired a staff of fifteen programmers who would dedicate the months to come to creating a slavish imitation. Programmers with any familiarity at all with the IBM BIOS were known as “dirty,” and barred from working on the project. Instead of relying on IBM’s published BIOS specifications (which might very well be incorrect due to oversight or skulduggery), the team took the thirty biggest applications on the market and worked through them one at a time, analyzing each BIOS call each program made and figuring out through trial and error what response it needed to receive. The two trickiest programs, which would go on to become a sort of stress test for clone compatibility both inside and outside of Compaq, proved to be Lotus 1-2-3 and Microsoft Flight Simulator.

Before the end of the year, Compaq was previewing their new portable to press and public and working hard to set up a strong dealer network. For the latter task they indulged in a bit of headhunting, hiring away from IBM one H. L. “Sparky” Sparks, the man who had set up the IBM PC dealer network. Knowing all too well how dealers thought and what was most important to them, Sparks instituted a standard expected dealer markup of 36 percent, versus the 33 percent offered by IBM, thus giving dealers every reason to look hard at whether a Compaq might meet a customer’s needs just as well as or better than a machine from Big Blue.

Compaq’s first computer, the Portable

Savvy business realpolitik like that became a hallmark of Compaq. Previously clones had been the purview of small upstarts, often with a distinct air of the fly-by-night about them. The suburban-Houston-based Compaq, though, was different, not only from other cloners but also from the established companies of Silicon Valley. Compaq was older, more conservative, interested in changing the world only to the extent that that meant more Compaq computers on desks and in airplane luggage racks. “I don’t think you could get a 20-year-old to not try to satisfy his ego by ‘improving’ on IBM,” said J. Steven Flannigan, the man who led the BIOS reverse-engineering effort. “When you’re fat, balding, and 40, and have a lot of patents already, you don’t have to try.” That attitude was something corporate purchasing managers could understand. Indeed, Compaq bore with it quite a lot of the same sense of comforting stolidity as did IBM itself. Not quite the first to hit the market with an IBM clone with a “clean” BIOS (that honor likely belongs to Columbia Data Products, a much scruffier sort of operation that would be out of business by 1985), Compaq nevertheless legitimized the notion in the eyes of corporate America.

The worst possible 1980s airplane seatmate: a business traveler lugging along a Compaq Portable.

Yet the Compaq Portable that started shipping very early in 1983 also succeeded because it was an excellent and — Flannigan’s sentiments aside — innovative product. By coming out with their portable before IBM itself, Compaq showed that clones need not be mere slavish imitations of their inspirations distinguished only by a lower price. “Portable” in 1983 did not, mind you, mean what it does today. The Compaq Portable was bigger and heavier  — a full 28 pounds — than most desktop machines of today, something you manhandled around like a suitcase rather than slipping into a pocket or backpack. There wasn’t even a battery in the thing, meaning the businessperson on the go would likely be doing her “portable” computing only in her hotel room. Still, it was very thoughtfully designed within the technical constraints of its era; you could for instance attach it to a real monitor at your desk to enjoy color graphics in lieu of the little 9-inch monochrome screen that came built-in, a first step on the road to the ubiquitous laptop docking stations of today.

Launching fortuitously just as some manufacturing snafus and unexpected demand for the new PC/XT were making IBM’s own computers hard to secure in some places, the Compaq Portable took off like a rocket. Compaq sold 53,000 of them for $111 million in sales that first year, a record for a technology startup. IBM, suddenly in the unaccustomed position of playing catch-up, released their own portable the following year with fewer features but — and this was truly shocking — a lower price than the Compaq Portable; by forcing high-and-mighty IBM to compete on price, Compaq seemed to have somehow turned the world on its head. The IBM Portable PC was a notable commercial failure, first sign of IBM’s loosening grip on the monster they had birthed. Meanwhile Compaq launched their own head-to-head challenge that same year with the DeskPro line of desktop machines, to much greater success. Apple may have been attacking IBM in melodramatic propaganda films and declaring themselves and IBM to be locked in a battle of Good versus Evil, but IBM hardly seemed to notice the would-be Apple freedom fighters. The only company that really mattered to IBM, the only company that scared them, wasn’t sexy Apple but buttoned-down, square-jawed Compaq.

But Compaq was actually far from IBM’s only problem. Cloning just kept getting easier, for everyone. In the spring of 1984 two little companies called Award Software and Phoenix Technologies announced identical products almost simultaneously: a reverse-engineered, completely legal IBM-compatible BIOS which they would license to anyone who felt like using it to make a clone. Plenty of companies did, catapulting Award and Phoenix to the top of what was soon a booming niche industry (they would eventually resolve their rivalry the way that civilized businesspeople do it, by merging). With the one significant difficulty of cloning thus removed, making a new clone became almost a triviality, a matter of ordering up a handful of components along with MS-DOS and an off-the-shelf BIOS, slapping it all together, and shoving it out the door; the ambitious hobbyist could even do it in her home if she liked. By 1986, considerably more clones were being sold than IBMs, whose own sales were stagnant or even decreasing.

That year Intel started producing the 80386, the third generation of the line of CPUs that powered the IBM PC and its clones. IBM elected to wait a bit before making use of it, judging that the second-generation 80286, which they had incorporated into the very successful PC/AT in 1984, was still plenty powerful  for the time being. It was a bad decision, predicated on a degree of dominance which IBM no longer enjoyed. Smelling opportunity, Compaq made their own 80386-based machine, the DeskPro 386, the first to sport the hot new chip. Prior to this machine, the cloners had always been content to let IBM pave the way of such fundamental advances. The DeskPro 386 marks Compaq’s — and the clone industry’s — coming of age. No longer just floating along in the wake of IBM, tinkering with form factors, prices, and feature sets, now they were driving events. Already in November of 1985, Bill Machrone of PC Magazine had seen where this was leading: “Now that it [IBM] has created the market, the market doesn’t necessarily need IBM for the machines.” We see here business computing going through its second fundamental shift (the first being the transition from CP/M to MS-DOS). What was an ecosystem of IBM and IBM clones now became a set of sometimes less-than-ideal, sometimes accidental, but nevertheless agreed-upon standards that were bigger than IBM or anyone else. IBM, Machrone wrote, “had better conform” to the standards or face the consequences just like anyone else. Tellingly, it’s at about this time that we see the phrase “IBM clone” begin to fade, to be replaced by “MS-DOS machine” or “Intel-based machine.”

The emerging Microsoft/Intel juggernaut (note the lack of an “IBM” in there) would eventually conquer the home as well. Already by the mid-1980s certain specimens of the breed were beginning to manifest features that could make them attractive for the home user. Let’s rewind just slightly to look at the most important of them, which I’ve mentioned in a couple of earlier articles but have never really given its full due.

When the folks at Radio Shack, trying to figure out what to do with their aging, fading TRS-80 line, looked at the ill-fated IBM PCjr, they saw things well worth salvaging in its 16-color graphics chip and its three-voice sound synthesizer, both far superior to the versions found in its big brothers. Why not clone those pieces, package them into an otherwise fairly conventional PC clone, and sell the end result as the perfect all-around computer, one which could run all the critical business applications but could also play games in the style to which kids with Commodore 64s were accustomed? Thanks to the hype that had accompanied the PCjr’s launch, there were plenty of publishers out there with huge inventories of games and other software that supported the PCjr’s audiovisuals, inventories they’d be only too eager to unload on Radio Shack cheap. With those titles to prime the pump, who knew where things might go…

Launched in late 1984, the Tandy 1000 was the first IBM clone to be clearly pitched not so much at business as at the ordinary consumer. In addition to the audiovisual enhancements and very aggressive pricing, it included DeskMate, a sort of proto-GUI operating environment designed to insulate the user from the cryptic MS-DOS command prompt while giving access to six typical home applications that came built right in. A brilliant little idea all the way around, the Tandy 1000 rescued Radio Shack from the brink of computing irrelevance. It also proved a godsend for many software publishers who’d bet big on the PCjr; John Williams credits it with literally saving Sierra by providing a market for King’s Quest, a game Sierra had developed for the PCjr at horrendous expense and to underwhelming sales given that platform’s commercial failure. Indeed, the Tandy 1000 became so popular that it prompted lots of game publishers to have a second look at the heretofore dull beige world of the clones. As they jumped aboard the MS-DOS gravy train, many made sure to take advantage of the Tandy 1000’s audiovisual enhancements. Thousands of titles would eventually blurb what became known as “Tandy graphics support” on their boxes and advertisements. Having secured the business market, the Intel/Microsoft architecture’s longer, more twisting road to hegemony over home computing began in earnest with the Tandy 1000. And meanwhile poor IBM couldn’t even get proper credit for the graphics standard they’d actually invented. Sometimes you just can’t win for losing.

Another sign of the nascent but inexorably growing power of Intel/Microsoft in the home would come soon after the Tandy 1000, with the arrival of the first game to make many Apple, Atari, and Commodore owners wish that they had a Tandy 1000 or, indeed, even one of its less colorful relatives. We’ll get to that soon — no, really! — but first we have just one more detour to take.

(I was spoiled for choice on sources this time. A quick rundown of periodicals: Creative Computing of January 1983; Byte of January 1983, November 1984, and August 1985; PC Magazine of January 1987; New York Times of November 5 1982, October 26 1983, January 5 1984, February 1 1984, and February 22 1984; Fortune of February 18 1985. Computer Wars by Charles H. Ferguson and Charles R. Morris is a pretty complete book-length study of IBM’s trials and tribulations during this period. More information on the EACA clones can be found at Terry Stewart’s site. More on Compaq’s roots in Houston can be found at the Texas Historical Association. A few more invaluable links are included in the article proper.)

 
 


The Unmaking and Remaking of Sierra On-Line


What happened for Ken and Roberta Williams in less than three years would have gone to anyone’s head. As the 1980s dawned, their lives were utterly ordinary. Ken was a business programmer putting in long hours every day in Los Angeles, Roberta his pretty, quiet, vaguely dissatisfied stay-at-home wife. Six months later she was a published game designer (to the extent that description meant anything in 1980), and the couple was sitting at their kitchen table opening the mail in disbelief as orders poured in for their little homemade adventure game. A year later, Ken was head of a burgeoning software house in their dream setting, nestled in the heart of the California Redwoods, and Roberta was his star designer. A year after that, they and the company they had built were software superstars. Glossy magazines and television shows begged for access and interviews; entertainment moguls flew them to New York to wine and dine them at the 21 Club; venture capitalists lined up to offer money and advice, telling them they were at the forefront of the next big thing in media; big corporations offered to buy their whole operation, with starting offers of $20 million or more. Big franchises approached to talk about licensing deals: Jim Henson Associates, Disney, the Family Circus comic strip. For Ken, two of whose greatest heroes were Jim Henson and Walt Disney, such offers were flabbergasting. Late in 1982 IBM, by at least some measures the biggest, most powerful company in the world, humbly came knocking at the Williamses’ door to ask if they’d be willing to work with them to develop software for their new home computer.

Yes, it would have gone to anyone’s head. Ken said yes to just about everyone, with the exception only of the outright buyout offers; he was having far too much fun to entertain them. The pundits, advisers, and investors that surrounded Ken were all telling him that the new low-cost home computers were the wave of the future, destined to replace the old Atari VCS game console and its competitors in the hearts and minds of consumers. This was the new gravy train, and the key to riding it was to get lots and lots of product out there to feed customers hungry for games for their new Commodore VIC-20s, Texas Instruments 99/4As, and Coleco Adams. Don’t stress too much about any given title, they said; just get lots of them out there. Simpler games were actually better, because then you could port them more quickly from platform to platform and pack them onto cartridges for all those ultra-low-end users without even a cassette drive. Ken, with these words ringing in his ears, dutifully made plans to push out 100 separate products in 1983 alone. He amassed a fleet of programmers to churn out action games which could be easily ported from platform to platform. Sierra spelled out this new approach in their “strategy outline” for 1983:

We believe the home-computer market to be so explosive that “title saturation” is impossible. The number of new machines competing for the Apple/Atari segment in 1983 will create a perpetually new market hungry for winning 1982 titles. We will exploit this opportunity.

Mr. Cool and VIC-20 advertisements

Housing his growing fleet of salaried, workaday programmers — Ken had decided that dealing with artistically-tempered programmers like John Harris of Jawbreaker fame just wasn’t worth the trouble, that programming really shouldn’t be considered a creative endeavor at all — was soon becoming a problem. Growing technical, clerical, marketing, and warehouse staffs were also pushing the company’s total head count rapidly toward 100. Thus when the developer who owned Sierra’s office facilities offered to build a brand new building to house the company, a lovely place which perfectly suited the company’s image (if not, increasingly, its reality) as a clan of computer artisans living in the woods, Ken happily acquiesced, agreeing to a rent in the vicinity of $25,000 per month.

The Sierra "redwood" building, custom-built for them in 1982

The Sierra “redwood” building, custom-built for them in 1982

The new offices weren’t the only building contract Ken signed around this time. Figuring that if they were going to be entertainment moguls they needed to live the part, Ken and Roberta hired an architect to design a sprawling 10,000 square-foot, $800,000 house — huge money in this rural area — on the Fresno River, complete with racquetball and volleyball courts, full-length wet bar, and a mini-arcade with all the latest games.

But by the time Ken and Roberta moved in on Labor Day weekend, 1983, the fantasy of their lavish housewarming party, which included a professional comedy troupe brought in from San Francisco for the occasion, was undercut by some slowly dawning realities. Sierra’s first big partnership with Big Media, on the Dark Crystal game, had been a major artistic and commercial disappointment, done in by the tired old Hi Res Adventure engine that powered it and a rote design by a Roberta Williams who seemed determined not to grow past what she had done for Mystery House. Their one real hit of the year, meanwhile, had not been any of the titles from Ken’s new programmers, but rather John Harris’s loving, officially licensed port of the arcade game Frogger, a port done so well that some said it surpassed its inspiration. Alas, Frogger was the last game Harris did for Sierra; he had left some time before, having signed on with Synapse Software, whom he considered more quality-oriented. It was already beginning to dawn on them at that party that they might actually make less that year than they had the last, even as the new building and growing staff had increased their expenses enormously. Soon after, things really started to go south.

Much of the software that Sierra was now producing was on cartridges, which were both more expensive to produce than disks or tapes and much slower to duplicate. With much of the industry following Sierra’s plan of churning out new games practically by the dozen, production capacity at the relatively limited number of facilities capable of making cartridges was at a premium. Sierra was forced to place huge orders in June or July for the games they hoped to be selling huge numbers of come Christmas. But a funny thing happened during the six months in between: the market for the VIC-20, the TI 99/4A, and the Coleco Adam, the machines for which most of these cartridges were produced, collapsed. Jack Tramiel, you see, had won the Home Computer Wars of 1983 by then, driving TI right out of the market. In the process, he had just about killed his own VIC-20 as well; the price of the vastly more desirable and capable Commodore 64 had dropped so low that there was little point in buying a VIC-20 instead. As for the Adam… well, it never had a chance; by the time it arrived the war was largely over and the victor already determined. The Commodore 64 rocketed out of that Christmas the new center of the gaming universe, a position it would hold for the next several years. Yet all Sierra had to sell Commodore 64 owners were a few simple games ported from the VIC-20. And they had tens of thousands of cartridges, millions of dollars of inventory which they couldn’t move for ten cents on the dollar, sitting in warehouses. Meanwhile their shiny licensing deals were also turning out to be of little benefit to the bottom line. Sierra felt that they were doing all the work on these and all the profits — what little there sometimes was — were going to the licensees. As 1984 ground on, it became clear that the company was in the most dire of straits, unable even to make the rent payments on their fancy new office building.

The only thing to do was to start cutting. In a matter of days the company shed the extra skin it had built up, going from 100 employees to an absolutely essential core of about 20. A desperate Ken went to Sierra’s landlord and offered him a 10% share in the company if he would just forgive them the rent for a few months, while they got back on their feet. Figuring that 10% of a dead company was worth less than the rent he might be able to get out of them now, he said no thanks. In the end Ken was able to negotiate only to give back some of the building for other tenants. He and Roberta and their closest associates paid some of the remaining rent for a while using second mortgages and personal credit cards. It looked like this dream they had been living was about to end less than four years after it had begun, that soon they might end up right back where they had started in the suburbs of Los Angeles. They might have packed it in but for one remaining hope: that contract they had signed with IBM back in the halcyon days.

Sierra’s relationship with IBM actually went back even further than that contract. IBM first partnered with Sierra during the run-up to the original IBM PC’s launch in 1981, when they hired them to port The Wizard and the Princess, one of the biggest Apple II titles of that year, to their new machine. Sierra first experienced the legendary IBM secrecy then. Prototypes would arrive in X-ray-resistant lead chests sealed with solder, and were expected to be stored and used in windowless rooms that were to be kept locked at all times. Despite being a relatively minor part of the PC’s launch, Sierra, and Ken in particular, got on well with IBM. For all the party-hearty persona Ken could put on (as well described in Hackers and elsewhere on this blog), he had spent his previous life working for big technology companies like IBM. He understood how they worked, knew what it meant to shake down a new computer system and find the bugs and flaws while also obeying the rules of corporate hierarchy. IBM likely found him a refreshing change from both the un-technical MBAs and the technically masterful but socially unsophisticated hackers that were most of his peers. At any rate, they came back to Sierra soon after initiating the PCjr project.

IBM flew Ken and Jeff Stephenson, the man who was quickly assuming Ken’s role as hacker in chief at Sierra as Ken got more and more absorbed with the business side, out to their offices in Boca Raton, Florida. After the NDAs and other legal niceties that were part and parcel of dealing with IBM, they explained what the PCjr was to be and asked them to pitch some software that might make a good fit. Ken and Jeff made a number of proposals that were accepted, including HomeWord, an easy-to-use, casual word processor with an early graphical user interface of sorts which Ken and Jeff were already working on; it would wind up IBM’s official word processor for the PCjr. But the most important proposal, the biggest in the history of Sierra On-Line and one which would change adventure gaming forever, was made up on the fly, drawn up on the back of a napkin during a pause in the proceedings.

Sierra was still known most of all for their Hi Res Adventure line of illustrated adventure games. Unsurprisingly, IBM very much wanted something along those lines for the PCjr. But they had some specific requests for changes from Sierra’s traditional approach, which if nothing else proved that not everyone at IBM was as blissfully ignorant of gaming as legend would have it. They asked for a game that would be replayable, that would be more dynamic and complex in its world modeling, sort of like Ultima and Wizardry (adventures and CRPGs were not yet clearly defined separate genres at this point). They specifically asked that puzzles have multiple solutions, that there be many different possible paths through the game.

Ken and Jeff sensed that IBM really wanted Sierra to push themselves, to get beyond the tried-and-true Hi Res Adventure model. And with good reason: as the sales for The Dark Crystal were about to show, Sierra desperately needed to raise their game if they wanted to keep their hand in adventures at all. Next to the games that Infocom was putting out, the Hi Res Adventure games were painfully primitive. Yet how should they try to compete? Most other publishers, witnessing Infocom’s success with pure text, were beginning to shift their emphasis back to the parsers and the writing, de-emphasizing their pictures or removing them entirely. Infocom, in other words, was replacing Sierra as the model to be emulated. Ken instinctively sensed that this was not the right bandwagon for Sierra to leap aboard, much as they respected the technical accomplishment in Infocom’s games. They were movie people rather than book people; as Ken later said, Sierra had a “mass-market” sensibility which contrasted with Infocom’s “cerebral” approach. Rather than try to ape Infocom like other publishers, why not zig while everyone else zagged, double down on graphics while de-emphasizing text? Besides, one of the main selling features of the PCjr was to be its bright 16-color graphics. Shouldn’t its showcase adventure take advantage of them?

A screenshot from King’s Quest

When IBM joined them again in the conference room, Ken and Jeff made their pitch for a new type of adventure game. Most of the screen would be given over to the graphics, like in the Hi Res Adventures, but the interactivity would now also extend to this part of the display. The player’s avatar would be visible onscreen, with the player able to move him around within each room using a joystick or the arrow keys. The player would still have to type non-movement commands, but now positioning within each room would play an important role: you would have to move right up next to that old tree stump to peer inside, walk up to the kindly forest elf to talk to him, etc. Some text would still have to remain to explain some of what happened, but much of the experience would be entirely visual, more movie than book. Action sequences requiring precise timing and coordination could be introduced. The system also promised to introduce the kind of dynamism that IBM desired in other ways. Other characters and creatures could wander the world, to be dodged, fought, or befriended. What we would today call emergent behavior might arise: the player might hide behind a handy tree when the wicked witch suddenly popped onto the scene. It would be a showstopper, conforming to Ken’s ten-foot rule for software marketing while also introducing whole new tactical layers that had never been seen in adventure games before. IBM signed on happily.

The reaction in Oakhurst was not quite so enthusiastic. Some felt that Ken and Jeff had promised IBM the moon, that this was simply a leap too far. Perhaps remembering Sierra’s last two adventure games, both of which had gone through long, painful development cycles for little commercial reward, they pointedly suggested that Ken go back to IBM and explain that Sierra had bitten off more than they could chew, cut the proposal down dramatically to something more realistically achievable, and try to get IBM to accept it. Ken, realizing that any such action would destroy his credibility with IBM forever, absolutely refused. He pointed out that they had 128 K of memory to work with for this project, a huge figure in comparison to the 48 K they’d had for the Hi Res Adventure games. He found a critical ally in Roberta, the person who would have to actually design for the system. She simply asked questions until she felt she understood the system and what it would and would not be capable of, digested IBM’s desires for a more dynamic game than was her previous wont, then went to work. Eventually the grumbling mostly ceased and the rest of the staff followed her example.

What with Ken having a company to run, the heavy lifting of turning the proposal into a game engine largely fell to Jeff Stephenson. Just like the Hi Res Adventure engine, this one was designed to be reusable and extendible from the start. It was initially known as the Game Adaptation Language, or GAL. Ken, however, loathed the cutesiness of that acronym, and it was eventually renamed the Adventure Game Interpreter, or AGI. (I’ll refer to the system as AGI from here on for the sake of consistency.) Soon the trucks bearing the familiar lead-lined crates began arriving in Oakhurst again, and development began in earnest on both the engine and the game it would run. The team chosen for the task consisted of Roberta and about half a dozen programmers and artists. The PCjr projects as a whole, which included the adventure game, HomeWord, and several other pieces of software, were given a top-secret code-name: Project Siesta. Still, it’s hard to keep anything a secret in a small town like Oakhurst. Word quickly spread around town: “The big fucking company is in town again.”

Some of the process of developing the first AGI game, eventually to be named King’s Quest, was not that far removed from the days of Hi Res Adventure. The artists still drew each scene on paper using colored pencils. These drawings were then traced using a graphics tablet connected to a computer, where they were stored using the vector-graphics techniques Ken had developed back in the days of Mystery House and The Wizard and the Princess. (When playing King’s Quest on older, slower hardware you can see each new room being drawn in, line by line. Fascinatingly, what you are actually seeing there are the motions of the stylus being guided by the person who first traced the image all those years ago. Early King’s Quest versions let you see the process more clearly via an undocumented “slow draw” mode that can be activated by pressing Control-V.) Thanks to this evergreen technique, the image of each room occupies only .5 to 2.5 K. The same data also tells the interpreter where Sir Grahame, the game’s protagonist, can and cannot walk. Boundaries, such as the castle walls you see in the screenshot above, were traced with a special flag activated and incorporated right into the image itself.
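To make the storage scheme a little more concrete, here is a toy sketch in Python of the general idea: a room picture kept not as a bitmap but as a list of drawing commands that get replayed whenever the room is loaded. The command names and format below are inventions for illustration only; the real Sierra picture resources used their own compact binary encoding.

# A toy sketch of a room picture stored as replayable vector commands.
# The command format is invented for illustration, not taken from Sierra's files.
room_picture = [
    ("color", 2),                    # pick a pen color
    ("line", (0, 100), (50, 80)),    # one traced line segment...
    ("line", (50, 80), (120, 80)),   # ...continuing the outline
    ("fill", (60, 90)),              # flood-fill an enclosed region
]

def replay_picture(commands, set_color, draw_line, flood_fill):
    # Replaying the commands in order effectively re-traces the original
    # stylus motions every time the room is loaded.
    for command, *args in commands:
        if command == "color":
            set_color(args[0])
        elif command == "line":
            draw_line(args[0], args[1])
        elif command == "fill":
            flood_fill(args[0])

# Replay against trivial stand-in drawing functions just to show the flow.
replay_picture(room_picture,
               set_color=lambda c: print("color", c),
               draw_line=lambda a, b: print("line", a, b),
               flood_fill=lambda p: print("fill", p))

A handful of such commands, rather than thousands of individual pixels, is what keeps each room's image down to a couple of kilobytes.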

Perhaps the trickiest problem that Jeff Stephenson had to wrestle with stems from the fact that we view each room in the game from a parallel perspective. Thus the game needs to account for the z-axis in addition to the x- and y-axes to maintain the illusion of depth. Each object in each room is therefore given something Jeff called its “priority,” essentially its position on the z-axis. An object’s priority can range from 1 to 15, and increases as it gets closer to the “back” of the room. In drawing a scene, the interpreter draws objects of lower priority after those of higher priority. Say that a tree is positioned on the screen at priority 9. If Grahame moves vertically, “deeper” into the screen, to, say, priority 11, then moves horizontally “behind” the tree, the tree will conceal him as expected. Up to four moving characters can be in a single room, the interpreter constantly adjusting the onscreen image to account for their movements.
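As a rough illustration of the scheme just described — a sketch of the idea only, not of the actual AGI renderer — a painter's-algorithm pass over a room's objects might look like this in Python: the "deepest" objects, with the highest priority numbers, get drawn first, and nearer objects are painted over them.

# A painter's-algorithm sketch of the priority idea as described above.
from dataclasses import dataclass

@dataclass
class RoomObject:
    name: str
    priority: int    # per the description above: higher number = deeper into the room

def redraw_room(objects):
    # Draw the farthest objects first and the nearest last, so that
    # near objects conceal far ones, as with Grahame behind the tree.
    for obj in sorted(objects, key=lambda o: o.priority, reverse=True):
        print(f"drawing {obj.name} (priority {obj.priority})")

tree = RoomObject("tree", priority=9)
grahame = RoomObject("Grahame", priority=11)   # "deeper" into the screen than the tree
redraw_room([tree, grahame])                   # Grahame drawn first, the tree painted over him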

A screenshot from King’s Quest

The game logic is described using a simple scripting language which is once again descended from the system Ken had developed for the Hi Res Adventure line. Let’s take a look at one small piece of the scene shown above. In addition to our alter ego Grahame, it shows a goat — “object” number 14 — who wanders back and forth in his corral, which in turn spans two rooms, numbers 10 and 11; the room shown above, the leftmost, is room 10. The goat continues to wander unless and until he is tempted to join Grahame by a scrumptious-looking carrot. Here’s how the goat’s logic in room 10 is described in AGI:

IF HAS-GOAT 0 AND OBJHIT-EDGE 14 AND EDGE-OBJ-HIT 1 AND GOAT-GONE 0 AND SHOW-CARROT 0 THEN ASSIGN GOAT-ROOM 11, ERASE 10

So, and without getting too lost in the weeds here, if we do not “have” the goat and are not showing him the carrot, and the goat has hit the edge of the screen in his wanderings, remove (“erase”) him from room 10 and put him in room 11. Room 10 alone has 180 such lines of script to describe all of its interactive possibilities. Like most software, an AGI game is more complex than it looks. This is true from the standpoint of both the engine programmers and the scripters. In the context of its time, AGI is nothing less than a stunning technological tour de force — one which, like all the best software, looks easy.
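For readers who want a feel for what the interpreter is doing with a rule like that, here is a hypothetical Python sketch of the same goat logic. The flag names and dictionary-based state are borrowed from the line above purely for illustration; the real AGI engine compiled its scripts into a compact bytecode rather than interpreting text like this.

# A hypothetical sketch of evaluating the goat rule shown above.
flags = {"HAS-GOAT": 0, "GOAT-GONE": 0, "SHOW-CARROT": 0,
         "OBJHIT-EDGE": 14, "EDGE-OBJ-HIT": 1}
variables = {"GOAT-ROOM": 10}
onscreen_objects = {14}           # the goat, object 14, is currently visible

def run_goat_rule():
    # IF HAS-GOAT 0 AND OBJHIT-EDGE 14 AND EDGE-OBJ-HIT 1
    #    AND GOAT-GONE 0 AND SHOW-CARROT 0 ...
    if (flags["HAS-GOAT"] == 0 and flags["OBJHIT-EDGE"] == 14
            and flags["EDGE-OBJ-HIT"] == 1 and flags["GOAT-GONE"] == 0
            and flags["SHOW-CARROT"] == 0):
        # ... THEN ASSIGN GOAT-ROOM 11, ERASE: the goat now "lives" in
        # room 11 and is removed from the current screen.
        variables["GOAT-ROOM"] = 11
        onscreen_objects.discard(14)

run_goat_rule()
print(variables, onscreen_objects)

Multiply that by the 180 lines of script in room 10 alone, run on every cycle of the interpreter, and the scale of the undertaking starts to come into focus.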

The technical virtuosity on display here made it rather easy for reviewers of the time to lose sight of the actual game it enabled, a painfully common phenomenon in the field of videogames. Indeed, I was anticipating reviewing King’s Quest more as a piece of technology than as an adventure game, particularly given that I frankly don’t think very highly of Roberta’s work on the Hi Res Adventure line. I was, however, pleasantly surprised by her work here. King’s Quest’s plot is almost as basic as that of the original Adventure: the kingdom of Daventry is in some sort of vaguely defined trouble, and the aging King Edward needs you, the brave knight Sir Grahame, to find three magic items that can save it. Since he conveniently has no heirs, do that and “the throne will be yours.” King’s Quest is another treasure hunt, nothing more or less.

Still, and making allowances for the newness of the technology, Roberta does a pretty good job with it. Many of the characters and situations you encounter as you roam Daventry are drawn, and not without a certain charm, from classic fairy tales: Hansel and Gretel, Jack and the Beanstalk, Rumpelstiltskin. The latter is at the core of the one howlingly awful puzzle in the game, which starts out dodgy and just keeps layering on the complications until it’s well-nigh impossible.


(For the record: you meet an old gnome-looking sort of fellow who gives you three chances to guess his name. If you’re familiar with your Brothers Grimm, you might divine that he’s Rumpelstiltskin given the fairy-tale characters everywhere else in the game. But, no, “That is very close but not quite right.” Okay, you do have a note you found elsewhere which says, “Sometimes it is wise to think backwards.” So, “nikstlitslepmur.” No — “You have the right idea, but your thinking is just a little bit off.” It turns out you have to write the name using a backwards alphabet.)
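For the curious, the “backwards alphabet” the game has in mind is simply the letter-for-letter mirror of the ordinary one (a for z, b for y, and so on), what cryptographers call an Atbash substitution. A few lines of Python suffice to apply it, assuming nothing fancier is going on than that straight swap; this is purely an illustration, not code from the game.

# Apply the "backwards alphabet" (Atbash-style substitution) to a name.
import string

FORWARD = string.ascii_lowercase
BACKWARD = FORWARD[::-1]
MIRROR = str.maketrans(FORWARD, BACKWARD)

def backwards_name(name):
    # Swap each letter for its mirror-image counterpart in the alphabet.
    return name.lower().translate(MIRROR)

print(backwards_name("rumpelstiltskin"))   # the spelling the gnome is fishing for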

But even here the IBM design brief saves Roberta from her worst instincts. There is, thank God, an alternate way to proceed without solving this puzzle, even if it does cost you some points. And most of the other puzzles are… not that bad, actually. Some are even pretty clever. That may sound like damning with faint praise, but given some of the absurdities of Time Zone it’s nevertheless praise indeed. There are a huge number of ways to go through King’s Quest, what with all of the alternate solutions on offer, and the game feels consciously designed in a holistic sense in a way that no previous Roberta Williams game did.

King’s Quest also makes use of most of the new possibilities afforded by the AGI system. There are enemies to be dodged and eventually dispatched — the witch out of Hansel and Gretel is particularly harrowing — and tricky action sequences to be navigated. King’s Quest is mostly a competent, enjoyable game even when divorced from its place in history as the first use of the revolutionary technology that powers it. It’s also reasonably solvable, at least if you aren’t too fixated on getting the maximum possible points. Realistically, it needed to be no more than a technological proof of concept to be a bestseller, but it manages to be considerably more than that. It acquits itself very well overall as the herald of a new paradigm for adventure gaming.

As development continued and Sierra’s financial position began to look more precarious, stress began to mount. Ken’s wish to just find average and uncreative but reliable programmers was perhaps amplified more than ever by some of the characters he ended up having to assign to the King’s Quest project. Whether because of its location near the old hippie meccas of northern California or just something in the water, Sierra always seemed to be filled with eccentrics despite Ken’s best efforts to run a more buttoned-down operation. One fellow was particularly noted for his acid consumption and his fascination with Fozzie Bear, and looked freakish enough that (in John Williams’s words) “when he went into a restaurant, everyone looked at him.” Another, similarly “off” programmer acted like a cross between a mad scientist and Zaphod Beeblebrox of Hitchhiker’s Guide to the Galaxy fame. Near the end of the project one developer, angry at the long hours he had been working, held a critical piece of code for ransom until Sierra paid him for dozens of hours of overtime to which he felt entitled. They agreed to pay, got the code, and promptly reneged, citing a principle of American contract law which holds that a contract is null and void if one of the parties signs under duress.

When IBM officially unveiled the PCjr and its horrid Chiclet keyboard in November of 1983, Sierra was as surprised as anyone. For all their involvement in the machine’s development, they never had access to a real production model. Ken had to go down to ComputerLand and buy his own, just like everyone else, when the PCjr shipped at last in March of 1984. His first machine didn’t remain in his possession for very long. He went to the movies on his way home, leaving it in his car, only to find it stolen when he returned. That must have seemed like a bad omen coming to fruition as it became more and more clear that the PCjr, seemingly Sierra’s last hope, was flopping in the marketplace.


And that must have seemed a double shame because — and I know I may seem to be belaboring the point, but I can hardly emphasize this enough — King’s Quest was amazing in its time. Even magazines devoted to other platforms felt compelled to talk about it; it was just that revolutionary. King’s Quest, marketed under IBM’s official imprint with cover art apparently drawn by someone who had never seen the game, did sell pretty well by the standards of IBM PCjr software, but there just weren’t enough PCjrs being sold to save Sierra. Similarly, a version for the PCjr’s bigger brother, the IBM PC, sold well by the standards of games for that platform, but the entertainment market for such a business-computing stalwart wasn’t up to much. Although the AGI system had been designed to be portable, it had also been designed to run in 128 K of memory. This locked it out of the typical unexpanded Apple IIe (64 K) and the biggest gaming platform in the country, the Commodore 64. Sierra had exactly the right game on exactly the wrong platform. It seemed Ken had backed the wrong horse, a final bad decision that looked to have doomed his company. The situation just got more and more desperate. John Williams, Sierra’s marketing director, recalls placing media buys around this time with no idea how he was going to pay for them when the invoices came due: “This is either going to help, in which case we can deal with the cost of these and maybe negotiate payment on it — or it won’t work and we’ll be gone” anyway.

Two new machines played a big role in saving Sierra. Just a month after the PCjr finally shipped to stores, Apple announced and shipped the fourth incarnation of the Apple II, the IIc. We’ll talk a bit more about it in a future article, but for now suffice to say that the IIc was designed to be a semi-portable, closed appliance computer, in contrast to the hacker’s laboratories that had been previous Apple II models. Most critically for our purposes today, the IIc shipped with 128 K of memory. Its commercial performance would ultimately be rather lukewarm, but it did prompt many users of the older IIe model to upgrade to (at least) 128 K to match its capabilities. In time there were enough 128 K Apple IIs to justify porting the AGI interpreter to the platform.

But it was the Tandy 1000 that really saved Sierra in the most immediate sense, buying time for that critical mass of 128 K Apple IIs to accumulate. It was introduced just as 1984, the most difficult year in Sierra’s history, was winding down. In many ways it was what the PCjr should have been, with the same graphics and sound capabilities and IBM PC compatibility in a smarter, more usable and expandable package. And it was sold in Radio Shack stores all over the country. In some areas the local Radio Shack was the only place within 200 miles to buy a computer. Sierra smartly developed a strong relationship with Radio Shack in the wake of the Tandy 1000’s announcement. Few other software publishers bothered, meaning that King’s Quest and other Sierra games stood almost alone on the shelves in many of these captive markets. The Tandy 1000, combined with the slowly increasing user base of expanded Apple IIs, gave King’s Quest the opportunity to slowly pull Sierra back from the edge of the abyss, particularly since much of the game’s $850,000 development cost had been funded by IBM. It would take time, but by the end of 1985, with King’s Quest II now already out and doing very well, the company was paying off debts and beginning to grow again.

Ken, Roberta, John, Jeff, and their closest associates had, much to their credit, stuck to their guns and not made the perfectly reasonable decision to pack it in. But they had also, as Ken well realized, gotten very, very lucky. Without the Tandy 1000 and a few other lucky breaks, Sierra could easily have gone the way of Adventure International, Muse, and other big software houses who were flying high in 1982 and dead in 1985. As he recently said, it had all been “fun and games” for the first few years. Now he understood how quickly things could go bad with a few wrong decisions, understood what a fragile entity Sierra really was. Most of all, he never wanted to go through another year like 1984 again. The Ken Williams that emerged from that period was, like his company, changed. From then on he would do a remarkable job of balancing ambition with caution. This capacity to change and learn from his mistakes, much rarer than it seems it ought to be, was perhaps ultimately the most important quality he brought to Sierra. He reoriented his company to stop chasing fast bucks and to focus on a smaller number of quality titles for a modest number of proven platforms, and accumulated a stable of designers, programmers, and artists whom he treated with respect. They in turn did good, occasionally great work for him. Sierra Mark II, leaner, humbler, and wiser, was off and running.

(My huge thanks once again go to John Williams for contributing so many of his memories to this article. Hackers by Steven Levy was also invaluable, for what I believe will be the last time, as we’ve now moved beyond the period it covers. An article in the February 1985 Compute! breaks down the AGI system in unusual detail for a contemporary source. If you want to know more about its technical side, it’s been documented in exhaustive detail since. If you’d just like to play King’s Quest, it’s available in a pack with King’s Quest II and III at Good Old Games.)

 
