The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while pushing its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would now offer 16-bit resolution — i.e., the same quality as an audio CD.
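The planar-versus-chunky distinction is worth a moment's pause, since it explains why the original Amiga architecture struggled with 3D. A minimal sketch in Python (purely illustrative; it models only the pixel layouts, not any actual Amiga chip registers):

```python
# Illustrative sketch (not Amiga-specific code): the difference between
# "chunky" and "planar" pixel storage for an 8-color (3-bit) display.
#
# Chunky: each pixel's color index is stored whole, one after another.
# Planar: bit 0 of every pixel lives in plane 0, bit 1 in plane 1, etc.

def chunky_to_planar(pixels, depth):
    """Split a row of chunky pixel values into `depth` bitplane rows."""
    return [[(p >> plane) & 1 for p in pixels] for plane in range(depth)]

def planar_to_chunky(planes):
    """Reassemble chunky pixel values from bitplane rows."""
    width = len(planes[0])
    pixels = []
    for x in range(width):
        value = 0
        for plane, row in enumerate(planes):
            value |= row[x] << plane
        pixels.append(value)
    return pixels

row = [0, 7, 3, 5]                      # four pixels, 3 bits of color each
planes = chunky_to_planar(row, 3)
assert planes == [[0, 1, 1, 1],         # plane 0: bit 0 of each pixel
                  [0, 1, 1, 0],         # plane 1: bit 1 of each pixel
                  [0, 1, 0, 1]]         # plane 2: bit 2 of each pixel
assert planar_to_chunky(planes) == row
```

Note that writing a single pixel in planar mode means touching every plane, whereas in chunky mode it is one memory write; 3D texture-mapping, which updates individual pixels constantly, therefore strongly favors chunky layouts.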

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs — but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly behind schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed to the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; what games were still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D). But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly en vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beat the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh those of Commodore.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals as well asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992 Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which were stamped with characters only in some continental European markets that needed them. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.
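The practical difference between the two scheduling models can be made concrete with a toy cooperative scheduler, here sketched in Python with generators standing in for tasks. This is purely illustrative and models no real operating system's API; it simply shows that under cooperative multitasking, the scheduler regains control only when a task volunteers to yield it:

```python
# A toy cooperative scheduler: each "task" is a generator that runs
# until it yields (voluntarily ceding the CPU) or finishes.  Names and
# structure are illustrative, not any real OS's scheduling API.

def polite_task(name, log):
    """A well-behaved task: does one unit of work, then yields."""
    for i in range(3):
        log.append(f"{name} step {i}")
        yield

def scheduler(tasks, max_switches=50):
    """Round-robin: run each task until its next yield, then re-queue it."""
    queue = list(tasks)
    switches = 0
    while queue and switches < max_switches:
        task = queue.pop(0)
        try:
            next(task)             # run until the task's next yield
            queue.append(task)     # give it another turn later
        except StopIteration:
            pass                   # task finished; drop it
        switches += 1

log = []
scheduler([polite_task("A", log), polite_task("B", log)])
# Both tasks yield cooperatively, so their steps interleave:
# A step 0, B step 0, A step 1, B step 1, A step 2, B step 2
```

A task that simply never yields would monopolize this scheduler forever, which is exactly the weakness of cooperative multitasking. Under a preemptive scheme like AmigaOS's, a timer interrupt forces the context switch, so no single application can starve the rest, however badly behaved.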

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped fewer than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their résumés, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like these, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. This proved, ironically, to be a more profitable endeavor than actually selling computers or game consoles. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks, who saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that, every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its final issue until the rather astonishingly late date of November 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s ridiculously exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses on the one front by raising his personal salary on the other.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and he stopped throwing good money after bad as soon as Commodore ran out of collateral to offer him in exchange. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died of the kidney disease he had been battling for most of his life on June 20, 1994, at age 62. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes It Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)


Alone in the Dark

Most videogame stories are power fantasies. You spend your time getting ever stronger, ever tougher, ever more formidable as you accumulate experience points, gold, and equipment. Obstacles aren’t things to go around; they’re things you go through. If you can’t get past any given monster, the solution is to go kill some other monsters, then come back when you’re yet more powerful and slay the big beast at last. Life, these games tell us, is or ought to be one unadulterated ride up the escalator of success; a setback just means you haven’t yet risen high enough.

That dynamic held true in 1992 just as much as it usually does today. But during that year there came a well-nigh revolutionary game out of France that upended all of these traditional notions about what the medium of videogames can do and be. It cast you as a painfully ordinary, near-powerless individual adrift in a scary world, with no surefire panaceas in the form of experience points, gold, or portable rocket launchers to look forward to. It was just you and your wits, trapped in a haunted house full of creatures that were stronger than you and badly wanted to kill you. Despite its supernatural elements, this game’s scenario felt more disconcertingly close to real life than that of any of those other games. Here, you truly were alone in the dark. Aren’t we all from time to time?


Any story of how this shockingly innovative game came to be must begin with that of Frédérick Raynal, its mastermind. Born in the south-central French town of Brive-la-Gaillarde in 1966, Raynal was part of the first generation of European youths to have access to personal computers. In fact, right from the time his father first came home with a Sinclair ZX81, he was obsessed with them. He was also lucky: in a dream scenario for any budding hacker, his almost equally obsessed father soon added computers to the product line of the little videocassette-rental shop he owned, thus giving his son access to a wide variety of hardware. Raynal worked at the store during the day, renting out movies and watching them to kill time — he was a particular fan of horror movies, a fact which would soon have a direct impact on his career — and helping customers with their computer problems. Then, with a nerdy young man’s total obliviousness to proportion, he hacked away most of the night on one or another of the machines he brought home with him. He programmed his very first released game, a platformer called Robix, in 1986 on an obscure home-grown French computer called the Exelvision which his father sold at the store. His father agreed to sell his son’s Exelvision game there as well, managing to shift about 80 units to customers desperate for software for the short-lived machine.

Raynal’s lifestyle was becoming so unbalanced that his family was beginning to worry about him. One day, he ran out of his room in a panic, telling them that all of the color had bled out of his vision. His mother bustled him off to an ophthalmologist, who told him he appeared to have disrupted the photoreceptors in his eyes by staring so long at a monitor screen. Thankfully, the condition persisted only a few hours. But then there came a day when he suddenly couldn’t understand anything that was said to him; he had apparently become so attuned to the language of computer code that he could no longer communicate with humans. That worrisome condition lasted several weeks.

Thus just about everyone around him took it as a good thing on the whole when he was called up for military service in 1988. Just before leaving, Raynal released his second game, this time for MS-DOS machines. Not knowing what else to do with it, he simply posted it online for free. Popcorn was a Breakout clone with many added bells and whistles, the latest entry in a sub-genre which was enjoying new popularity following the recent success of the Taito arcade game Arkanoid and its many ports to home computers and consoles. Raynal’s game could hold its head high in a crowded field, especially given its non-existent price tag. One magazine pronounced it one of the five best arcade games available for MS-DOS, whether commercial or free, and awarded it 21 points on a scale of 20.

Raynal was soon receiving letters at his military posting from all over the world. “Popcorn has made my life hell!” complained one player good-naturedly. Another wrote that “I caught acute Popcornitus. And, it being contagious, now my wife has it as well.” When Raynal completed his service in the summer of 1989, his reputation as the creator of Popcorn preceded him. Most of the companies in the French games industry were eager to offer him a job. His days working at his father’s computer store, it seemed, were behind him. The Lyon-based Infogrames, the most prominent French publisher of all, won the Raynal sweepstakes largely by virtue of its proximity to his hometown.

Yet Raynal quickly realized that the company he had elected to join was in a rather perilous state. An ambitious expansion into many European markets hadn’t paid off; in fact, it had very nearly bankrupted them. Bruno Bonnell, Infogrames’s co-founder and current chief executive, had almost sold the company to the American publisher Epyx, but that deal had fallen through as soon as the latter had gotten their first good look at the state of his books. It seemed that Infogrames would have to dig themselves out of the hole they’d made. Thus Bonnell had slashed costs and shed subsidiaries ruthlessly just to stay alive. Now, having staunched the worst of the bleeding, he knew that he needed as many talented programmers as he could get in order to rebuild his company — especially programmers like Raynal, who weren’t terribly assertive and were naive enough to work cheap. So, Raynal was hired as a programmer of ports, an unglamorous job but an absolutely essential one in a European market that had not yet consolidated around a single computer platform.

Bonnell, for his part, was the polar opposite of the shy computer obsessive he had just hired; he had a huge personality which put its stamp on every aspect of life at Infogrames. He believed his creativity to be the equal of anyone who worked for him, and wasn’t shy about tossing his staff ideas for games. He called one of them, which he first proposed when Raynal had been on the job for about a year, In the Dark. A typically high-concept French idea, its title was meant to be taken literally. The player would wander through a pitch-dark environment, striking the occasional match from her limited supply, but otherwise relying entirely on sound cues for navigation. Bonnell and Raynal were far from bosom buddies, then or ever, but this idea struck a chord with the young programmer.

As Raynal saw it, the question that would make or break the idea was that of how to represent a contiguous environment with enough verisimilitude to give the player an embodied sense of really being there in the dark. Clearly, a conventional adventure-game presentation, with its pixel graphics and static views, wouldn’t do. Only one approach could get the job done: 3D polygonal graphics. Not coincidentally, 3D was much on Raynal’s mind when he took up Bonnell’s idea; he’d been spending his days of late porting an abstract 3D puzzle game known as Continuum from the Atari ST to MS-DOS.

I’ve had occasion to discuss the advantages and disadvantages of this burgeoning new approach to game-making in previous articles, so I won’t rehash that material here. Suffice to say that the interest so many European programmers had in 3D reflected not least a disparity in the computing resources available to them in comparison to their American counterparts. American companies in this period were employing larger and larger teams, who were filling handfuls of floppy disks — and soon CD-ROMs — with beautiful hand-drawn art and even digitized snippets of real-world video. European companies had nothing like the resources to compete with the Americans on those terms. But procedurally-generated 3D graphics offered a viable alternative. At this stage in the evolution of computer technology, they couldn’t possibly be as impressively photorealistic as hand-drawn pixel art or full-motion video, but they could offer far more flexible, interactive, immersive environments, with — especially when paired with a French eye for aesthetics — a certain more abstracted allure of their own.

This, then, was the road Raynal now started down. It was a tall order for a single programmer. Not only was he trying to create a functional 3D engine from scratch, but the realities of the European market demanded that he make it run on an 80286-class machine, hardware the Americans by now saw as outdated. Even Bonnell seemed to have no confidence in Raynal’s ability to bring his brainstorm to fruition. He allowed Raynal to work on it only on nights and weekends, demanding that he spend his days porting SimCity to the Commodore CDTV.

An artist named Didier Chanfray was the closest thing to a partner and confidant which Raynal had at Infogrames during his first year of working on the engine. It was Chanfray who provided the rudimentary graphics used to test it. And it was also Chanfray who, in September of 1991, saw the full engine in action for the first time. A character roamed freely around a room under the control of Raynal, able to turn about and bend his body and limbs at least semi-realistically. The scene could be viewed from several angles, and it could be lit — or not — by whatever light sources Raynal elected to place in the room. Even shadows appeared; that of the character rippled eerily over the furniture in the room as he moved from place to place. Chanfray had never seen anything like it. He fairly danced around Raynal’s desk, pronouncing it a miracle, magic, alchemy.

In the meantime, Bruno Bonnell had negotiated and signed a new licensing deal — not exactly a blockbuster, but something commensurate with a rebuilding Infogrames’s circumstances.


Something tentacled and other-worldly, it seems, got into the water at Infogrames from the start: Didier Chanfray provided this very Lovecraftian concept drawing for Raynal’s game long before the conscious decision was made to turn it into a Lovecraft pastiche. Raynal kept the sketch tacked on the wall beside his desk throughout the project as a reminder of the atmosphere he was going for.

The American horror writer H.P. Lovecraft, who died in 1937, well before the advent of the computer age, was nowhere near as well-known in 1991 as he is today, but his so-called “Cthulhu Mythos” of extra-dimensional alien beings, terrifying by virtue of their sheer indifference to humanity and its petty morality, had already made appearances in games. The very first work of ludic Lovecraftia would appear to be the 1979 computer game Kadath, an odd sort of parser-less text adventure. Two years later, at the height of the American tabletop-RPG craze, a small company called Chaosium published Call of Cthulhu, a game which subverted the power fantasy of tabletop Dungeons & Dragons in much the same way that Raynal’s project would soon be subverting that of so many computer games. Still, although Call of Cthulhu was well-supported by Chaosium and remained reasonably popular by the standards of its niche industry throughout the 1980s and beyond, its success didn’t lead to any Lovecraftian onslaught in the realm of digital games. The most notable early example of the breed is Infocom’s very effective 1987 interactive fiction The Lurking Horror. But, being all text at a time when text adventures were becoming hard sells, it didn’t make much commercial impact.

Now, though, Bonnell believed the time had come for a more up-to-date Lovecraftian computer game; he believed such a thing could do well, both in France and elsewhere.

Lovecraft had long had a strong following in France. From the moment his books were first translated into the language in 1954, they had sold in considerable numbers. Indeed, in 1991 H.P. Lovecraft was about as popular in France as he was anywhere — arguably more popular on a per-capita basis than in his native land. The game of Call of Cthulhu too had long since been translated into French, giving a potential digital implementation of it as much natural appeal there as in its homeland. So, Bonnell approached Chaosium about licensing their Call of Cthulhu rules for computers, and the American company agreed.

When viewed retrospectively, it seems a confusing deal to have made, one that really wasn’t necessary for what Infogrames would ultimately choose to do with Lovecraft. When Lovecraft died in obscurity and poverty, he left his literary estate in such a shambles that no one has ever definitively sorted out its confusing tangle of copyright claimants; his writing has been for all intents and purposes in the public domain ever since his death, despite numerous parties making claims to the contrary. Prior to publishing their Lovecraft tabletop RPG, Chaosium had nevertheless negotiated a deal with Arkham House, the publisher that has long been the most strident of Lovecraft’s copyright claimants. With that deal secured, Chaosium had promptly trademarked certain catchphrases, including “Call of Cthulhu” itself, in the context of games. Yet as it turned out Infogrames would use none of them; nor would they draw any plots directly from any of Lovecraft’s published stories. Like the countless makers of Lovecraftian games and stories that would follow them, they would instead draw from the author’s spirit and style of horror, whilst including just a few of his more indelible props, such as the forbidden book of occult lore known as the Necronomicon.

The first Lovecraftian game Infogrames would make would, of course, be the very game that Frédérick Raynal had now spent the last year or so prototyping during his free time. By the time news of his work reached Bonnell, most of Infogrames’s staff were already talking about it like the second coming. While the idea that had inspired it had been wonderfully innovative, it seemed absurd even to the original source of said idea to devote the best 3D engine anyone had ever seen to a game that literally wouldn’t let you see what it could do most of the time. It made perfect sense, on the other hand, to apply its creepy visual aesthetic to the Lovecraft license. The sense of dread and near-powerlessness that was so consciously designed into the tabletop RPG seemed a natural space for the computer game as well to occupy. It was true that it would have to be Call of Cthulhu in concept only: the kinetic, embodied, real-time engine Raynal had created wasn’t suitable for the turn-based rules of the tabletop RPG. For that matter, Raynal didn’t even like the Chaosium game all that much; he considered it too complicated to be fun.

Still, Bonnell, who couldn’t fail to recognize the potential of Raynal’s project, put whatever resources he could spare from his still-rebuilding company at the mild-mannered programmer’s disposal: four more artists to join Chanfray, a sound designer, a second programmer and project manager. When the team’s first attempts at writing an authentic-feeling Lovecraftian scenario proved hopelessly inadequate, Bonnell hired for the task Hubert Chardot, a screenwriter from 20th Century Fox’s French division, a fellow who loved Lovecraft so much that he had turned his first trip to the United States into a tour of his dead hero’s New England haunts. One of Chardot’s first suggestions was to add the word “alone” to the title of the game. He pointed out, correctly, that it would convey the sense of existential loneliness that was such an integral part of Lovecraftian horror — even, one might say, the very thing that sets it apart from more conventional takes on horror.

You can choose to enter the mansion as either of two characters.

The game takes place in the 1920s, the era of Lovecraft himself and of most of his stories (and thus the default era as well for Chaosium’s Call of Cthulhu game). It begins as you arrive in the deserted Louisiana mansion known as Derceto, whose owner Jeremy Hartwood has recently hanged himself. You play either as Edward Carnby, a relic hunter on the trail of a valuable piano owned by the deceased, or as Emily Hartwood, the deceased’s niece, eager to clear up the strange rumors that have dogged her uncle’s reputation and to figure out what really went down on his final night of life. The direction in which the investigation leads you will surprise no one familiar with Lovecraft’s oeuvre or Chaosium’s RPG: occult practices, forbidden books, “things man was never meant to know,” etc. But, even as Chardot’s script treads over this ground that was well-worn already in the early 1990s, it does so with considerable flair, slowly revealing its horrifying backstory via the books and journals you find hidden about the mansion as you explore. (There is no in-game dialog and no real foreground story whatsoever, only monsters and traps to defeat or avoid.) Like most ludic adaptations of Lovecraft, the game differs markedly from its source material only in that there is a victory state; the protagonist isn’t absolutely guaranteed to die or become a gibbering lunatic at the end.

One of the in-game journals, which nails the spirit and style of Lovecraft perfectly. As I noted in an earlier article about the writer, the emotion he does better than any other is disgust.

Yet Chaosium wasn’t at all pleased when Infogrames sent them an early build of the game for their stamp of approval. It seems that the American company had believed they were licensing not just their trademarks to their French colleagues, nor even the idea of a Lovecraft game in the abstract, but rather the actual Call of Cthulhu rules, which they had expected to see faithfully implemented. And, indeed, this may have been Bonnell’s intention when he was making the deal — until Raynal’s 3D engine had changed everything. Chaosium, who had evidently been looking forward to an equivalent of sorts to the Gold Box line of licensed Dungeons & Dragons CRPGs, felt betrayed. After some tense negotiation, they agreed to let Alone in the Dark continue without the Call of Cthulhu name on the box; some editions would include a note saying the game had been “inspired by the works of H.P. Lovecraft,” while others wouldn’t even go that far. In return for Chaosium’s largess on this front, Infogrames agreed to make a more conventional adventure game that would make explicit use of the Call of Cthulhu trademarks.

Call of Cthulhu: Shadow of the Comet, the fruit of that negotiation, would prove a serviceable game, albeit one that still didn’t make much direct use of the tabletop rules. But, whatever its merits, it would come and go without leaving much of a mark on an industry filled to bursting with graphical adventures much like it in terms of implementation. Alone in the Dark, on the other hand, would soon be taking the world by storm — and Chaosium could have had their name on it, a form of advertisement which could hardly have failed to increase their commercial profile dramatically. Chalk it up as just one more poor decision in the life of a company that had a strange talent for surviving — Chaosium is still around to this day — without ever quite managing to become really successful.

Infogrames got their first preview of just what an impact Alone in the Dark was poised to make in the spring of 1992, when Dany Boolauck, a journalist from the French videogame magazine Tilt, arrived to write a rather typical industry puff piece, a set of capsule previews of some of the company’s current works-in-progress. He never got any further than Alone in the Dark. After just a few minutes with it, he declared it “the best game of the last five years!” and asked for permission to turn the capsule blurb about it into a feature-length article, complete with a fawning interview with Raynal. (He described him in thoroughly overwrought terms: as a reincarnation of The Little Prince from Antoine de Saint-Exupéry’s beloved novella of the same name.) In a “review” published in the summer of 1992, still a couple of months before Infogrames anticipated releasing the game, he gave it 19 of 20 stars, gushing over its “exceptional staging” and “almost perfect character movement,” calling it “a revolution in the field of play” that “people must buy!”

Bruno Bonnell was pleased with the positive press coverage, but less thrilled by Boolauck’s portrayal of Raynal as the game’s genius auteur. He called in his introverted young programmer, who seemed a bit befuddled by all the attention, and told him to scrub the words “a Frédérick Raynal creation” from the end credits. Alone in the Dark, he said, was an Infogrames creation, full stop. Raynal agreed, but a grievance began to fester in his heart.

Thanks to Bonnell’s policy of not advertising the individuals behind Infogrames’s games, Raynal’s name didn’t spread quite so far and wide as that of such other celebrated gaming auteurs as Éric Chahi, the mastermind of Another World, France’s standout game from the previous year. Nevertheless, upon its European release in September of 1992, Raynal’s game stood out on its own terms as something special — as an artistic creation that was not just fun or scary but important to its medium. As one would expect, the buzz started in France. “We review many games,” wrote one magazine there. “Some are terrible, some mediocre, some excellent. And occasionally there comes along the game that will revolutionize the world of microcomputers, one that causes sleepless nights, one that you cannot tear yourself away from, can only marvel at. We bid welcome now to the latest member of this exclusive club: Alone in the Dark.” By the end of 1992, the game was a hit not only in France but across most of Europe. Now for America.

Bonnell closed a deal with the American publisher Interplay for distribution of the game there. Interplay had also published Another World, which had turned into a big success Stateside, and the company’s head Brian Fargo was sure he saw similar potential in Alone in the Dark. He thus put the game through his company’s internal testing wringer, just as he had Another World; the French studios had their strengths, but such detail work didn’t tend to be among them. Raynal’s game became a much cleaner, much more polished experience thanks to Interplay’s QA team. Yet Bonnell still had big international ambitions for Infogrames, and he wasn’t willing to let such a remarkable game as this one share with Another World the fate of becoming known to American players simply as an Interplay title. Instead he convinced Fargo to accept a unique arrangement. Interplay and Infogrames each took a stake in a new shared American subsidiary known as I-Motion, under which imprint they published Alone in the Dark.

The game took North America by storm in early 1993, just as it had Europe a few months earlier. It was that rarest of things in games, a genuine paradigm shift; no one had ever seen one that played quite like this. Worldwide, it sold at least 400,000 copies, putting Infogrames on the map in the United States and other non-European countries in the process. Indeed, amidst the international avalanche of praise and punditry, perhaps the most gratifying press notice of all reached Frédérick Raynal’s ears from all the way off in Japan. Shigeru Miyamoto, the designer of Super Mario Bros. and many other iconic Nintendo classics, proclaimed Alone in the Dark to be, more so than any other game, the one he wished he could have come up with.


Arguably the creepiest visual in the game is the weird mannequin’s head of your own character. Its crudely painted expression rather smacks of Chucky the doll from the Child’s Play horror films.

Seen from the perspective of a modern player, however, the verdict on Alone in the Dark must be more mixed. Some historically important games transcend that status to remain vital experiences even today, still every bit as fun and playable as the day they were made. But others — and please forgive me the hoary old reviewer’s cliché! — haven’t aged as well. This game, alas, belongs to the latter category.

Today, in an era when 3D graphics have long since ceased to impress us simply for existing at all, those of Alone in the Dark are pretty painful to look at, all jagged pixels sticking out everywhere from grotesquely octagonal creatures. Textures simply don’t exist, leaving everything to be rendered out of broad swatches of single colors. And the engine isn’t even holistically 3D: the 3D characters move across pasted-on pre-rendered backgrounds, which looks decidedly awkward in many situations. (On the other hand, it could have been worse: Raynal first tried to build the backgrounds out of digitized photographs of a real spooky mansion, a truly unholy union that he finally had to give up on.) Needless to say, a comparison with the lovingly hand-drawn pixel art in the adventure games being put out by companies like LucasArts and Sierra during this period does the crude graphics found here no favors whatsoever. Some of the visuals verge on the unintentionally comical; one of the first monsters you meet was evidently meant to be a fierce dragon-like creature, but actually looks more like a sort of carnivorous chicken. (Shades of the dragon ducks from the original Atari Adventure…)

Dead again! Killed by… Prince during his Purple Rain period?

Then, too, the keyboard-only controls are clunky and unintuitive, and they aren’t made any less awkward by a fixed camera that’s constantly shifting about to new arbitrary locations as you move through the environment; some individual rooms have as many as nine separate camera angles. This is confusing as all get-out when you’re just trying to get a sense of the space, and quickly becomes infuriating when you’re being chased by a monster and really, really don’t have time to stop and adjust your thinking to a new perspective.

The more abstract design choices also leave something to be desired. Sudden deaths abound. The very first room of the game kills you when you step on a certain floorboard, and every book is either a source of backstory and clues or an instant game-ender; the only way to know which it is is to save your game and open it. Some of the puzzles are clever, some less so, but even those that are otherwise worthy too often depend on you standing in just the right position; if you aren’t, you get no feedback whatsoever on what you’re doing wrong, and are thus likely to go off on some other track entirely, never realizing how close you were to the solution. This fiddliness and lack of attention to the else in the “if, then, else” dynamic of puzzle design is a clear sign of a game that never got sufficiently tested for playability and solubility. At times, the game’s uncommunicativeness verges on the passive-aggressive. You’ll quickly grow to loathe the weirdly stilted message, “There is a mechanism which can be triggered here,” which the game is constantly spitting out at you as you gaze upon the latest pixelated whatsit. Is it a button? A knob? A keyhole? Who knows… in the end, the only viable course of action is to try every object in your inventory on it, then go back and start trying all the other objects you had to leave lying around the house thanks to your character’s rather brutal inventory limit.

Fighting is a strange, bloodless pantomime.

Yes, one might be able to write some of the game’s issues off as an aesthetic choice — as merely more ways to make the environment feel unsettling. Franck de Girolami, the second programmer on the development team as well as its project leader, has acknowledged using the disorienting camera consciously for just that purpose: “We realized that the camera angles in which the player was the most helpless were the best to bring in a monster. Players would instantly run for a view in which they felt comfortable.” While one does have to admire the team’s absolute commitment to the core concept of the game, the line between aesthetic choice and poor implementation is, at best, blurred in cases like this one.

And yet the fact remains that it was almost entirely thanks to that same commitment to its core concept that Alone in the Dark became one of the most important games of its era. Not a patch on a contemporary like Ultima Underworld as a demonstration of the full power and flexibility of 3D graphics — to be fair, it ran on an 80286 processor with just 640 K of memory while its texture-mapped, fully 3D rival demanded at least an 80386 with 2 MB — it remained conceptually unlike anything that had come before in daring to cast you as an ordinary mortal, weak and scared and alone, for whom any aspirations toward glory quickly turn into nothing more than a desperate desire to just escape the mansion. For all that it threw the Call of Cthulhu rules completely overboard, it retained this most fundamental aspect of its inspiration, bringing Chaosium’s greatest innovation to a digital medium for the first time. It’s not always impossible to kill the monsters in Alone in the Dark — often it’s even necessary to do so — but, with weapons and ammunition scarce and your health bar all too short, doing so never fails to feel like the literal death struggle it ought to. When you do win a fight, you feel more relieved than triumphant. And you’re always left with that nagging doubt in the back of the mind as you count your depleted ammo and drag your battered self toward the next room: was it worth it?


The legacy of this brave and important game is as rich as that of any that was released in its year, running along at least three separate tracks. We’ll begin with the subsequent career of Frédérick Raynal, its original mastermind.

The seeds of that career were actually planted a couple of weeks before the release of Alone in the Dark, when Raynal and others from Infogrames brought a late build of it to the European Computer Trade Show in London. There he met the journalist Dany Boolauck once again, learning in the process that Boolauck had switched gigs: he had left his magazine and now worked for Delphine Software, one of Infogrames’s French competitors. Delphine had recently lost the services of their biggest star: Éric Chahi, the auteur behind the international hit Another World. As his first assignment in his own new job, Boolauck had been given the task of replacing Chahi with a similarly towering talent. Raynal struck him as the perfect choice; he rather resembled Chahi in many respects, what with his very French aesthetic sensibility, his undeniable technical gifts, and his obsessive commitment to his work. Boolauck called in Paul de Senneville, the well-known composer who had launched Delphine Software as a spinoff from his record label of the same name, to add his dulcet voice to the mix. “We wish to place you in a setting where you will be able to create, where you will not be bullied, where we can make you a star,” said the distinguished older gentleman. “We want to give free rein to the fabulous talent you showed in Alone in the Dark.” When Raynal returned to Lyon to a reprimand from Bruno Bonnell for letting his game’s planned release date slip by a week, the contrast between his old boss and the possible new one who was courting him was painted all too clearly.

Much to Raynal’s dismay, Bonnell was already pushing him and the rest of the team that had made the first Alone in the Dark to make a sequel as quickly as possible using the exact same engine. One Friday just before the new year, Bonnell threw his charges a party to celebrate what he now believed would go down in history as the year when his struggling company turned the corner, thanks not least to Raynal’s game. On the following Monday morning, Raynal knocked on Bonnell’s office door along with three other members of the newly christened Alone in the Dark 2 team, including his most longstanding partner Didier Chanfray. They were all quitting, going to work for Delphine, Raynal said quietly. To their surprise, Bonnell offered to match Delphine’s offer, the first overt sign he’d ever given that he understood how talented and valuable they really were. But his counteroffer only prompted Delphine to raise the stakes again. Just after New Year’s Day, Bonnell bowed out of the bidding in a huff: “You want to leave? Goodbye!”

A couple of weeks later, the videogame magazine Génération 4 held an awards ceremony for the previous year’s top titles at Disneyland Paris. Everyone who had been involved with Alone in the Dark, both those who still worked at Infogrames and those who didn’t, was invited. When, as expected, it took the prize for top adventure game, Bruno Bonnell walked onto the stage to accept the award on behalf of his company. The departure of Raynal and crew being the talk of the industry, the room held its collective breath to see what would happen next. “My name is Bruno Bonnell,” he said from behind the rostrum. “I’d like to thank God, my dog, my grandmother, and of course the whole team at Infogrames for a beautiful project.” And with that he stumped offstage again.

It hadn’t been a particularly gracious acceptance speech, but Raynal and his colleagues nonetheless had much to feel good about. Dany Boolauck and Paul de Senneville were true to their word: they set Raynal up with a little auteur’s studio all his own, known as Adeline Software. They even allowed him to run it from Lyon rather than joining the rest of Delphine in Paris.

Naturally, all of the Alone in the Dark technology, along with the name itself and the Chaosium license (whatever that was worth), stayed with Infogrames. Raynal and his colleagues were thus forced to develop a new engine in the style of the old and to devise a fresh game idea for it to execute. Instead of going dark again, they went light. Released in 1994, Little Big Adventure (known as Relentless: Twinsen’s Adventure in North America) was a poetic action-adventure set in a whimsical world of cartoon Impressionism, consciously conceived by Raynal as an antidote to the ultra-violent Doom mania that was sweeping the culture of gaming at the time. He followed it up in 1997 with Little Big Adventure 2 (known as Twinsen’s Odyssey in North America). Although both games were and remain lovely to look at, Raynal still struggled to find the right balance between the art and the science of game design; both games are as absurdly punishing to play as they are charming to watch, with a paucity of save points between the countless places where they demand pin-point maneuvering and split-second timing. This sort of thing was, alas, something of a theme with the French games industry for far too many years.

This, then, is one legacy of Alone in the Dark. Another followed on even more directly, taking the form of the two sequels which Infogrames published in 1993 and 1994. Both used the same engine, as Bruno Bonnell had demanded in the name of efficiency, and both continued the story of the first game, with Edward Carnby still in the role of protagonist. (Poor Emily Hartwood got tossed by the wayside.) But, although Hubert Chardot once again provided their scripts, much of the spirit of the first game got lost, as the development team began letting the player get away with much more head-to-head combat. Neither sequel garnered as many positive reviews or sales as the original game, and Infogrames left the property alone for quite some time thereafter. A few post-millennial attempts to revive the old magic, still without the involvement of Raynal, have likewise yielded mixed results at best.

But it’s with Alone in the Dark’s third legacy, its most important by far, that we should close. For several years, few games — not even its own sequels — did much to build upon the nerve-wracking style of play it had pioneered. But then, in 1996, the Japanese company Capcom published a zombie nightmare known as Resident Evil for the Sony PlayStation console. “When I first played Resident Evil,” remembers Infogrames programmer Franck de Girolami, “I honestly thought it was plagiarism. I could recognize entire rooms from Alone in the Dark.” Nevertheless, Resident Evil sold in huge numbers on the consoles, reaching a mass market the likes of which Alone in the Dark, being available only on computers and the 3DO multimedia appliance, could never have dreamed. In doing so, it well and truly cemented the new genre that became known as survival-horror, which had gradually filtered its way up from the obscure works of a poverty-stricken writer to a niche tabletop RPG to a very successful computer game to a mainstream ludic blockbuster. Culture does move in mysterious ways sometimes, doesn’t it?

(Sources: the books La Saga des Jeux Vidéo by Daniel Ichbiah, Designers & Dragons: A History of the Roleplaying Game Industry, Volume 1 by Shannon Appelcline, and Alone in the Dark: The Official Strategy Guide by Johan Robson; Todd David Spaulding’s PhD thesis “H.P. Lovecraft & The French Connection: Translations, Pulps, and Literary History”; Computer Gaming World of February 1993; Amiga Format of June 1991; Edge of November 1994; Retro Gamer 98. Online sources include Adventure Europe’s interview with Frédérick Raynal, Just Adventure’s interview with Hubert Chardot, and the video of Frédérick Raynal’s Alone in the Dark postmortem at the 2012 Game Developers Conference. Note that many of the direct quotations in this article were translated by me into English from their French originals.

The original Alone in the Dark trilogy is available as a package download at GOG.com.)


Life Off the Grid, Part 1: Making Ultima Underworld

The 1980s was the era of the specialist in game development, when many of the most successful studios did just one or two things, but did them very, very well. For Infocom, that meant text adventures; for Sierra, graphic adventures; for MicroProse, military simulations; for SSI, strategic wargames and Dungeons & Dragons; for Epyx, joystick-twiddling sports and action games; for Origin, Ultima. When such specialists stepped outside of their comfort zones, the results were occasionally a triumph, but more often merely served to reemphasize their core competencies.

The most respected studios of the 1990s, however, tended toward more eclecticism. Developers like Dynamix and Westwood may have had their roots in the previous decade, but they really came into their own in this one, and did so with games of very diverse types. Westwood, for example, was happily making CRPGs, graphic adventures, real-time-strategy games, and Monopoly, for Pete’s sake, all virtually at the same time. Even the holdover specialists from the 1980s — those who were still standing — aggressively tried to diversify in the 1990s: Sierra moved into strategy games, MicroProse into CRPGs and graphic adventures, Origin into Wing Commander.

Still, if we look harder at many 1990s developers, we can find themes that bind together their output. In the case of Dynamix, we might posit that to be an interest in dynamic simulation, even when working in traditionally static genres like the graphic adventure. In that of Westwood, we can identify an even more pronounced interest in bringing the excitement of real time to traditionally turn-based genres like the CRPG and the wargame. And in the case of the studio we’ll be meeting for the first time today — Looking Glass Technologies, arguably the most respected and beloved 1990s studio of all — the binding thread is crystal clear. From beginning to end, they used the flexibility of 3D graphics to bring virtual environments to life in unprecedentedly immersive ways. Whether making a CRPG or a flight simulator, a first-person shooter or a first-person sneaker, this was their constant.


3D graphics were, one might say, baked right into Looking Glass’s DNA. Paul Neurath and Ned Lerner, its two eventual founders, met one another in 1978 in a computer-science course at Wesleyan University, where Neurath was studying environmental science, Lerner physics. For the course’s final project, they teamed up to make a 3D space game rendered in ASCII text. They got a B-minus on it only because their professor considered games to be beneath his course’s dignity.

After university, the two New England boys remained friends as they started their professional careers. When the home-computer craze got rolling in earnest, each bought an Apple II. They started experimenting, together and apart, on games much like the one they had written for that computer-science class, only implemented in real bitmap graphics, with a real joystick as a controller. These efforts culminated in a joint game known as Deep Space: Operation Copernicus, which they sold in 1985 to the publisher Sir-Tech, purveyors of the Wizardry CRPG series. Sir-Tech didn’t seem to know quite what to do with Neurath and Lerner’s very different sort of game, and it never escaped Wizardry’s long shadow. Nevertheless, the experience of making a game and getting paid for it — however modestly — lit a fire in both partners. Each went off to pursue his own agenda, but they remained in touch, keeping one another updated on their progress and often sharing code and technical tricks.

Initially, it was Ned Lerner who made the most determined effort to become a real commercial game developer. He formed a little company called Lerner Research, and started gathering like-minded souls around him. As fixated as ever on 3D graphics, he decided that an at least semi-realistic flight simulator would be a good application for the technology. The leading product of that type on the market, subLOGIC’s generically titled Flight Simulator, he considered akin to a “textbook lesson”; he envisioned a flight simulator of his own that would be more accessible and fun. He hired an aerodynamic engineer to design a flight model for his game, which would focus on high-performance military aircraft like the legendary SR-71 Blackbird rather than the little Cessna that was forever tooling around from airport to airport in subLOGIC’s simulator. In fact, his game would let you fly any of fourteen different airplanes, in contrast to its rival’s one, and would concentrate on goal-oriented activities — “Flight Instruction,” “Test Flight,” “Formation Flying,” or “Airplane Racing” — instead of just expecting you to choose a starting airport and do whatever tickled your fancy.

Chuck Yeager and Ned Lerner discuss the vagaries of aerodynamics.

Electronic Arts, who lacked a competitive flight simulator and were eager to get in on one of the industry’s fastest-growing segments, signed on as publisher. Unlike Sir-Tech, they knew the appeal of snazzy packaging and celebrity endorsements. They convinced Chuck Yeager to put his name on the product. This was quite the coup; Yeager, a World War II fighter ace and the first man to break the sound barrier, was by far the most famous pilot in the country, after having been brought to indelible life by the actor Sam Shepard in the recent hit movie The Right Stuff. It was a decidedly nervous group of nerds and businessmen who met this aerospace legend for the first time in March of 1987. Lerner:

As we were sitting there in the office, listening to the rain outside, Rich Hilleman, associate producer at EA, was first to spot the Blazer entering the parking lot (license plate “BELL X1”). A few minutes later, we heard the unmistakable West Virginia drawl outside the door, as pure and easygoing as the man on TV who sells spark plugs with a shotgun. For a brief second, I remembered the opening scene of Patton where George C. Scott steps forward, dressed to the teeth in full military regalia. The door suddenly opened, and there he was: wearing cowboy boots, blue jeans, and a polo shirt under his racing-style jacket. General Yeager had a trim figure, and his face was tan, well-weathered, as if he had spent a lot of time outdoors. The general stepped forward, shaking hands with the members of the group, but I sensed a certain degree of reservation in his actions.

To get past this awkward beginning, we loaded in the current version of Advanced Flight Trainer. I flew the simulator for a while, then offered to let General Yeager take over. “I never fooled with these things,” he said. “That’s because, you know, the damned things are so…” — he searched for the word — “…insignificant. If you want to really scorch something, hell, you can program the X-31 in there, the aerospace plane. Now, see, you got some kid who can say, ‘Man, this thing is smoking along at mach 25.'”

The ice had finally been broken, and we all began contributing to the conversation. After discussing the subjects of liquid-oxygen fuel and the current type of aircraft that are touching the edge of space, the day was practically over. “This thing’s pretty dang realistic,” he told us. “You’ve got a lot of goodies in there.”

Released about six months later with much publicity, Chuck Yeager’s Advanced Flight Trainer became by far EA’s biggest hit of the year, and one of their biggest of the whole decade. With that push to get them off and running, Lerner Research continued their work on the frontiers of 3D graphics, giving EA a substantially revised version 2.0 of their flagship game in 1989.

Even as Ned Lerner was hobnobbing with famous test pilots, Paul Neurath was making his own inroads with the games industry. Shortly after finishing Deep Space, he had heard that Origin Systems of Ultima fame was located in New Hampshire, not all that far from him. On a lark, he drove down one day to introduce himself and take the temperature of the place. He hit it off immediately with Richard Garriott and the rest of the crew there. While he never became a full-fledged employee, he did become a regular around the Origin offices, contributing play-testing, design ideas, and occasional bits of code to their games on a contract basis.

In early 1987, Richard Garriott, who loathed New England with every fiber of his being, packed up and moved back to Austin, Texas, with most of Origin’s technical and creative staff. He left behind his older brother and business manager Robert, along with the latter’s support staff of accountants, secretaries, and marketers. A few developers who for one reason or another didn’t want to make the move also stayed behind. Neurath was among this group.

At about this same time, Neurath got the green light to make a game all his own for Origin. Space Rogue began as another 3D space shooter — another Deep Space, enhanced with some of the latest graphics technology from his friends at Lerner Research. To this template Neurath grafted a trading economy, a customizable spaceship, and a real plot. The player was even able to exit her spaceship and wander around the space stations she visited, talking to others and taking on quests. There was a surprising amount of ambition in this fusion of Deep Space, Elite, and Ultima, especially considering that Neurath designed, wrote, and programmed it all almost single-handedly from New Hampshire while most of his friends at Origin pursued other projects down in Austin. Although its disparate parts don’t ever gel quite well enough to make it a true classic, it’s remarkable that it works as well as it does.

Space Rogue sold in moderate numbers upon its release in 1989. More importantly in terms of gaming history, Chris Roberts of Origin spent a lot of time with it. Its melding of a space shooter with an adventure-game-like plot became one of the inspirations behind Wing Commander, the first Origin game to fully escape the shadow of Ultima — and, indeed, the beginning of one of the blockbuster franchises of the 1990s.

Space Rogue‘s hilarious cover art, with its artfully pouting male model who looks better suited to a Harlequin-romance cover. Paul Neurath remembers Origin’s marketing department asking him about his packaging preferences for his game. He said he would prefer a “non-representational” cover picture. Naturally, the marketers delivered about the most representational thing imaginable.

By the time of Space Rogue‘s release, Paul Neurath was a lonelier game developer than ever. In January of 1989, the last remnants of Origin’s New Hampshire operation had moved to Austin, leaving Neurath stranded in what Richard Garriott liked to call “the frozen wastes of New England.” For him, this was a crossroads of life if ever there was one. Did he want to continue to make games, and, if so, how? Sure, he could probably move down to Austin and get a job with Origin, but, truth be told, he had no more desire to live in Texas than Garriott had to live in New England. But how else could he stay in games?

At last, Neurath decided to take a page from Ned Lerner’s book. He would put together his own little company and try to establish it as an independent studio; after all, it had worked out pretty well for Ned so far. He registered his company under the name of Blue Sky Productions.

Neurath had always loved the CRPG genre, ever since Wizardry had become one of the first games he bought for his new Apple II. That love had once led him to publish Deep Space through Sir-Tech, and sent him out to Origin’s New Hampshire offices for that fateful visit. Now, he dreamed of taking the first-person dungeon crawl beyond the turn-based Wizardry, even beyond the real-time but still grid-based Dungeon Master, the state of the art in the genre as the 1980s expired. On a visit to Lerner Research, he saw the technology that he believed would make the genre’s next step possible — the foundation, one might even say, for everything he and his fabled studio Looking Glass would do in the 1990s. What he saw was the first 3D texture mapper that was suitable for use in an ordinary computer game.

3D graphics were hardly unknown on personal computers of the 1980s, as can be seen not least through the early careers of Ned Lerner and Paul Neurath. Yet, being enormously taxing to implement in the context of an interactive game, they demanded a lot of aesthetic compromise. Some early 3D games, such as Elite and the first versions of subLogic’s Flight Simulator, didn’t draw in the surfaces of their polygons at all, settling for wire frames. With the arrival of more powerful 16-bit computers in the mid-1980s, filled surfaces became more common in 3D games, but each side of a polygon was drawn in a single color. Combine this fact with the low polygon count that was still necessitated by the hardware of the time — resulting in big, fairly crude polygons — and you had a recipe for blotchy landscapes made up of garishly clashing primary colors.

A few clever developers were able to turn the limitations of 3D graphics into an aesthetic statement in its own right. But most of those who used them — among them makers of flight simulators and space shooters, such as Lerner and Neurath — suffered with their limitations because there just wasn’t any practical alternative for the sorts of games they were making. For an out-the-cockpit view from an airplane, the aesthetic compromises necessitated by going 3D were just about acceptable, given the way the distant landscape below tends to blur into hazy abstractions of color even in real life. But for a more personal, embodied experience, such as a first-person dungeon crawl, real-time 3D graphics were just too crude, too ugly. You couldn’t render the grain of a wooden door or the patina of a stone wall as one uniform splotch of color and expect to get away with it — not with the way that gamers’ audiovisual expectations were increasing every year.

A screenshot from Dungeon Master, the state of the art in dungeon crawls at the end of the 1980s. Notice how the walls, floor, and ceiling are shown in aesthetically pleasing detail. This was possible because movement in Dungeon Master was still based on a grid, which meant that each view could be assembled from pre-rendered component parts rather than needing to be rendered from scratch. A free-scrolling, real-time-rendered 3D version would have had to replace all of this detail with great uniform slabs of gray in order to run at an acceptable speed. The result, needless to say, would not have been pretty.

None of these problems were unknown to academic computer-graphics researchers; they’d been wrestling with them since well before the first personal computer hit the market. And they’d long since come up with a solution: texture mapping. The texture in question takes the form of an ordinary image file, which might be drawn by hand or digitized from a real-world photograph. A texture suitable for a wooden door, for example, could be an extreme closeup of any slab of wood. The texture is “glued” onto a polygon’s face in lieu of a solid color. Just like that, you suddenly have doors that look like real doors, slimy dungeon walls that look like real slimy dungeon walls.

The problem with texture mapping from the perspective of game development was the same one that haunted the whole field of 3D graphics: the problem of performance. Simple though the basic concept is, a lot of tricky math comes into play when one introduces textures; figuring out how they should wrap and fit together with one another over so many irregular polygonal surfaces is much more complicated than the lay observer might initially believe. At a time when just managing to paint the sides of your polygons in solid colors while maintaining a respectable frame rate was a real achievement, texture mapping was hopeless. Maybe it could be used in another decade or so, said the conventional wisdom, when Moore’s Law put a supercomputer on every desk.

But one recent arrival at Lerner Research wasn’t so sure that texture mapping was impossible using extant PC hardware. Chris Green had considerable experience with interactive 3D graphics, having spent several years at subLogic working on products like Flight Simulator and Jet. He arrived at Lerner Research knowing that texture mapping couldn’t be done on the likes of an 8-bit Apple II, the computer on which Neurath and Lerner among so many others had gotten their start. On the latest 16- and 32-bit MS-DOS hardware, however… he suspected that, with the right compromises, he could make it work there.

There was doubtless much efficient code in the texture mapper Green created, but it was indeed an unabashed compromise that made it feasible to attempt at all. The vertices of the polygons in a 3D graphics system are defined with an X, a Y, and a Z coordinate; it’s this last, of course, that makes such a system a 3D system at all. And it’s also the Z coordinate that is the source of all of the complications relating to 3D graphics in general. Nowhere is this more true than in the case of texture mapping. To do it correctly, textures have to be scaled and transformed to account for their position in relation to the viewing location, as largely defined by their Z coordinate. But Green didn’t bother to do texture mapping correctly; he effectively threw away the Z coordinate and glued his textures onto their polygons as if they were in a 2D space. This technique would come to be known inside the industry as “affine texture mapping.” It yielded an enormous increase in rendering speed, balanced by a degree of distortion that was almost unnoticeable in some situations, very noticeable indeed in others. Still, an imperfect texture mapper, Green decided, was better than no texture mapper at all.

The video clip above, from the finished game of Ultima Underworld, shows some of the spatial distortion that results from affine texture mapping, especially when viewing things from a very short distance. Moving through the game’s virtual space can look and feel a bit like moving through real space after having drunk one beer too many. Nonetheless, the environment is far more realistic, attractive, and immersive than any first-person 3D environment to appear in any game before this one.

Ned Lerner had recently signed a contract with EA to make a driving game bearing the name of Car and Driver magazine. Knowing the technology’s limitations, he planned to use Chris Green’s texture mapper in a somewhat constrained fashion therein, to draw onto the faces of billboards and the like. Yet he wasn’t averse to sharing it with Paul Neurath, who as soon as he saw it wanted to use it to take the next step beyond Dungeon Master.

To do so, however, he’d need more programmers, not to mention artists and all the rest; if there was one thing the two years or so he had spent making Space Rogue had taught him, it was that the days of the one-man development team were just about over. Luckily, a friend of his had a nephew who had a friend who was, as Neurath would be the first to admit, a far better programmer than he would ever be.

Doug Church was an MIT undergraduate who had let himself get so consumed by the fun going on inside the university’s computer labs that it had all but derailed his official education. He and his buddies spent hours every day hacking on games and playing them. Their favorite was a 3D tank game called Xtank, written by one of their number, a fellow student named Terry Donahue. They tinkered with its code endlessly, producing variations that departed radically from the original concept, such as a Frisbee simulator. When not coding or playing, they talked about what kinds of games they would like to make, if only they had infinite amounts of money and time and no hardware limitations whatsoever. They envisioned all sorts of little simulated worlds, rendered, naturally, in photo-realistic 3D graphics. Thus when Neurath introduced himself to Church in early 1990 and asked if he’d like to work on a free-scrolling, texture-mapped 3D dungeon crawl running in real time, he dropped his classes and rushed to get in on the chance. (Terry Donahue would doubtless have been another strong candidate to become lead programmer on the project, but he felt another calling; he would go on to become a priest.)

Neurath also found himself an artist, a fellow named Doug Wike who had worked on various projects for Origin in New Hampshire before those offices had been shuttered. Together the three men put together a crude non-interactive demo in a matter of weeks, showing the “player” moving up a texture-mapped dungeon corridor and bumping into a monster at the end of it. At the beginning of June, they took the demo to the Summer Consumer Electronics Show, where, behind all of the public-facing hype, many of the games industry’s most important deals got made.

As Neurath tells the story, the response from publishers was far from overwhelming. The demo was undeniably crude, and most were highly skeptical whether this unproven new company could get from it to a real, interactive game. It turned out that the only publisher willing to give the project any serious consideration at all was none other than Neurath’s old friends from Origin.

That Neurath hadn’t taken his idea to Origin straight away was down to his awareness of a couple of strategic decisions that had recently been made there, part of a whole collection of changes that were being made to greet the new decade’s challenges. Origin had, first of all, decided to stop giving contracts to outside developers, taking all development in-house so as to have complete control over the products they released. And secondly, they had decided, for the time being anyway, to make all of their output fit into one of two big franchises, Ultima and Wing Commander. Both of these decisions would seem to exclude Blue Sky’s proposed dungeon crawler, which they were calling simply Underworld, from becoming an Origin product. Nor did it help that a sexy public demonstration of the first Wing Commander game[1] had become the hit of the show, making it difficult for Origin to focus on anything else; they could practically smell the money they were about to make from their new franchise.

Luckily, Blue Sky and Underworld found a champion at Origin even amidst all the distractions. Warren Spector was a relatively recent arrival at the company, but Neurath knew him pretty well; as his very first task for Origin, Spector had spent about a month expanding and polishing the text in Space Rogue just before its release. Now, looking at Underworld, he was sure he saw not just a game with real commercial potential but a technologically and aesthetically important one. “I was blown away,” he says today. “I remember thinking as I watched that demo that the world had just changed.” Spector convinced his colleagues to take a chance, to violate their rule of in-house development and sign a contract with Blue Sky, giving them a modest advance of $30,000. If the game worked out, they might be in on the ground floor of something major. It might also be something they could brand with the Ultima name, make into the beginning of a whole new sub-series in the franchise — a revival of the first-person (albeit turn-based) dungeons that had been in every Ultima through Ultima V. And if it didn’t work out, the $30,000 they’d lose on the flier was far from a fortune. The deal was done.

With that mission accomplished, Neurath’s little team returned to the office space he’d rented for them in New Hampshire. They spent almost a year there trying to understand the new set of technical affordances which Chris Green’s texture mapper had put at their disposal. They didn’t invent anything fundamentally new in terms of 3D graphics technology during that time. Like the texture mapper which spawned the project, everything they put into Underworld could be found in any number of books and journals at the MIT library, many of them dating well back into the 1970s and even 1960s. It was just a matter of adapting it all to the MS-DOS architecture. As it happened, the hardware they had to work with was about equal to the cutting-edge research workstations of a decade earlier, so the old journal articles they pored over actually made a pretty good fit to it.

They kept coming back to the theme of embodiment, what Neurath called “a feeling of presence beyond what other games give you.” None of the earlier dungeon crawlers — not even those in the Dungeon Master tradition that ran in real time — had been able to deliver this. They could be exciting, stressful, even terrifying, but they never gave you the feeling of being physically embodied in their environments. It was the difference between reading a book or watching a movie and really being someplace.

It went without saying that Underworld must place you in control of just one character rather than the usual party of them. You needed to be able to sense the position of “your” body and limbs in the virtual space. Neurath:

We wanted to get a feeling that you were really in this dungeon. What would you expect to do in a dungeon? You might need to jump across a narrow chasm. You might expect to batter down a wooden door. You might expect to look up if there was a precipice above you. All these sorts of physical activities. And we tried to achieve, at least to a reasonable degree, that kind of freedom of motion and freedom of action. That really extended the R&D stage. It was about nine months, even a year, before we had all the underlying technology in place that allowed us to visualize this fantasy universe in a manner that we felt was appropriate and would work well and would allow the player the freedom to maneuver around and perform different kinds of actions.

Over the course of this time, Neurath hired only one more programmer, Jonathan “J.D.” Arnold, who had previously worked on Infocom’s Z-Machine technology in that company’s twilight years. But finally, in the late spring of 1991, with the basic interface and the basic technical architecture all in place, Neurath decided it was time to hire some more people and make a real game out of it all. Doug Church immediately thought of his old friends back at MIT, and Neurath had no objections to recruiting from that pool; they were smart and passionate and, just as importantly, they were all happy to work for peanuts. Given the time of year it was, Church’s old buddies were all either graduating or finishing up their semester of coursework, leaving them free to come to Blue Sky.

None of these people had ever worked on a commercial computer game before. In fact, most of them hadn’t even played any commercial computer games recently, having been ensconced for the last several years inside the ivory tower of MIT, where the nature of gaming was markedly different, being a culture of creation rather than strictly one of consumption. And yet, far from being a disadvantage, the team’s sheer naivete proved to be the opposite, making them oblivious to the conventional wisdom about what was possible. Doug Church:

I had actually played Space Rogue because one of my friends had a Mac, but the clusters [at MIT] were all Unix boxes so I ran X-Trek and NetHack and things, but I hadn’t played a PC game in five years or something. So we just said, “Let’s do a really cool dungeon game in 3D, let’s go.” It’s interesting because a lot of people talk about how we were doing such a Dungeon Master game, but as far as I know none of us had ever played Dungeon Master. We didn’t have any idea we were doing anything that wasn’t just obvious in some sense because we had no context and the last time any of us had played a [commercial] game was back when we were fourteen. We played games in college, but they were very different; you’re playing networked X-Trek or something, it doesn’t feel like a home-computer game.

At first, the new arrivals all crowded into the New Hampshire office Neurath was renting. But most of them were actually living together in a rambling old three-story house in Cambridge, Massachusetts, and it struck them as silly to make the drive out to New Hampshire every day. They soon convinced Neurath to let them work on the game from home. From dawn until night, seven days a week, they ate, drank, slept, and breathed Underworld there.

At a time when most studios had begun to systematize the process of game development, dividing their employees into rigid tiers of specialists — programmers, artists, designers, writers — Blue Sky made a virtue of their complete lack of formal organization. It was an org-chart-wielding middle manager’s nightmare; just about everybody wound up doing a little bit of everything. There was nothing like a designer giving instructions to a technical team. Instead, Blue Sky’s method of working was more akin to the way that things got done among the hackers at MIT — a crowd of equals pulling together (and occasionally pulling apart) to work toward a common goal. Anyone could contribute absolutely anywhere, knowing his ideas would be judged only on their intrinsic worth.

When it became clear that it was time to start making the actual dungeon the Underworld player would have to explore, the team divided up this design work in the most democratic manner imaginable: everybody made one level, then they were all combined together to make the eight-level final dungeon. Dan Schmidt, who had officially been hired for the role of “AI programmer,” agreed to take on the mantle of “writer,” which really meant coordinating with everyone to merge the levels into a seamless whole.

For most of the time the game was in development, Origin’s role and overall interest — or, rather, lack thereof — was a consistent sore spot. It often seemed to Blue Sky that the folks in Austin had entirely forgotten their existence way off in the frozen wastes of New England. This was good in the sense that they got to make exactly the game they wanted to make, but it didn’t do much for their confidence that a committed publisher would be ready and eager to market it properly when they were done. Warren Spector was busy with Wing Commander and, later, with an Ultima spinoff called Martian Dreams, so Origin initially assigned Jeff Johannigman to Blue Sky in the role of producer. Communication with him was nothing short of terrible. After going two full months without hearing a peep from him, Neurath tried to call him down in Austin, only to be told that he had left the company. A second producer was finally selected, but he wasn’t much more engaged. Blue Sky believed they were making a great, groundbreaking game, but it seemed that Origin really couldn’t care less.

In many ways, Underworld was at odds with the prevailing trends inside Origin, not to mention in much of the games industry at large. Following the huge success of the first Wing Commander, Origin was banking heavily on cinematic games with big, set-piece storylines. The company’s org chart reflected the new impetus, with film-making terminology — producer, director, screenwriter — shoehorned in absolutely everywhere. Blue Sky, on the other hand, was making something very different, an immersive, emergent, non-linear experience without cut scenes or chapter breaks. Yes, there was a plot of sorts — the player got cast into a dungeon to rescue a princess or die trying — along with puzzles to be solved, quests to be fulfilled, and other characters to be spoken to, but it was all driven by the player, not by any relentlessly unspooling Hollywood-style script. Origin, it seemed, wasn’t quite sure what to make of it, wasn’t quite sure where it fit. And certainly it’s easy enough, given Blue Sky’s unorthodox working methods, to understand why so many at Origin were skeptical of their ability to deliver a finished game at all.

The danger of Blue Sky’s approach was that they would keep iterating endlessly as they kept having better and better ideas. This tendency among hackers to never be able to finish something and walk away from it had already derailed more than one promising games studio — not least among them FTL, the makers of the storied Dungeon Master, who had yet to release a proper followup after some four years. (Dungeon Master II wouldn’t finally arrive until 1995.) The need to finish games on a timetable was, one might say, the reason that industry executives had begun to impose the very organizational structures that Blue Sky was now so happily eschewing. Doug Church remembers creating “four movement systems and three combat systems because we’d just write something: ‘Oh, this seems cool, go for it.'” Would they just continue chasing whatever shiny objects struck their fancy until the money ran out? That wouldn’t take much longer, given that Paul Neurath was largely financing the whole effort out of his pocket, with some help from his ever-loyal friend Ned Lerner, whose success with his Chuck Yeager flight simulators had left him with a bit of money to spare.

Thus they were all fortunate that Warren Spector, their once and future savior, suddenly arrived back on the scene late in 1991. Virtually alone among his colleagues down in Austin, Spector had been watching Blue Sky’s progress with intense interest. Now, having finished up Martian Dreams, he got himself assigned as Underworld‘s third producer. He had considerable clout inside the bigger company; as soon as he started to press the issue there, things started to happen on Origin’s side to reassure Blue Sky that their game would in fact be released if they could only deliver it.

Indeed, after almost eighteen months of uncertainty on the question, Origin finally made it official that, yes, Underworld would be released as an Ultima game. As usual, the star would be the Avatar, who was becoming quite a busy fellow between this game, the mainline Ultima games, and the recent pair of Worlds of Ultima spinoffs. The dungeon in question, meanwhile, would be none other than the Stygian Abyss, where the Avatar had found the Codex of Ultimate Wisdom at the end of Ultima IV. Underworld‘s backstory would need to be bent and hammered enough to make this possible.

Blue Sky soon discovered that becoming an official Ultima game, while great for marketing purposes and for their own sense of legitimacy, was something of a double-edged sword. Origin demanded that they go back through all the text in the game to insert Ultima‘s trademark (and flagrantly misused) “thees” and “thous,” provoking much annoyance and mockery. And Origin themselves made a cinematic introduction for the game in Austin, featuring Richard Garriott, one of the industry’s worst voice actors of all time — and that, friends, is really saying something — in the leading role, bizarrely mispronouncing the word “Stygian.” It seems no one at Origin, much less at Blue Sky, dared to correct Lord British’s diction… (The British magazine PC Review‘s eventual reaction to the finished product is one for the ages: “I had to listen to it two or three times before I fully grasped what was going on because for the first couple of times I was falling about laughing at the badly dubbed Dick Van Dyke cockney accents that all these lovable Americans think we sound like. You know: ‘Awlright, Guv’noor, oop the happle un stairs!'”)

While Origin made the dodgy intro in Texas, Warren Spector got everybody in New England focused on the goal of a finished, shipped game. Doug Church:

Not only was he [Spector] great creatively to help us put finishing touches on it and clean it up and make it real, but he also knew how to finish projects and keep us motivated and on track. He had that ability to say, “Guys, guys, you’re focused in totally the wrong place.” He had that ability to help me and the rest of the guys reset, from the big-picture view of someone who has done it before and was really creative, but who also understood getting games done. It was a huge, huge win.

It’s very easy in hacker-driven game development to wind up with a sophisticated simulation that’s lots of fun for the programmers to create but less fun to actually play. Spector was there to head off this tendency as well at Blue Sky, as when he pared down an absurdly complex combat system to something simple and intuitive, or when he convinced the boys not to damage the player’s character every time he accidentally bumped into a wall. That, said Spector, “doesn’t sound like fun to me” as a player — and it was the player’s fun, he gently taught Blue Sky, that had to be the final arbiter.

At Spector’s behest, Neurath rented a second office near Boston — officially known as the “Finish Underworld Now” office — and insisted that everyone leave the house and come in to work there every day during the last two months of the project. The more businesslike atmosphere helped them all focus on getting to the end result, as did Spector himself, who spent pretty much all of those last two months in the office with the team.

Spector did much to make Blue Sky feel like a valued part of the Origin family, but the relationship still remained rocky at times — especially when the former learned that the latter intended to release Ultima Underworld just two weeks before Ultima VII, the long-awaited next title in the franchise’s main series. It seemed all but certain that their game would get buried under the hype for Ultima VII, would be utterly forgotten by Origin’s marketers. Certainly marketing’s initial feedback hadn’t been encouraging. They were, they said, having trouble figuring out how to advertise Ultima Underworld. Its graphics were spectacular when seen in motion, but in still screenshots they didn’t look like much at all compared to a Wing Commander II or an Ultima VII. Blue Sky seethed with frustration, certain this was just an excuse for an anemic, disinterested advertising campaign.

In Origin’s defense, the problem their marketers pointed to was a real one. And it wasn’t really clear what they could have done about the release-date issue either. The original plan had been, as they didn’t hesitate to remind Blue Sky, to release Ultima Underworld in time for the Christmas of 1991, but the protracted development had put paid to that idea. Now, Blue Sky themselves needed Ultima Underworld to come out as quickly as possible because they needed the royalties in order to survive; for them, delaying it was simply impossible. Meanwhile Origin, who had cash-flow concerns of their own, certainly wasn’t going to delay Ultima VII, quite possibly the most expensive computer game ever made to that point, for a mere spinoff title. The situation was what it was.

The balloons fly as Doug Church, Paul Neurath, and Warren Spector celebrate Ultima Underworld‘s release.

Whatever was to happen in terms of sales, Blue Sky’s young hackers did get the satisfaction in late March of 1992 of seeing their game as a boxed product on store shelves, something more than one of them has described as a downright surreal experience. Dan Schmidt:

We were a bunch of kids straight out of school. This was the first professional project we’d ever done. We felt lucky that anyone would see it at all. We’d go into a games store and see our game there on the shelf. Someone would walk up to it, and we’d want to say, “No! No! You don’t want to buy that! We just hacked that together. It’s not, like, a real game.”

In the beginning, sales went about as expected. A snapshot from Origin’s in-house newsletter dated July 31, 1992, shows 71,000 copies of Ultima VII shipped, just 41,000 copies of Ultima Underworld. But, thanks to ecstatic reviews and strong word of mouth — Origin may have struggled to see how groundbreaking the game really was, but gamers got it immediately — Ultima Underworld kept on selling, getting stronger every month. “It was the first game that ever gave me a sense of actually being in a real place,” wrote one buyer in a letter to Origin, clear evidence that Blue Sky had absolutely nailed their original design goal. Soon industry scuttlebutt had it outselling Ultima VII by two to one. Paul Neurath claims that Ultima Underworld eventually sold more than half a million copies worldwide, an extraordinary figure for the time, and considerably more than Ultima VII or, indeed, any previous Ultima had managed.

Shortly after Ultima Underworld‘s release, Paul Neurath and Ned Lerner finally did the obvious: they merged their two companies. They had recently discovered that another, slightly older company was already operating under the name of “Blue Sky Software,” making educational products. So, they named the merged entity Looking Glass Technologies. Their first release under the name would be Ultima Underworld II.

Two months after the first Ultima Underworld appeared, a tiny company out of Dallas, Texas, who called themselves id Software released Wolfenstein 3D, another first-person game set in a 3D environment. Their game, however, had none of the complexity of Ultima Underworld, with its quests and puzzles and magic spells and its character to develop and even feed. In id’s game, you ran through the environment and killed things — period.

For the remainder of the 1990s, 3D games would exist on a continuum between the cool, high-concept innovation of Looking Glass and the hot, visceral action of id, who were interested in innovation in the area of their graphics technology but somewhat less so in terms of their basic gameplay template. id would win the argument in terms of sales, but Looking Glass would make some of the most fascinating and forward-looking games of the decade. “We were thinking, ‘Why don’t we just run around and shoot?’” says Austin Grossman, another early Looking Glass employee. “But we were interested in simulation and depth. We were driven by this holy grail of simulated worlds that enabled choice and creativity on the part of the player.”

We’ll be following the two companies’ artistic dialog for a long time to come as we continue with this history. First, though, we need to give Ultima Underworld a closer look, from the perspective of the player this time, to understand why it’s not just an example of groundbreaking technology but a superb example of pure game design as well.

(Sources: the books Game Design: Theory & Practice 2nd edition by Richard Rouse III, Ultima VII and Underworld: More Avatar Adventures by Caroline Spector, Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, and Principles of Three-Dimensional Computer Animation: Modeling, Rendering, and Animating with 3D Computer Graphics by Michael O’Rourke; Questbusters of February 1992 and September 1992; PC Review of June 1992; Game Developer of April/May 1995, June/July 1995, August/September 1995, December 1995/January 1996, and April/May 1996; Commodore Magazine of January 1988; Origin’s internal newsletter Point of Origin from January 17 1992, March 27 1992, May 8 1992, August 28 1992, and December 18 1992. Online sources include “Ahead of Its Time: A History of Looking Glass” on Polygon, an interview with Paul Neurath and Doug Church on the old Ultima Online site, Gambit Game Lab’s interviews with Paul Neurath and Dan Schmidt, and Matt Barton’s interview with Paul Neurath. My thanks to Dan Schmidt and Ned Lerner for making the time to talk with me personally about their careers.

Ultima Underworld and its sequel can be purchased from GOG.com.)

Footnotes

1 Wing Commander was actually still known as Wingleader at this time.
 
 


Star Control II

In this vaguely disturbing picture of Toys for Bob from 1994, Paul Reiche is at center and Fred Ford to the left. Ken Ford, who joined shortly after Star Control II was completed, is to the right.

There must have been something in the games industry’s water circa 1992 when it came to the subject of sequels. Instead of adhering to the traditional guidelines — more of the same, perhaps a little bigger — the sequels of that year had a habit of departing radically from their predecessors in form and spirit. For example, we’ve recently seen how Virgin Games released a Dune II from Westwood Studios that had absolutely nothing to do with the same year’s Dune I, from Cryo Interactive. But just as pronounced is the case of Accolade’s Star Control II, a sequel which came from the same creative team as Star Control I, yet which was so much more involved and ambitious as to relegate most of what its predecessor had to offer to the status of a mere minigame within its larger whole. In doing so, it made gaming history. While Star Control I is remembered today as little more than a footnote to its more illustrious successor, Star Control II remains as passionately loved as any game from its decade, a game which still turns up regularly on lists of the very best games ever made.



Like those of many other people, Paul Reiche III’s life was irrevocably altered by his first encounter with Dungeons & Dragons in the 1970s. “I was in high school,” he remembers, “and went into chemistry class, and there was this dude with glasses who had these strange fantasy illustrations in front of him in these booklets. It was sort of a Napoleon Dynamite moment. Am I repulsed or attracted to this? I went with attracted to it.”

In those days, when the entire published corpus of Dungeons & Dragons consisted of three slim, sketchy booklets, being a player all but demanded that one become a creator — a sort of co-designer, if you will — as well. Reiche and his friends around Berkeley, California, went yet one step further, becoming one of a considerable number of such folks who decided to self-publish their creative efforts. Their most popular product, typed out by Reiche’s mother on a Selectric typewriter and copied at Kinko’s, was a book of new spells called The Necromican.

That venture eventually crashed and burned when it ran afoul of that bane of all semi-amateur businesses, the Internal Revenue Service. It did, however, help to secure for Reiche what seemed the ultimate dream job to a young nerd like him: working for TSR itself, the creator of Dungeons & Dragons, in Lake Geneva, Wisconsin. He contributed to various products there, but soon grew disillusioned by the way that his own miserable pay contrasted with the rampant waste and mismanagement around him, which even a starry-eyed teenage RPG fanatic like him couldn’t fail to notice. The end came when he spoke up in a meeting to question the purchase of a Porsche as an executive’s company car. That got him “unemployed pretty dang fast,” he says.

So, he wound up back home, attending the University of California, Berkeley, as a geology major. But by now, it was the 1980s, and home computers — and computer games — were making their presence felt among the same sorts of people who tended to play Dungeons & Dragons. In fact, Reiche had been friends for some time already with one of the most prominent designers in the new field: Jon Freeman of Automated Simulations, designer of Temple of Apshai, the most sophisticated of the very early proto-CRPGs. Reiche got his first digital-game credit by designing The Keys of Acheron, an “expansion pack” for Temple of Apshai‘s sequel Hellfire Warrior, for Freeman and Automated. Not long after, Freeman had a falling-out with his partner and left Automated to form Free Fall Associates with his wife, programmer Anne Westfall. He soon asked Reiche to join them. It wasn’t a hard decision to make: compared to the tabletop industry, Reiche remembers, “there was about ten times the money in computer games and one-tenth the number of people.”

Freeman, Westfall, and Reiche made a big splash very quickly, when they were signed as one of the first group of “electronic artists” to join a new publisher known as Electronic Arts. Free Fall could count not one but two titles among EA’s debut portfolio in 1983: Archon, a chess-like game where the pieces fought it out with one another, arcade-style, under the players’ control; and Murder on the Zinderneuf, an innovative if not entirely satisfying procedurally-generated murder-mystery game. While the latter proved to be a slight commercial disappointment, the former more than made up for it by becoming a big hit, prompting the trio to make a somewhat less successful sequel in 1984.

After that, Reiche parted ways with Free Fall to become a sort of cleanup hitter of a designer for EA, working on whatever projects they felt needed some additional design input. With Evan and Nicky Robinson, he put together Mail Order Monsters, an evolution of an old Automated Simulations game of monster-movie mayhem, and World Tour Golf, an allegedly straight golf simulation to which the ever-whimsical Reiche couldn’t resist adding a real live dinosaur as the mother of all hazards on one of the courses. Betwixt and between these big projects, he also lent a helping hand to other games: helping to shape the editor in Adventure Construction Set, making some additional levels for Ultimate Wizard.

Another of these short-term consulting gigs took him to a little outfit called Binary Systems, whose Starflight, an insanely expansive game of interstellar adventure, had been in production for a couple of years already and showed no sign of being finished anytime soon. This meeting would, almost as much as his first encounter with Dungeons & Dragons, shape the future course of Reiche’s career, but its full import wouldn’t become clear until years later. For now, he spent two weeks immersed in the problems and promise of arguably the most ambitious computer game yet proposed, a unique game in EA’s portfolio in that it was being developed exclusively for the usually business-oriented MS-DOS platform rather than a more typical — and in many ways more limited — gaming computer. He bonded particularly with Starflight‘s scenario designer, an endlessly clever writer and artist named Greg Johnson, who was happily filling his galaxy with memorable and often hilarious aliens to meet, greet, and sometimes beat in battle.

Reiche’s assigned task was to help the Starflight team develop a workable conversation model for interacting with all these aliens. Still, he was thoroughly intrigued with all aspects of the project, so much so that he had to be fairly dragged away kicking and screaming by EA’s management when his allotted tenure with Binary Systems had expired. Even then, he kept tabs on the game right up until its release in 1986, and was as pleased as anyone when it became an industry landmark, a proof of what could be accomplished when designers and programmers had a bigger, more powerful computer at their disposal — and a proof that owners of said computers would actually buy games for them if they were compelling enough. In these respects, Starflight served as nothing less than a harbinger of computer gaming’s future. At the same time, though, it was so far out in front of said future that it would stand virtually alone for some years to come. Even its sequel, released in 1989, somehow failed to recapture the grandeur of its predecessor, despite running in the same engine and having been created by largely the same team (including Greg Johnson, and with Paul Reiche once again helping out as a special advisor).

Well before Starflight II‘s release, Reiche left EA. He was tired of working on other people’s ideas, ready to take full control of his own creative output for the first time since his independent tabletop work as a teenager a decade before. With a friend named Fred Ford, who was the excellent programmer Reiche most definitely wasn’t, he formed a tiny studio — more of a partnership, really — called Toys for Bob. The unusual name came courtesy of Reiche’s wife, a poet who knew the value of words. She said, correctly, that it couldn’t help but raise the sort of interesting questions that would make people want to look closer — like, for instance, the question of just who Bob was. When it was posed to him, Reiche liked to say that everyone who worked on a Toys for Bob game should have his own Bob in mind, serving as an ideal audience of one to be surprised and delighted.

Reiche and Ford planned to keep their company deliberately tiny, signing only short-term contracts with outsiders to do the work that they couldn’t manage on their own. “We’re just people getting a job done,” Reiche said. “There are no politics between [us]. Once you start having art departments and music departments and this department and that department, the organization gets a life of its own.” They would manage to maintain this approach for a long time to come, in defiance of all the winds of change blowing through the industry; as late as 1994, Toys for Bob would permanently employ only three people.

Yet Reiche and Ford balanced this small-is-beautiful philosophy with a determination to avoid the insularity that could all too easily result. They made it a policy to show Toys for Bob’s designs-in-progress to many others throughout their evolution, and to allow the contractors they hired to work on them the chance to make their own substantive creative inputs. For the first few years, Toys for Bob actually shared their offices with another little collective who called themselves Johnson-Voorsanger Productions. They included in their ranks Greg Johnson of Starflight fame and one Robert Leyland, whom Reiche had first met when he did the programming for Murder on the Zinderneuf — Anne Westfall had had her hands full with Archon — back in the Free Fall days. Toys for Bob and Johnson-Voorsanger, these two supposedly separate entities, cross-pollinated one another to such an extent that they might almost be better viewed as one. When the latter’s first game, the cult-classic Sega Genesis action-adventure ToeJam & Earl, was released in 1991, Reiche and Ford made the credits for “Invaluable Aid.” And the influence which Leyland and particularly Johnson would have on Toys for Bob’s games would be if anything even more pronounced.

Toys for Bob’s first game, which they developed for the publisher Accolade, was called Star Control. With it, Reiche looked all the way back to the very dawn of digital gaming — to the original Spacewar!, the canonical first full-fledged videogame ever, developed on a DEC PDP-1 at the Massachusetts Institute of Technology circa 1962. In Star Control as in Spacewar!, two players — ideally, two humans, but potentially one human and one computer player, or even two computer players if the “Cyborg Mode” is turned on — fight it out in an environment that simulates proper Newtonian physics, meaning objects in motion stay in motion until a counter-thrust is applied. Players also have to contend with the gravity wells of the planets around them — these in place of the single star which affects the players’ ships in Spacewar! — as they try to blow one another up. But Star Control adds to this formula a wide variety of ships with markedly differing weaponry, defensive systems, sizes, and maneuvering characteristics. In best rock-paper-scissors fashion, certain units have massive advantages over others and vice versa, meaning that a big part of the challenge is that of maneuvering the right units into battle against the enemy’s. As in real wars, most of the battles are won or lost before the shooting ever begins, being decided by the asymmetries of the forces the players manage to bring to bear against one another. Reiche:

It was important to us that each alien ship was highly differentiated. What it means is, unlike, say, Street Fighter, where your characters are supposedly balanced with one another, our ships weren’t balanced at all, one on one. One could be very weak, and one could be very strong, but the idea was, your fleet of ships, your selection of ships in total, was as strong as someone else’s, and then it came down to which match-up did you find. One game reviewer called it, “Rock, Scissors, Vapor,” which I thought was a great expression.

Of course, even the worst match-ups leave a sliver of hope that a brilliant, valorous performance on the field of battle can yet save the day.
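The melee model described above rests on two simple rules: Newtonian coasting (velocity persists until thrust changes it) and the inverse-square pull of a planet's gravity well. As a rough illustration of how little code those rules require — a hypothetical sketch, not Toys for Bob's actual implementation — a single update tick might look like this:

```python
# Hypothetical sketch of one tick of Newtonian movement with a gravity
# well, as described in the article; names and units are illustrative.
import math

def step(pos, vel, thrust=(0.0, 0.0), planet=(0.0, 0.0), mu=0.0, dt=1.0):
    """Advance a ship one tick. With no thrust and mu=0, the ship
    simply coasts: objects in motion stay in motion."""
    px, py = planet
    dx, dy = px - pos[0], py - pos[1]
    r = math.hypot(dx, dy)
    # Inverse-square acceleration toward the planet (skipped when mu is 0).
    gx, gy = (mu * dx / r**3, mu * dy / r**3) if mu and r > 0 else (0.0, 0.0)
    vx = vel[0] + (thrust[0] + gx) * dt
    vy = vel[1] + (thrust[1] + gy) * dt
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)
```

A coasting ship keeps its velocity from tick to tick, while a ship near a planet is steadily bent toward it — which is exactly the discipline that left two-thirds of Star Control's players blinking at the debris of their own ship.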

You can play Star Control in “Melee” mode as a straight-up free-for-all. Each player gets seven unique ships from the fourteen in the game, from which she gets to choose one for each battle. First player to destroy all of her opponent’s ships wins. But real strategy — that is to say, strategy beyond the logic of rock-paper-scissors match-ups — comes into play only with the full game, which takes the form of a collection of scenarios where each player must deploy her fleet over a galactic map. In the more complex scenarios, controlling more star systems means more resources at one’s disposal, which can be used to build more and better ships at a player’s home starbase; this part of the game draws heavily from the beloved old Atari 8-bit classic Star Raiders. A scenario editor is also included for players who get bored with the nine scenarios that come with the game.

Star Control strains nobly to accommodate many different play styles and preferences. Just as it’s possible to turn on Cyborg Mode in the strategy game and let the computer do the fighting, it’s also possible to turn on “Psytron Mode” and let the computer do the strategy while you concentrate on blowing stuff up.

Star Control in action. The red ship is the infamous Syreen Penetrator.

Yet the aspect of Star Control that most players seem to remember best has nothing to do with any of these efforts to be all things to all players. At some point in the development process, Reiche and Ford realized they needed a context for all this interstellar violence. They came up with an “Alliance of Free Stars” — which included Earthlings among its numbers — fighting a war against the evil “Ur-Quan Hierarchy.” Each group of allies/thralls conveniently consists of seven species, each with their own unique model of spaceship. Not being inclined to take any of this too seriously, Toys for Bob let their whimsy run wild in creating all these aliens, enlisting Greg Johnson — the creator of the similarly winsome and hilarious aliens who inhabit the galaxy of Starflight — to add his input as well. The rogue’s gallery of misfits, reprobates, and genetic oddities that resulted can’t help but make you smile, even if they are more fleshed-out in the manual than on the screen.

Reiche on the origins of the Illwrath, a race of arachnid fundamentalists who “receive spiritual endorsement in the accomplishment of vicious surprise attacks”:

The name “Illwrath” comes from an envelope I saw at the post office, which was being sent to a Ms. McIlwrath in Glasgow, Scotland. I didn’t see the “Mc” at first, and I swear, my first thought was that they must be sending that envelope to an alien. I am sure that somewhere there is a nice little Scottish lady laughing and saying, “Oh, those crazy Americans! Here’s one now calling me an evil, giant, religiously-intolerant space spider — ha, ha, ha, how cute!” Hmm… on second thought, if I am ever found beaten with bagpipes or poisoned with haggis, please contact the authorities.

Around the office, Fred Ford liked to say that the Illwrath had become so darn evil by first becoming too darn righteous, wrapping right around the righteousness scale and yielding results akin to all those old computer games which suddenly started showing negative statistics if you built up your numbers too far. (Personally, I favor this idea greatly, and, indeed, even believe it might serve as an explanation for certain forces in current American politics.)
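Ford's joke alludes to a real failure mode of old games: a stat stored in a signed byte wraps from 127 straight to -128 when incremented once too often. A tiny illustrative sketch (the function name and the "righteousness" stat are mine, not any actual game's):

```python
# Illustration of signed 8-bit wraparound, the bug behind those old
# games whose statistics turned negative when pushed too high.
def as_signed_byte(n):
    """Interpret an integer modulo 256 as a signed 8-bit value."""
    n &= 0xFF
    return n - 256 if n > 127 else n

righteousness = 127                              # maximally righteous...
righteousness = as_signed_byte(righteousness + 1)
print(righteousness)                             # -128: maximally evil
```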

Reiche on the Mmrnmhrm, an “almost interesting robot race” who “fear vowels almost as much as they do a Dreadnought closing in at full bore”:

When I first named the Mmrnmhrm, they actually had a pronounceable name, with vowels and everything. Then, in a sketch for the captain’s window illustration, I forgot to give them a mouth. Later, someone saw the sketch and asked me how they talked, so I clamped my lips shut and said something like, “Mrrk nsss,” thereby instituting a taboo on vowels in anything related to the race. Though the Mmrnmhrm ended up looking more like Daleks than Humans, the name stuck.

Reiche on the Syreen, a group of “humanoid females” who embody — knowingly, one likes to believe — every cliché about troglodyte gamers and the fairer sex, right down to their bulbous breasts that look like they’re filled with sand (their origin story also involves the San Francisco earthquake of 1989):

It was an afternoon late last October in San Francisco when Fred Ford, Greg Johnson, and I sat around a monitor trying to name the latest ship design for our new game. The space vessel on the computer screen looked like a copper-plated cross between Tin Tin’s Destination Moon rocketship and a ribbed condom. Needless to say, we felt compelled to christen this ship carefully, with due consideration for our customers’ sensibilities as well as our artistic integrity. “How about the Syreen Penetrator?” Fred suggested without hesitation. Instantly, the ground did truly rise up and smite us! WHAM-rumble-rumble-WHAM! We were thrown around our office like the bridge crew of the starship Enterprise when under fire by the Klingons. I dimly remember standing in a doorframe, watching the room flex like a cheap cardboard box and shouting, “Maybe that’s not such a great name!” and “Gee, do you think San Francisco’s still standing?” Of course, once the earth stopped moving, we blithely ignored the dire portent, and the Syreen’s ship name, “The Penetrator,” was graven in code.

Since then, we haven’t had a single problem. I mean, everyone has a disk crash two nights before a program is final, right? And hey, accidents happen. Brake pads just don’t last forever! My limp is really not that bad, and Greg is almost speaking normally these days.

Star Control was released in 1990 to cautiously positive reviews and reasonable sales. For all its good humor, it proved a rather polarizing experience. The crazily fast-paced action game at its heart was something that about one-third of players seemed to take to and love, while the rest found it totally baffling, being left blinking and wondering what had just happened as the pieces of their exploded ship drifted off the screen about five seconds after a fight had begun. For these people, Star Control was a hard sell: the strategic game just wasn’t deep enough to stand on its own for long, and, while the aliens described in the manual were certainly entertaining, this was a computer game, not a Douglas Adams book.

Still, the game did sufficiently well that Accolade was willing to fund a sequel. And it was at this juncture that, as I noted at the beginning of this article, Reiche and Ford and their associates went kind of nuts. They threw out the less-than-entrancing strategy part of the first game, kept the action part and all those wonderful aliens, and stuck it all into a grand adventure in interstellar space that owed an awful lot to Starflight — more, one might even say, than it owed to Star Control I.

As in Starflight, you roam the galaxy in Star Control II: The Ur-Quan Masters to avert an apocalyptic threat, collecting precious resources and even more precious clues from the planets you land on, negotiating with the many aliens you meet and sometimes, when negotiations break down, blowing them away. The only substantial aspect of the older game that’s missing from its spiritual successor is the need to manage a bridge crew who come complete with CRPG-style statistics. Otherwise, Star Control II does everything Starflight does and more. The minigame of resource collection on planets’ surfaces, dodging earthquakes and lightning strikes and hostile lifeforms, is back, but now it’s faster paced, with a whole range of upgrades you can add to your landing craft in order to visit more dangerous planets. Ditto space combat, which is now of the arcade style from Star Control I — if, that is, you don’t have Cyborg Mode turned on, which is truly a godsend, the only thing that makes the game playable for many of us. You still need to upgrade your ship as you go along to fight bigger and badder enemies and range faster and farther across space, but now you also can collect a whole fleet of support ships to accompany you on your travels (thus preserving the rock-paper-scissors aspect of Star Control I). I’m not sure that any of these elements could quite carry a game alone, but together they’re dynamite. Much as I hate to employ a tired reviewer’s cliché like “more than the sum of its parts,” this game makes it all but unavoidable.

And yet the single most memorable part of the experience for many or most of us remains all those wonderful aliens, who have been imported from Star Control I and, even better, moved from the pages of the manual into the game proper. Arguably the most indelible of them all, the one group of aliens that absolutely no one ever seems to forget, are the Spathi, a race of “panicked mollusks” who have elevated self-preservation into a religious creed. Like most of their peers, they were present in the first Star Control but really come into their own here, being oddly lovable despite starting the game on the side of the evil Ur-Quan. The Spathi owe more than a little something to the Spemin, Starflight‘s requisite species of cowardly aliens, but are based at least as much, Reiche admits a little sheepishly, on his own aversion to physical danger. Their idea of the perfect life was taken almost verbatim from a conversation about same that Reiche and Ford once had over Chinese food at the office. Here, then, is Reiche and the Spathi’s version of the American Dream:

I knew that someday I would be vastly rich, wealthy enough to afford a large, well-fortified mansion. Surrounding my mansion would be vast tracts of land, through which I could slide at any time I wished! Of course, one can never be too sure that there aren’t monsters hiding just behind the next bush, so I would plant trees to climb at regular, easy-to-reach intervals. And being a Spathi of the world, I would know that some monsters climb trees, though often not well, so I would have my servants place in each tree a basket of perfect stones. Not too heavy, not too light — just the right size for throwing at monsters.

“Running and away and throwing rocks,” explains Reiche, “extrapolated in all ways, has been one of my life strategies.”

The Shofixti, who breed like rabbits. Put the one remaining female in the galaxy together with the one remaining male, wait a couple of years… and poof, you have an army of fuzzy little warmongers on your side. They fight with the same enthusiasm they have for… no, we won’t go there.

My personal favorite aliens, however, are the bird-like Pkunk, a peaceful, benevolent, deeply philosophical race whose ships are nevertheless fueled by the insults they spew at their enemies during battle. They are, of course, merely endeavoring to make sure that their morality doesn’t wrap back around to zero and turn them evil like the Illwrath. “Never be too good,” says Reiche. “Insults, pinching people when they aren’t looking… that’ll keep you safe.”

In light of the aliens Greg Johnson had already created for Starflight, not to mention the similarities between Starflight‘s Spemin and Star Control‘s Spathi, there’s been an occasional tendency to perhaps over-credit his contribution — valuable though it certainly was — to Toys for Bob’s own space epic. Yet one listen to Reiche and Ford in interviews should immediately disabuse anyone of the notion that the brilliantly original and funny aliens in Star Control II are there entirely thanks to Johnson. After listening to Reiche in particular for a few minutes, it really is blindingly obvious that this is the sense of humor behind the Spathi and so many others. Indeed, anyone who has played the game can get a sense of this just from reading some of his quotes in this very article.

There’s a rich vein of story and humor running through even the most practical aspects of Star Control II, as in this report from a planet’s surface. The two complement one another rather than clashing, perhaps because Toys for Bob is clever enough to understand that less is sometimes more. Who are the Liebermann triplets? Who knows? But the line makes you laugh, and that’s the important thing. When a different development team took the reins to make a Star Control III, Reiche’s first piece of advice to them was, “For God’s sake, don’t try to explain everything.” Many a lore-obsessed modern game could afford to take the same advice to heart.

Long after every other aspect of the game has faded from memory, its great good humor, embodied in all those crazy aliens, will remain. It may be about averting a deadly serious intergalactic apocalypse, but, for all that, Star Control II is as warm and fuzzy a space opera as you’ll ever see.

Which isn’t to say that it doesn’t go in for plot. In fact, the sequel’s plot is as elaborate as its predecessor’s was thin; the backstory alone takes up some twenty pages in the manual. The war which was depicted in Star Control I, it turns out, didn’t go so well for the good guys; the sequel begins with you entering our solar system in command of the last combat-worthy craft among a shattered and defeated Alliance of Free Stars. The Ur-Quan soon get wind of your ship’s existence and the last spark of defiance against their rule that it represents, and send a battlefleet toward Earth to snuff it out. And so the race is on to rebuild the Alliance and assemble a fleet of your own before the Ur-Quan arrive. How you do so is entirely up to you. Suffice to say that Earth’s old allies are out there. It’s up to you to find the aliens and convince them to join you in whatever sequence seems best, while finding the resources you need to fuel and upgrade your spaceship and juggling a whole lot of other problems at the same time. This game is as nonlinear as they come.

Star Control II takes itself seriously in the places where it’s important to do so, but never too seriously. Anyone bored with the self-consciously “dark” fictions that so often dominate in our current era of media will find much to appreciate here.

When asked to define what makes a good game, Paul Reiche once said that it “has to have a fun core, which is a one-sentence description of why it’s fun.” Ironically, Star Control II is an abject failure by this standard, pulling in so many directions as to defy any such holistic description. It’s a strategy game of ship and resource management; it’s an action game of ship-versus-ship combat; it’s an adventure game of puzzle-solving and clue-tracking. Few cross-genre games have ever been quite so cross-genre as this one. It really shouldn’t work, but, for the most part anyway, it does. If you’re a person whose ideal game lets you do many completely different things at every session, this might just be your dream game. It really is an experience of enormous richness and variety, truly a game like no other. Small wonder that it’s attracted a cult of players who will happily declare it to be nothing less than the best game ever made.

For my part, I have a few too many reservations to go quite that far. Before I get to them, though, I’d like to let Reiche speak one more time. Close to the time of Star Control II‘s release, he outlined his four guiding principles of game design. Star Control II conforms much better to these metrics than it does to that of the “one-sentence description.”

First, [games should be] fun, with no excuses about how the game simulates the agony and dreariness of the real world (as though this was somehow good for you). Second, they [should] be challenging over a long period of time, preferably with a few ability “plateaus” that let me feel in control for a period of time, then blow me out of the water. Third, they [should] be attractive. I am a sucker for a nice illustration or a funky riff. Finally, I want my games to be conceptually interesting and thought-provoking, so one can discuss the game with an adult and not feel silly.

It’s in the intersection between Reiche’s first and second principles that I have my quibbles with Star Control II. It’s a rather complicated, difficult game by design, which is fair enough as long as it’s complex and difficult in a fun way. Some of its difficulty, however, really doesn’t strike me as being all that much fun at all. Those of you who’ve been reading this blog for a while know that I place enormous weight on fairness and solubility when it comes to the games I review, and don’t tend to cut much slack to those that can only be enjoyed and/or solved with a walkthrough or FAQ to hand. On this front, Star Control II is a bit problematic, due largely to one questionable design choice.

Star Control II, you see, has a deadline. You have about five years before Earth is wiped out by the Ur-Quan (more precisely, by the eviller of the two factions of the Ur-Quan, but we won’t get into that here). Fans will tell you, by no means entirely without justification, that this is an essential part of the game. One of the great attractions of Star Control II is its dynamic universe which just keeps evolving, with or without your intervention: alien spaceships travel around the galaxy just like yours is doing, alien races conquer others and are themselves conquered, etc.

All of this is undoubtedly impressive from a game of any vintage, let alone one as old and technologically limited as this one. And the feeling of inhabiting such a dynamic universe is undoubtedly bracing for anyone used to the more static norm, where things only happen when you push them to happen. Yet it also has its drawbacks, the most unfortunate of which is the crushing sense of futility that comes after putting dozens of hours into the game only to lose it irrevocably. The try-and-try-again approach can work in small, focused games that don’t take long to play and replay, such as the early mysteries of Infocom. In a sprawling epic like this, however… well, does anyone really want to put those dozens of hours in all over again, clicking through page after page of the same text?

Star Control II’s interface felt like something of a throwback even in its own time. By 1992, computer games had almost universally moved to the mouse-driven point-and-click model. Yet this game relies entirely on multiple-choice menus, activated by the cursor keys and/or a joystick. Toys for Bob was clearly designing with possible console ports in mind. (Star Control was ported to the Sega Genesis, but, as it happened, Star Control II would never get the same honor, perhaps because its sales didn’t quite justify the expense and/or because its complexity was judged unsuited to the console market.) Still, for all that it’s a little odd, the interface is well thought-through, and you get used to it quickly.

There’s an undeniable tension between this rich galaxy, full of unusual sights and entertaining aliens to discover, and the need to stay relentlessly on-mission if you hope to win in the end. I submit that the failure to address this tension is, at bottom, a failure of game design. There’s much that could have been done. One solution might have been to tie the evolving galaxy to the player’s progress through the plot rather than the wall clock, a technique pioneered in Infocom’s Ballyhoo back in 1986 and used in countless narrative-oriented games since. It can convey the impression of rising danger and a skin-of-the-teeth victory every time without ever having to send the player back to square one. In the end, the player doesn’t care whether the exhilarating experience she’s just had is the result of a meticulous simulation coincidentally falling into place just so, or of a carefully manipulated sleight of hand. She just remembers the subjective experience.

But if such a step is judged too radical — too counter to the design ethos of the game — other remedies could have been employed. To name the most obvious, the time limit could have been made more generous; Starflight too has a theoretical time limit, but few players ever come close to reaching it. Or the question of time could have been left to the player — seldom a bad strategy in game design — by letting her choose from a generous, moderate, or challenging time limit before starting the game. (This approach was used to good effect by the CRPG The Magic Candle, among plenty of other titles over the years.)

Instead of remedying the situation, however, Reiche and his associates seemed actively determined to make it worse with some of their other choices. To have any hope of finishing the game in time, you need to gain access to a new method of getting around the galaxy, known as “quasi-space,” as quickly as possible. Yet the method of learning about quasi-space is one of the more obscure puzzles in the game, mentioned only in passing by a couple of the aliens you meet, and all too easy to overlook entirely. Without access to quasi-space, Star Control II soon starts to feel like a fundamentally broken, unbalanced game. You trundle around the galaxy in your truck of a spaceship, taking months to reach your destinations and months more to return to Earth, burning up all of the minerals you can mine just to feed your engines. And then your time runs out and you lose, never having figured out what you did wrong. This is not, needless to say, a very friendly way to design a game. Had a few clues early on shouted, “You need to get into quasi-space and you may be able to do so here!” just a little more loudly, I might not have felt the need to write any of the last several paragraphs.

I won’t belabor the point any more, lest the mob of Star Control II zealots I can sense lurking in the background, sharpening their pitchforks, should pounce. I’ll say only that this game is, for all its multifaceted brilliance, also a product of its time — a time when games were often hard in ways that stretched out playing time without being terribly satisfying, when serious discussions about what constituted fair and unfair treatment of the player were only just beginning in some quarters of the industry.

Searching a planet’s surface for minerals, lifeforms, and clues. Anyone who has played Starflight will feel right at home with this part of the game in particular.

Certainly, whatever our opinion of the time limit and the game’s overall fairness, we have to recognize what a labor of love Star Control II was for Paul Reiche, Fred Ford, and everyone who helped bring it to fruition, from Greg Johnson and Robert Leyland to all of the other writers and artists and testers who lent it their talents. Unsurprisingly given its ambition, the project went way beyond the year or so Accolade had budgeted for it. When their publisher put their foot down and said no more money would be forthcoming, Reiche and Ford reached deep into their own pockets to carry it through the final six months.

As the project was being wrapped up, Reiche realized he still had no music, and only about $1500 left for acquiring some. His solution was classic Toys for Bob: he ran an online contest for catchy tunes, with prizes of $25, $50, and $100 — in addition to the opportunity to hear one’s music in (hopefully) a hit game, of course. The so-called “tracker” scene in Europe stepped up with music created on Commodore Amigas, a platform for which the game itself would never be released. “These guys in Europe [had] just built all these ricky-tink programs to play samples out,” says Reiche. “They just kept feeding samples, really amazing soundtracks, out into the net just for kicks. I can’t imagine any of these people were any older than twenty. It makes me feel like I’m part of a bigger place.”

Upon its release on November 30, 1992 — coincidentally, the very same day as Dune II, its companion in mislabeled sequels — Star Control II was greeted with excellent reviews, whose enthusiasm was blunted only by the game’s sheer unclassifiability. Questbusters called it “as funny a parody of science-fiction role-playing as it is a well-designed and fun-to-play RPG,” and named it “Best RPG of the Year” despite it not really being a CRPG at all by most people’s definitions. Computer Gaming World placed it on “this reviewer’s top-ten list of all time” as “one of the most enjoyable games to review all year,” and awarded it “Adventure Game of the Year” alongside Legend Entertainment’s far more traditional adventure Eric the Unready.

Sales too were solid, if not so enormous as Star Control II’s staying power in gamers’ collective memory might suggest. Like Dune II, it was probably hurt by being billed as a sequel to a game likely to appeal most to an entirely different type of player, and by the seeming indifference of its publisher Accolade. In the eyes of Toys for Bob, the developer/publisher relationship was summed up by the sticker the latter started putting on the box after Star Control II had collected its awards: “Best Sports Game of 1992.” Accolade was putting almost all of their energy into sports games during this period, didn’t have stickers handy for anything else, and just couldn’t be bothered to print up some new ones.

Still, the game did well enough that Toys for Bob, after having been acquired by a new CD-ROM specialist of a publisher called Crystal Dynamics, ported it to the 3DO console in 1994. This version added some eight hours of spoken dialog, but cut a considerable amount of content that the voice-acting budget wouldn’t cover. Later, a third Star Control would get made — albeit not by Toys for Bob but by Legend Entertainment, through a series of intellectual-property convolutions we won’t go into in this article.

Toys for Bob themselves have continued to exist right up to the present day, a long run indeed in games-industry terms, albeit without ever managing to return to the Star Control universe. They’re no longer a two-man operation, but do still have Paul Reiche III and Fred Ford in control.

To this day, Star Control II remains as unique an experience as it was in 1992. You’ve never played a game quite like this one, no matter how many other games you’ve played in your time. Don’t even try to categorize it. Just play it, and see what’s possible when a talented design team throws out all the rules. But before you do, let me share just one piece of advice: when an alien mentions something about a strange stellar formation near the Chandrasekhar constellation, pay attention! Trust me, it will save you from a world of pain…

(Sources: Compute!’s Gazette of November 1984; Compute! of January 1992 and January 1993; Computer Gaming World of November 1990, December 1990, March 1993, and August 1993; InterActivity of November/December 1994; Questbusters of January 1993; Electronic Gaming Monthly of May 1991; Sega Visions of June 1992; Retro Gamer 14 and 15. Online sources include Ars Technica’s video interview with Paul Reiche III and Fred Ford; Matt Barton’s interviews with the same pair in Matt Chat 95, 96, and 97; Grognardia’s interview with Reiche; The Escapist’s interview with Reiche; GameSpot’s interview with Reiche.

Star Control I and II are available as a package purchase at GOG.com. Another option for experiencing Star Control II is The Ur-Quan Masters, a loving open-source re-creation based on Toys for Bob’s 3DO source code.)


Posted on December 21, 2018 in Digital Antiquaria, Interactive Fiction

 

Controlling the Spice, Part 3: Westwood’s Dune

Brett Sperry and Louis Castle

Louis Castle first became friends with Brett Sperry in 1982, when the two were barely out of high school. Castle was selling Apple computers at the time at a little store in his native Las Vegas, and Sperry asked him to print out a file for him. “I owned a printer, so I invited him over,” remembers Castle, “and he looked at some animation and programming I was working on.”

They found they had a lot in common. They were both Apple II fanatics, both talented programmers, and both go-getters accustomed to going above and beyond what was expected of them. Through Castle’s contacts at the store — the home-computer industry was quite a small place back then — they found work as contract programmers, porters who moved software from one platform to another. It wasn’t the most glamorous job in the industry, but, at a time when the PC marketplace was fragmented into close to a dozen incompatible platforms, it was certainly a vital one. Sperry and Castle eventually came to specialize in the non-trivial feat of moving slick action games such as Dragonfire and Impossible Mission from the Commodore 64 to the far less audiovisually capable Apple II without sacrificing all of their original appeal.

In March of 1985, they decided to give up working as independent contractors and form a real company, which they named Westwood Associates. The “Westwood” came from the trendy neighborhood of Los Angeles, around the UCLA campus, where they liked to hang out when they drove down from Las Vegas of a weekend. “We chose Westwood as the company name,” says Castle, “to capture some of the feeling of youthful energy and Hollywood business.” The “Associates,” meanwhile, was nicely non-specific, meaning they could easily pivot into other kinds of software development if the games work should dry up for some reason. (The company would become known as Westwood Studios in 1992, by which time it would be pretty clear that no such pivot would be necessary.)

The story of Westwood’s very first project is something of a harbinger of their future. Epyx hired them to port the hoary old classic Temple of Apshai to the sexy new Apple Macintosh, and Sperry and Castle got a bit carried away. They converted the game from a cerebral turn-based CRPG to a frenetic real-time action-adventure, only to be greeted with howls of protest from their employers. “Epyx felt,” remembers Castle with no small sense of irony, “that gamers would not want to make complicated tactical and strategic decisions under pressure.” More sensibly, Epyx noted that Westwood had delivered not so much a port as a different game entirely, one they couldn’t possibly sell as representing the same experience as the original. So, they had to begrudgingly switch it back to turn-based.

This blind alley really does have much to tell us about Westwood’s personality. Asked many years later what common thread binds together their dizzyingly eclectic catalog of games, Louis Castle hit upon real-time gameplay as the one reasonable answer. This love of immediacy would translate, as we’ll soon see, into the invention of a whole new genre known as real-time strategy, which would become one of the most popular of them all by the end of the 1990s.

But first, there were more games to be ported. Having cut their teeth making Commodore 64 games work within the constraints of the Apple II, they now found themselves moving them in the other direction: “up-porting” Commodore 64 hits like Super Cycle and California Games to the Atari ST and Commodore Amiga. Up-porting was in its way as difficult as down-porting; owners of those more expensive 16-bit machines expected their capabilities to be used to good effect, even by games that had originated on more humble platforms, and complained loudly at straight, vanilla ports that still looked like they were running on an 8-bit computer. Westwood became one of the best in the industry at a very tricky task, not so much porting their source games in any conventional sense as remaking them, with dramatically enhanced graphics and sound. They acquired a reputation for technical excellence, particularly when it came to their compression systems, which allowed them to pack their impressive audiovisuals into very little space and stream them in quickly from disk. And they made good use of the fact that the Atari ST and Amiga were both built around the same Motorola 68000 CPU by developing a library for the Amiga which translated calls to the ST’s operating system into their Amiga equivalents on the fly; thus they could program a game for the ST and get the same code running on the Amiga with very few changes. If you wanted an 8-to-16-bit port done efficiently and well, you knew you could count on Westwood.

Although they worked with quite a number of publishers, Westwood cultivated a particularly close relationship with SSI, a publisher of hardcore wargames who badly needed whatever pizazz Sperry and Castle’s flashier aesthetic could provide. When SSI wanted to convince TSR to give them the hugely coveted Dungeons & Dragons license in 1987, they hired Westwood to create some of the graphics demos for their presentation. The pitch worked; staid little SSI shocked the industry by snatching the license right out from under the noses of heavier hitters like Electronic Arts. Westwood remained SSI’s most trusted partner thereafter. They ported the “Gold Box” line of Dungeons & Dragons CRPGs to the Atari ST and Amiga with their usual flair, adding mouse support and improving the graphics, resulting in what many fans consider to be the best versions of all.

Unfortunately, Westwood’s technical excellence wasn’t always paired with equally good design sense when they occasionally got a chance to make an original game of their own. Early efforts like Mars Saga, Mines of Titan, Questron II, and BattleTech: The Crescent Hawk’s Inception all have a lot of ideas that aren’t fully worked through and never quite gel, along with third acts that fairly reek of, “We’re out of time and money, and now we just have to get ‘er done.” Ditto the first two original games they did for SSI under the Dungeons & Dragons license: the odd California Games/Gold Box mashup Hillsfar and the even odder dragon flight simulator Dragon Strike.

Still, Brett Sperry and Louis Castle were two very ambitious young men, and neither was willing to settle for the anonymous life of a strict porting house. Nor did such a life make good business sense: with the North American market at least slowly coalescing around MS-DOS machines, it looked like porting houses might soon have no reason to exist. The big chance came when Sperry and Castle convinced SSI to let them make a full-fledged Dungeons & Dragons CRPG of their own — albeit one that would be very different from the slow-paced, turn-based Gold Box line. Westwood’s take on the concept would run in — you guessed it — real time, borrowing much from FTL’s Dungeon Master, one of the biggest sensations of the late 1980s on the Atari ST and Amiga. The result was Eye of the Beholder.

At the time of the game’s release in February of 1991, FTL had yet to publish an MS-DOS port of Dungeon Master. Eye of the Beholder was thus the first real-time dungeon crawl worth its salt to become available on North America’s computer-gaming platform of choice, and this fact, combined with the Dungeons & Dragons logo on the box, yielded sales of 130,000 copies in the United States alone — a sales figure far greater than that of any previous original Westwood game, greater even than all but the first two of SSI’s flagship Gold Box line. The era of Westwood as primarily a porting house had passed.


Over at Virgin Games, the indefatigable Martin Alper, still looking to make a splash in the American market, liked what he saw in Westwood, this hot American developer who clearly knew how to make the sorts of games Americans wanted to buy. And yet they were also long-established experts at getting the most out of the Amiga, Europe’s biggest gaming computer; Westwood would do their own port of Eye of the Beholder to the Amiga, in which form it would sell in considerable numbers in Europe as well. Such a skill set made the little Las Vegas studio immensely attractive to this executive of Virgin, a company of truly global reach and vision.

Alper knew as soon as he saw Eye of the Beholder that he wanted to make Westwood a permanent part of the Virgin empire, but, not wanting to spook his target, he approached them initially only to ask them to develop a game for him. As far as Alper or anyone else outside Virgin’s French subsidiary knew at this point, the Cryo Dune game was dead. But Alper hadn’t gone to all the trouble of securing the license not to use it. In April of 1991 — just one month before the departure of Jean-Martial Lefranc from Virgin Loisirs, combined with a routine audit, would bring the French Dune conspiracy to light — Alper signed Westwood to make a Dune game of their own. It wasn’t hard to convince them to take it on; it turned out that Dune was Brett Sperry’s favorite novel of all time.

Even better, Westwood, perhaps influenced by their association with the turn-based wargame mavens at SSI, had already been playing around with ideas for a real-time (of course!) game of military conflict. “It was an intellectual puzzle for me,” says Sperry. “How can we take this really small wargame category, bring in some fresh ideas, and make it a fun game that more gamers can play?” The theme was originally to be fantasy. But, says Louis Castle, “when Virgin offered up the Dune license, that sealed our fate and pulled us away from a fantasy theme.”

Several months later, after Martin Alper reluctantly concluded that Cryo’s Dune had already cost too much money and had too much potential of its own to cancel, he found himself with quite a situation on his hands. Westwood’s Dune hadn’t been in development anywhere near as long as Cryo’s, but he was already loving what he had seen of it, and was equally unwilling to cancel that project. In an industry where the average game frankly wasn’t very good at all, having two potentially great ones might not seem like much of a problem. For Virgin’s marketers, however, it was a nightmare. Their solution, which pleased neither Cryo nor Westwood much at all, was to bill the latter’s game as a sequel to the former’s, naming it Dune II: The Building of a Dynasty.

Westwood especially had good reason to feel disgruntled. They were understandably concerned that saddling their fresh, innovative new game with the label of sequel would cause it to be overlooked. The fact was, the sequel billing made no sense whatsoever, no matter how you looked at it. While both games were, in whole or in part, strategy games that ran in real time, their personalities were otherwise about as different as it was possible for two games to be. By no means could one imagine a fan of Cryo’s plot-heavy, literary take on Dune automatically embracing Westwood’s action-heavy, militaristic effort. Nor did the one game follow on from the other in the sense of plot chronology; both games depict the very same events from the novel, albeit with radically different sensibilities.

The press too was shocked to learn that a sequel to Cryo’s Dune was due to be released the very same year as its predecessor. “This has got to be a new world record for the fastest ever followup,” wrote the British gaming magazine The One a few weeks after the first Dune’s release. “Unlike the more adventure-based original, Dune II is expected to be more of a managerial experience comparable to (if anything) the likes of SimCity, as the two warring houses of Atreides and Harkonnen attempt to mine as much spice as possible and blow each other up at the same time.”

The Westwood Studios team who made Dune II. On the front row are Ren Olsen and Dwight Okahara; on the middle row are Judith Peterson, Joe Bostic, Donna Bundy, and Aaron Powell; on the back row are Lisa Ballan and Scott Bowen. Of this group, Bostic and Powell were the game’s official designers, and thus probably deserve the most credit for inventing the genre of real-time strategy. Westwood’s co-founder Brett Sperry also played a critical — perhaps the critical — conceptual role.

It was, on the whole, about as good a description of Dune II as any that appeared in print at the time. Not only was the new game dramatically different from its predecessor, but it wasn’t quite like anything at all which anyone had ever seen before, and coming to grips with it wasn’t easy. Legend has it that Brett Sperry started describing Dune II in shorthand as “real-time strategy” very early on, thus providing a new genre with its name. If so, though, Virgin’s marketers didn’t get the memo. They would struggle mightily to describe the game, and what they ended up with took unwieldiness to new heights: a “strategy-based resource-management simulation with a heavy real-time combat element.” Whew! “Real-time strategy” does have a better ring to it, doesn’t it?

These issues of early taxonomy, if you will, are made intensely interesting by Dune II’s acknowledged status as the real-time-strategy urtext. That is to say that gaming histories generally claim, correctly on the whole in my opinion, that it was the first real-time strategy game ever.

Yet we do need to be careful with our semantics here. There were actually hundreds of computerized strategy games prior to Dune II which happened to be played in real time, not least among them Cryo’s Dune. The neologism of “real-time strategy” (“RTS”) — like, say, those of “interactive fiction” or even “CRPG” — has a specific meaning separate from the meanings of the individual words which comprise it. It has come to denote a very specific type of game — a game that, yes, runs in real time, but also one where players start with a largely blank slate, gather resources, and use them to build a variety of structures. These structures can in turn build military units who can carry out simple orders of the “attack there” or “defend this” stripe autonomously. The whole game plays on an accelerated time scale which yields bursts if not sustained plateaus of activity as frantic as any action game. This combination of qualities is what Westwood invented, not the abstract notion of a strategy game played in real time rather than turns.

Of course, all inventions stand on the shoulders of those that came before, and RTS is no exception. It can be challenging to trace the bits and pieces which would gel together to become Dune II only because there are so darn many of them.

Utopia

The earliest strategy game to replace turns with real time may have been Utopia, an abstract two-player game of global conquest designed and programmed by Don Daglow for the Intellivision console in 1982. The same year, Dan Bunten’s [1] science-fiction-themed Cytron Masters and Chris Crawford’s Roman-themed Legionnaire became the first computer-based strategy games to discard the comfortable round of turns for something more stressful and exciting. Two years later, Brøderbund’s very successful Ancient Art of War exposed the approach to more players than ever before.

[1] Dan Bunten died in 1998 as the woman Danielle Bunten Berry. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.

In 1989, journalists started talking about a new category of “god game” in the wake of Will Wright’s SimCity and Peter Molyneux’s Populous. The name derived from the way that these games cast you as a god able to control your people only indirectly, by altering their city’s infrastructure in SimCity or manipulating the terrain around them in Populous. This control was accomplished in real time. While, as we’ve seen, this in itself was hardly a new development, the other innovations of these landmark games were as important to the eventual RTS genre as real time itself. No player can possibly micromanage an army of dozens of units in real time — at least not if the clock is set to run at anything more than a snail’s pace. For the RTS genre as we’ve come to know it to function, units must have a degree of autonomous artificial intelligence, must be able to carry out fairly abstract orders and react to events on the ground in the course of doing so. SimCity and Populous demonstrated for the first time how this could work.

By 1990, then, god games had arrived at a place that already bore many similarities to the RTS games of today. The main things still lacking were resource collecting and building. And even these things had to some extent already been done in non-god games: a 1987 British obscurity called Nether Earth demanded that you build robots in your factory before sending them out against your enemy, although there was no way of building new structures beyond your starting factory. Indeed, even the multiplayer death matches that would come to dominate so much of the RTS genre a generation later had already been pioneered before 1990, perhaps most notably in Dan Bunten’s 1988 game Modem Wars.

Herzog Zwei

But the game most often cited as an example of a true RTS in form and spirit prior to Dune II, if such a thing is claimed to exist at all, is one called Herzog Zwei, created by the Japanese developer Technosoft and first published for the Sega Genesis console in Japan in 1989. And yet Herzog Zwei‘s status as an alternative RTS urtext is, at the very least, debatable.

Players each start the game with a single main base, and an additional nine initially neutral “outposts” are scattered over the map. Players “purchase” units in the form of Transformers-like flying robots, which they then use to try to conquer outposts; controlling more of them yields more revenue, meaning one can buy more units more quickly. Units aren’t completely out of the player’s direct control, as in the case of SimCity and Populous, but are ordered about in a rather general way: stand and fight here, patrol this radius, retreat to this position or outpost. The details are then left to the unit-level artificial intelligence. For this reason alone, perhaps, Herzog Zwei subjectively feels more like an RTS than any game before it. But on the other hand, much that would come to mark the genre is still missing: resource collection is still abstracted away entirely, while there’s only one type of unit available to build, and no structures. In my opinion, Herzog Zwei is best seen as another of the RTS genre’s building blocks rather than an urtext.

The question of whether and to what extent Herzog Zwei influenced Dune II is a difficult one to answer with complete assurance. Brett Sperry and Louis Castle have claimed not to even have been aware of the Japanese game’s existence prior to making theirs. In fact, out of all of the widely acknowledged proto-RTS games I’ve just mentioned, they cite only Populous as a major influence. Their other three stated inspirations make for a rather counter-intuitive trio on the face of it: the 1984 Apple II game Rescue Raiders, a sort of Choplifter mated to a strategic wargame; the 1989 NEC TurboGrafx-16 game Military Madness, an abstract turn-based strategy game; and, later in the development process, Sid Meier’s 1991 masterpiece Civilization (in particular, the tech tree therein).

Muddying these waters, however, is an anecdote from Stephen Clarke-Willson, an executive in Virgin’s American offices during the early 1990s. He says that “everyone at the office was playing Herzog Zwei” circa April of 1991: “I was given the task of figuring out what to do with the Dune license since I’d read the book a number of times. I thought from a gaming point of view the real stress was the battle to control the spice, and that a resource-strategy game would be good.” Clarke-Willson further claims that from the outset “Westwood agreed to make a resource-strategy game based on Dune, and agreed to look at Herzog Zwei for design ideas.” Sperry and Castle, by contrast, describe a far more open-ended agreement that called for them simply to make something interesting out of the license, allowing the specifics of their eventual Dune to arise organically from the work they had already started on their fantasy-themed real-time wargame.

For what it’s worth, neither Sperry nor Castle has a reputation for dishonesty. Quite the opposite, in fact: Westwood throughout its life stood out as a bastion of responsibility and stability in an industry not much known for either. So, whatever the true facts may be, we’re better off ascribing these contradictory testimonies to the vagaries of memory than to disingenuousness. Certainly, regardless of the exact influences that went into it, Dune II has an excellent claim to the title of first RTS in the modern neologism’s sense. This really was the place where everything came together and a new genre was born.

In the novel of Dune, the spice is the key to everything. In the Westwood game, even in the absence of almost everything else that makes the novel memorable, the same thing is true. The spice was, notes Louis Castle, “very adaptable to this harvest, grow, build for war, attack gambit. That’s really how [Dune II] came about.” Thus was set up the gameplay loop that still defines the RTS genre to this day — all stemming from a novel published in 1965.

The overarching structure of Dune II is also far more typical of the games of today than those of its peers in the early 1990s. You play a “campaign” consisting of nine scenarios, linked by snippets of narrative, that grow progressively more difficult. There are three of these campaigns to choose from, depicting the war for Arrakis from the standpoint of House Atreides, House Harkonnen, and House Ordos — the last being a cartel of smugglers who don’t appear in the novel at all, having been invented for a non-canonical 1984 source book known as The Dune Encyclopedia. In addition to a different narrative, each faction has a slightly different slate of structures and units at its command.

There’s the suggestion of a more high-level strategic layer joining the scenarios together: between scenarios, the game lets you choose your next target for attack by clicking on a territory on a Risk-like map of the planet. Nothing you do here can change the fixed sequence of scenario goals and opposing enemy forces the game presents, but it does change the terrain on which the subsequent scenario takes place, thus adding a bit more replayability for the true completionists.

You begin a scenario with a single construction yard, a handful of pre-built units, and a sharply limited initial store of spice, that precious resource from which everything else stems. Fog of war is implemented; in the beginning, you can see only the territory that immediately surrounds your starting encampment. You’ll thus want to send out scouts immediately, to find deposits of spice ripe for harvesting and to learn where the enemy is.

While your scouts go about their business, you’ll want to get an economy of sorts rolling back at home. The construction yard with which you begin can build any structure available in a given scenario, although it’s advisable to first build a “concrete slab” to serve as its foundation atop the shifting sands of Arrakis. The first real structure you’re likely to build is a “wind trap” to provide power to those that follow. Then you’ll want a “spice refinery,” which comes complete with a unit known as a “harvester,” able to collect spice from the surrounding territory and return it to the refinery to become the stuff of subsequent building efforts. Next you’ll probably want an “outpost,” which not only lets you see much farther into the territory around your base without having to deploy units there but is a prerequisite for building any new units at all. After your outpost is in place, building each type of unit requires its own kind of structure, from a “barracks” for light infantry (read: cannon fodder) to a “high tech factory” for the ultimate weapon of airpower. Naturally, more powerful units are more expensive, both in terms of the spice required to build the structures that produce them and that required to build the units themselves afterward.

Your real goal, of course, is to attack and overwhelm the enemy — or, in some later scenarios, enemies — before he or they have the chance to do the same to you. There’s a balancing act here that one could describe as the central dilemma of the game. Just how long do you concentrate on building up your infrastructure and military before you throw your units into battle? Wait too long and the enemy could get overwhelmingly powerful before you cut him down to size; attack too soon and you could be defeated and left exposed to counterattack, having squandered the units you now need for defense. The amount of spice on the map is another stress point. The spice deposits are finite; once they’re gone, they’re gone, and it’s up to whatever units are left to battle it out. Do you stake your claim to that juicy spice deposit just over the horizon right now? Or do you try to eliminate that nearby enemy base first?

If you’ve played any more recent RTS games at all, all of this will sound thoroughly familiar. And, more so than anything else I could write here, it’s this sense of familiarity, clinging as it does to almost every aspect of Dune II, which crystallizes the game’s influence and importance. The only substantial piece of the RTS puzzle that’s entirely missing here is the multiplayer death match; this game is single-player only, lacking the element that for many is the most appealing of all about the RTS genre. Otherwise, though, the difference between this and more modern RTS games is in the details rather than the fundamentals. This anointed first example of an RTS is a remarkably complete example of the breed. All the pieces are here, and all the pieces fit together as we’ve come to expect them to.

So much for hindsight. As for foresight…

Upon its release in the fall of 1992, Dune II was greeted, like its predecessor from Cryo, with positive reviews, but with none of the fanfare one might expect for a game destined to go down in history as such a revolutionary genre-spawner. Computer Gaming World called it merely “a gratifying experience,” while The One was at least a bit more effusive, with the reviewer pronouncing it “one of the most absorbing games I’ve come across.” Yet everyone regarded it as just another fun game at bottom; no one had an inkling that it would in time birth a veritable new gaming subculture. It sold well enough to justify its development, but — very probably thanks in part to its billing as a sequel to a game with a completely different personality, which had itself only been on the market a few months — it never threatened Eye of the Beholder for the crown of Westwood’s biggest hit to date.

Nor did it prompt an immediate flood of games in the same mold, whether from Westwood or anyone else. The next notable example of the budding genre, Blizzard’s Warcraft, wouldn’t appear until late 1994. That title would be roundly mocked by the gaming intelligentsia for its similarities to Dune II — Computer Gaming World would call it “a perfect bit of creative larceny” — but it would sell much, much better, well and truly setting the flame to the RTS torch. To many Warcraft fans, Westwood would seem like the bandwagon jumpers when they belatedly returned to the genre they had invented with 1995’s Command & Conquer.

By the time that happened, Westwood would be a very different place. Just as they were finishing up Dune II, Louis Castle got a call from Richard Branson himself. “Hello, Louis, this is Richard. I’d like to buy your company.”

“I didn’t know it was for sale,” replied Castle.

“In my experience, everything is for sale!”

And, indeed, notwithstanding their unhappiness about Dune II‘s sequel billing, Brett Sperry and Louis Castle sold out to Virgin, with the understanding that their new parent company would stay out of their hair and let them make the games they wanted to make, holding them accountable only on the basis of the sales they generated. Unlike so many merger-and-acquisition horror stories, Westwood would have a wonderful relationship with Virgin and Martin Alper, who provided the investment they needed to thrive in the emerging new era of CD-ROM-based, multimedia-heavy gaming. We’ll doubtless be meeting Sperry, Castle, and Alper again in future articles.


Looked upon from the perspective of today, the two Dune games of 1992 make for an endlessly intriguing pairing, almost like an experiment in psychology or sociology. Not only did two development teams set out to make a game based on the same subject matter, but they each wound up with a strategy game running in real time. And yet the two games could hardly be more different.

In terms of historical importance, there’s no contest between the two Dunes. While Cryo’s Dune had no discernible impact on the course of gaming writ large, Westwood’s is one of the most influential games of the 1990s. A direct line can be traced from it to games played by tens if not hundreds of millions of people all over the world today. “He who controls the spice, controls the universe,” ran the blurb on the front cover of millions of Dune paperbacks and movie posters. Replace “spice” with the resource of any given game’s choice, and the same could be stated as the guiding tenet of the gaming genre Dune birthed.

And yet I’m going to make the perhaps-surprising claim that the less-heralded first Dune is the more enjoyable of the two to play today. Its fusion of narrative and strategy still feels bracing and unique. I’ve never seen another game which plays quite like this one, and I’ve never seen another ludic adaptation that does a better job of capturing the essential themes and moods of its inspiration.

Dune II, by contrast, can hardly be judged under that criterion at all, given that it’s just not much interested in capturing any of the subtleties of Herbert’s novel; it’s content to stop at “he who controls the spice controls the universe.” Judged on its own terms, meanwhile, strictly as a game rather than an adaptation, it’s become the ironic victim of its own immense influence. I noted earlier that all of the pieces of the RTS genre, with the exception only of the multiplayer death match, came together here for the first time, that later games would be left to worry only about the details. Yet it should also be understood that those details are important. The ability to give orders to groups of units; the ability to give more complex orders to units; ways to get around the map more quickly and easily; higher-resolution screens able to show more of the map at one time; a bigger variety of unit types, with greater variance between opposing factions; more varied and interesting scenarios and terrains; user-selectable difficulty levels (Dune II often seems to be stuck on “Brutal”)… later games would do all of this, and so much more besides. Again, these things do matter. Playing Dune II today is like playing your favorite RTS game stripped down to its most basic foundation. For a historian or a student of game design, that’s kind of fascinating. For someone who just wants to play a fun game, it’s harder to justify.

Still, none of this should detract from the creativity and sheer technical chops that went into realizing Dune II in its own time. Most gaming genres require some iteration to work out the kinks and hone the experience. The RTS genre in particular has been so honed by such a plethora of titles, all working within such a sharply demarcated set of genre markers, that Dune II is bound to seem like a blunt instrument indeed when we revisit it today.

So, there you have it: two disparate Dune games, both inspired and worthy, but in dramatically different ways. Dune as evocative storytelling experience or Dune as straightforward interactive ultra-violence? Take your pick. The choice seems appropriate for a novel that’s been pulled back and forth along much the same axis ever since its first publication in 1965. Does it have a claim to the mantle of High Literature or is it “just” an example of a well-crafted genre novel? Take your pick. The same tension shows itself in the troubled history of Dune as movie, in the way it could attract both filmmakers who pursued — or at least believed themselves to be pursuing — a higher artistic calling, like Alejandro Jodorowsky, and purveyors of the massiest of mass-market entertainments, like Arthur P. Jacobs. Dune as art film or Dune as blockbuster? Take your pick — but please, choose one or the other. Dino and Raffaella De Laurentiis, the first people to get an actual Dune film made, tried to split the difference, making it through a mainstream Hollywood studio with a blockbuster-sized budget, but putting all those resources in the hands of a director of art films. As we’ve seen, the result of that collision of sensibilities was unsatisfying to patrons of multiplexes and art-house theaters alike.

In that light, perhaps it really was for the best that Virgin wound up accidentally releasing two Dune games. Cryo’s Dune locked down the artsier side of Dune‘s split media personality, while Westwood’s was just good fun, satisfying the timeless urge of gamers to blow stuff up in entertaining ways. Thanks to a colossal bureaucratic cock-up at Virgin, there is, one might say, a Dune game for every Dune reader. Which one really is “better” is an impossible question to answer in the end. I’ve stated my opinion, but I have no doubt that plenty of you readers could make an equally compelling case in the other direction. So, vive la différence! With all due apologies to Frank Herbert, variety is the real spice of life.

(Sources: Computer Gaming World of April 1993, August 1993, and January 1995; Game Developer of June 2001; The One of October 1992, January 1993, and July 1993; Retro Gamer 90; Westwood Studios’s customer newsletter dated Fall 1992. Online sources include Louis Castle’s interview for Soren Johnson’s Designer Notes podcast, “Retro Throwback: Dune 2” by Cole Machin on CGM, “Build, gather, brawl, repeat: The history of real-time strategy games” by Richard Moss on Ars Technica, “A New Dawn: Westwood Studios 15th Anniversary” by Geoff Keighley with Amer Ajami on GameSpot, and “The Origin of Realtime Strategy Games on the PC” by Stephen Clarke-Willson on his blog Random Blts.

Feel free to download Dune II from right here, packaged so as to make it as easy as possible to get running using your chosen platform’s version of DOSBox.)

Footnotes
1 Dan Bunten died in 1998 as the woman Danielle Bunten Berry. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.
