Lemmings 2: The Tribes

When the lads at DMA Design started making the original Lemmings, they envisioned that it would allow you to bestow about twenty different “skills” upon your charges. But as they continued working on the game, they threw more and more of the skills out, both to make the programming task simpler and to make the final product more playable. They finally ended up with just eight skills, the perfect number to neatly line up as buttons along the bottom of the screen. In the process of this ruthless culling, Lemmings became a classic study in doing more with less in game design: those eight skills, combined in all sorts of unexpected ways, were enough to take the player through 120 ever-more-challenging levels in the first Lemmings, then 100 more in the admittedly less satisfying pseudo-sequel/expansion pack Oh No! More Lemmings.

Yet when the time came to make the first full-fledged sequel, DMA resurrected some of their discarded skills. And then they added many, many more of them: Lemmings 2: The Tribes wound up with no less than 52 skills in all. For this reason not least, it’s often given short shrift by critics, who compare its baggy maximalism unfavorably with the first game’s elegant minimalism. To my mind, though, Lemmings 2 is almost a Platonic ideal of a sequel, building upon the genius of the original game in a way that’s truly challenging and gratifying to veterans. Granted, it isn’t the place you should start; by all means, begin with the classic original. When you’ve made it through those 120 levels, however, you’ll find 120 more here that are just as perplexing, frustrating, and delightful — and with even more variety to boot, courtesy of all those new skills.



The DMA Design that made Lemmings 2 was a changed entity in some ways. The company had grown in the wake of the first game’s enormous worldwide success, such that they had been forced to move out of their cozy digs above a baby store in the modest downtown of Dundee, Scotland, and into a more anonymous office in a business park on the outskirts of town. The core group that had created the first Lemmings — designer, programmer, and DMA founder David Jones; artists and level designers Mike Dailly and Gary Timmons; programmer and level designer Russell Kay — all remained on the job, but they were now joined by an additional troupe of talented newcomers.

Lemmings 2 also reflects changing times inside the games industry in ways that go beyond the size of its development team. Instead of 120 unrelated levels, there’s now a modicum of story holding things together. A lengthy introductory movie — which, in another telling sign of the times, fills more disk space than the game itself and required almost as many people to make — tells how the lemmings were separated into twelve tribes, all isolated from one another, at some point in the distant past. Now, the island (continent?) on which they live is facing an encroaching Darkness which will end all life there. Your task is to reunite the tribes, by guiding each of them through ten levels to reach the center of the island. Once all of the tribes have gathered there, they can reassemble a magical talisman, of which each tribe conveniently has one piece, and use it to summon a flying ark that will whisk them all to safety.

It’s not exactly an air-tight plot, but no matter; you’ll forget about it anyway as soon as the actual game begins. What’s really important are the other advantages of having twelve discrete progressions of ten levels instead of a single linear progression of 120. You can, you see, jump around among all these tribes at will. As David Jones said at the time of the game’s release, “We want to get away from ‘you complete a level or you don’t.'” When you get frustrated banging your head against a single stubborn level — and, this being a Lemmings game, you will get frustrated — you can just go work on another one for a while.

Rather than relying largely on the same set of graphics over the course of its levels, as the original does, each tribe in Lemmings 2 has its own audiovisual theme: there are beach-bum lemmings, Medieval lemmings, spooky lemmings, circus lemmings, alpine lemmings, astronaut lemmings, etc. In a tribute to the place where the game was born, there are even Scottish Highland lemmings (although Dundee is actually found in the less culturally distinctive — or culturally clichéd — Lowlands). And there’s even a “classic” tribe that reuses the original graphics; pulling it up feels a bit like coming home from an around-the-world tour.


Teaching Old Lemmings New Tricks

In this Beach level, a lemming uses the “kayak” skill to cross a body of water.

In this Medieval level, one lemming has become an “attractor”: a minstrel who entrances all the lemmings around him with his music, keeping them from marching onward. Meanwhile one of his colleagues is blazing a trail in front for the rest to eventually follow.

In this Shadow level, the lemming in front has become a “fencer.” This allows him to dig out a path in front of himself at a slight upward angle. (Most of the skills in the game that at first seem bewilderingly esoteric actually do have fairly simple effects.)

In this Circus level, one lemming has become a “rock climber”: a sort of super-powered version of an ordinary climber, who can climb even a canted wall like this one.

In this Polar level, a lemming has become a “roper,” making a handy tightrope up and over the tree blocking the path.

In this Space level, we’ve made a “SuperLem” who flies in the direction of the mouse cursor.


Other pieces of plumbing help to make Lemmings 2 feel like a real, holistic game rather than a mere series of puzzles. The first game, as you may recall, gives you an arbitrary number of lemmings which begin each level and an arbitrary subset of them which must survive it; this latter number thus marks the difference between success and failure. In the sequel, though, each tribe starts its first level with 60 lemmings, who are carried over through all of the levels that follow. Any lemmings lost on one level, in other words, don’t come back in the succeeding ones. It’s possible to limp to the final finish line with just one solitary survivor remaining — and, indeed, you quite probably will do exactly this with a few of the tribes the first time through. But it’s also possible to finish all but a few of the levels without killing any lemmings at all. At the end of each level and then again at the end of each tribe’s collection of levels, you’re awarded a bronze, silver, or gold star based on your performance. To wind up with gold at the end, you usually need to have kept every single one of the little fellows alive through all ten levels. There’s a certain thematic advantage in this: people often note how the hyper-cute original Lemmings is really one of the most violent videogames ever, requiring you to kill thousands and thousands of the cuties over its course. This objection no longer applies to Lemmings 2. But more importantly, it sets up an obsessive-compulsive-perfectionist loop. First you’ll just want to get through the levels — but then all those bronze and silver performances lurking in your past will start to grate, and pretty soon you’ll be trying to figure out how to do each level just that little bit more efficiently. The ultimate Lemmings 2 achievement, needless to say, is to collect gold stars across the board.
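
For the mechanically minded, the bookkeeping involved is simple enough to sketch in a few lines of C. Note that the medal thresholds below are invented purely for illustration; I can't vouch for the game's actual cutoffs beyond the general rule that gold tends to demand a spotless record.

```c
#include <stdio.h>

/* A minimal sketch of Lemmings 2's carry-over bookkeeping: each tribe
 * starts its ten levels with 60 lemmings, and whatever survives one
 * level is all you get for the next.  The medal thresholds here are
 * illustrative assumptions, not values documented by the game. */

enum medal { BRONZE, SILVER, GOLD };

struct tribe {
    const char *name;
    int lemmings;        /* survivors carried from level to level */
    int levels_cleared;
};

static enum medal rate_tribe(const struct tribe *t)
{
    if (t->lemmings == 60) return GOLD;    /* assumed: gold means nobody died */
    if (t->lemmings >= 40) return SILVER;  /* assumed threshold */
    return BRONZE;
}

static void play_level(struct tribe *t, int losses)
{
    t->lemmings -= losses;
    if (t->lemmings < 1)
        t->lemmings = 0;          /* wiped out: the tribe can go no further */
    else
        t->levels_cleared++;
}

int main(void)
{
    struct tribe beach = { "Beach", 60, 0 };
    static const char *names[] = { "bronze", "silver", "gold" };

    play_level(&beach, 0);   /* a perfect level */
    play_level(&beach, 12);  /* a costly one: those 12 never come back */

    printf("%s tribe: %d lemmings left after %d levels, %s pace\n",
           beach.name, beach.lemmings, beach.levels_cleared,
           names[rate_tribe(&beach)]);
    return 0;
}
```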

This tiered approach to success and failure might be seen as evidence of a kinder design sensibility, but in most other respects just the opposite is true; Lemmings 2 has the definite feel of a game for the hardcore. The first Lemmings does a remarkably good job of teaching you how to play it interactively over the course of its first twenty levels or so, introducing you one by one to each of its skills along with its potential uses and limitations. There’s nothing remotely comparable in Lemmings 2; it just throws you in at the deep end. While there is a gradual progression in difficulty within each tribe’s levels, the game as a whole is a lumpier affair, especially in the beginning. Each level gives you access to between one and eight of the 52 available skills, whilst evincing no interest whatsoever in showing you how to use any of them. There is some degree of thematic grouping when it comes to the skills: the Highland lemmings like to toss cabers; the beach lemmings are fond of swimming, kayaking, and surfing; the alpine lemmings often need to ski or skate. Nevertheless, the sheer number of new skills you’re expected to learn on the fly is intimidating even for a veteran of the first game. The closest Lemmings 2 comes to its predecessor’s training levels are a few free-form sandbox environments where you can choose your own palette of skills and have at it. But even here, your education can be a challenging one, coming down as it still does to trial and error.

Your first hours with the game can be particularly intimidating; as soon as you’ve learned how one group of skills works well enough to finish one level, you’re confronted with a whole new palette of them on the next level. Even I, a huge fan of the first game, bounced off the second one quite a few times before I buckled down, started figuring out the skills, and, some time thereafter, started having fun.

Luckily, once you have put in the time to learn how the skills work, Lemmings 2 becomes very fun indeed — every bit as rewarding as the first game, possibly even more so. Certainly its level design is every bit as good — better in fact, relying more on logic and less on dodgy edge cases in the game engine than do the infamously difficult final levels of the first Lemmings. Even the spiky difficulty curve isn’t all bad; it can be oddly soothing to start on a new tribe’s relatively straightforward early levels after being taxed to the utmost on another tribe’s last level. If the first Lemmings is mountain climbing as people imagine it to be — a single relentless, ever-steeper ascent to a dizzying peak — the second Lemmings has more in common with the reality of the sport: a set of more or less difficult stages separated by more or less comfortable base camps. While it’s at least as daunting in the end, it does offer more ebbs and flows along the way.

One might say, then, that Lemmings 2 is designed around a rather literal interpretation of the concept of a sequel. That is to say, it assumes that you’ve played its predecessor before you get to it, and are now ready for its added complexity. That’s bracing for anyone who fulfills that criterion. But in 1993, the year of Lemmings 2’s release, its design philosophy had more negative than positive consequences for its own commercial arc and for that of the franchise to which it belonged.

The fact is that Lemmings 2’s attitude toward its sequel status was out of joint with the way sequels had generally come to function by 1993. In a fast-changing industry that was rapidly attracting new players, the ideal sequel, at least in the eyes of most industry executives, was a game equally welcoming to both neophytes and veterans. Audiovisual standards were changing so rapidly that a game that was just a couple of years old could already look painfully dated. What new player with a shiny new computer wanted to play some ugly old thing just to earn a right to play the latest and greatest?

That said, Lemmings 2 actually didn’t look all that much better than its predecessor either, flashy opening movie aside. Part of this was down to DMA Design still using the 1985-vintage Commodore Amiga, which remained very popular as a gaming computer in Britain and other European countries, as their primary development platform, then porting the game to MS-DOS and various other more modern platforms. Staying loyal to the Amiga meant working within some fairly harsh restrictions, such as that of having no more than 32 colors on the screen at once, not to mention making the whole game compact enough to run entirely off floppy disk; hard drives, much less CD-ROM drives, were still not common among European Amiga owners. Shortly before the release of Lemmings 2, David Jones confessed to being “a little worried” about whether people would be willing to look beyond the unimpressive graphics and appreciate the innovations of the game itself. As it happened, he was right to be worried.

Lemmings and Oh No! More Lemmings sold in the millions across a bewildering range of platforms, from modern mainstream computers like the Apple Macintosh and Wintel machines to antique 8-bit computers like the Commodore 64 and Sinclair Spectrum, from handheld systems like the Nintendo Game Boy and Atari Lynx to living-room game consoles like the Sega Master System and the Nintendo Entertainment System. Lemmings 2, being a much more complex game under the hood as well as on the surface, wasn’t quite so amenable to being ported to just about any gadget with a CPU, even as its more off-putting initial character and its lack of new audiovisual flash did it no favors either. It was still widely ported and still became a solid success by any reasonable standard, mind you, but likely sold in the hundreds of thousands rather than the millions. All indications are that the first game and its semi-expansion pack continued to sell more copies than the second even after the latter’s release.

In the aftermath of this muted reception, the bloom slowly fell off the Lemmings rose, not only for the general public but also for DMA Design themselves. The franchise’s true jump-the-shark moment ironically came as part of an attempt to re-jigger the creatures to become media superstars beyond the realm of games. The Children’s Television Workshop, the creator of Sesame Street among other properties, was interested in moving the franchise onto television screens. In the course of these negotiations, they asked DMA to give the lemmings more differentiated personalities in the next game, to turn them from anonymous marchers, each just a few pixels across, into something more akin to individualized cartoon characters. Soon the next game was being envisioned as the first of a linked series of no less than four of them, each one detailing the further adventures of three of the tribes after their escape from the island at the end of Lemmings 2, each one ripe for trans-media adaptation by the Children’s Television Workshop. But the first game of this new generation, called The Lemmings Chronicles, just didn’t work. The attempt to cartoonify the franchise was cloying and clumsy, and the gameplay fell to pieces; unlike Lemmings 2, Lemmings Chronicles eminently deserves its underwhelming critical reputation. DMA insiders like Mike Dailly have since admitted that it was developed more out of obligation than enthusiasm: “We were all ready to move on.” When it performed even worse than its predecessor, the Children’s Television Workshop dropped out; all of its compromises had been for nothing.

Released just a year after Lemmings 2, Lemmings Chronicles marked the last game in the six-game contract that DMA Design had signed with their publisher Psygnosis what seemed like an eternity ago — in late 1987 to be more specific, when David Jones had first come to Psygnosis with his rather generic outer-space shoot-em-up Menace, giving no sign that he was capable of something as ingenious as Lemmings. Now, having well and truly demonstrated their ingenuity, DMA had little interest in re-upping; they were even willing to leave behind all of their intellectual property, which the contract Jones had signed gave to Psygnosis in perpetuity. In fact, they were more than ready to leave behind the cute-and-cuddly cartoon aesthetic of Lemmings and return to more laddish forms of gaming. The eventual result of that desire would be a second, more long-lasting worldwide phenomenon, known as Grand Theft Auto.

Meanwhile Sony, who had acquired Psygnosis in 1993, continued off and on to test the waters with new iterations of the franchise, but all of those attempts evinced the same vague sense of ennui that had doomed Lemmings Chronicles; none became hits. The last Lemmings game that wasn’t a remake appeared in 2000.

It’s interesting to ask whether DMA Design and Psygnosis could have managed the franchise better, thereby turning it into a permanent rather than a momentary icon of gaming, perhaps even one on a par with the likes of Super Mario and Sonic the Hedgehog; they certainly had the sales to compete head-to-head with those other videogame icons for a few years there in the early 1990s. The obvious objection is that Mario and Sonic were individualized characters, while DMA’s lemmings were little more than a handful of tropes moving in literal lockstep. Still, more has been done with less in the annals of media history. If everyone had approached Lemmings Chronicles with more enthusiasm and a modicum more writing and branding talent, maybe the story would have turned out differently.

Many speculate today that the franchise must inevitably see another revival at some point, what with 21st-century pop culture’s tendency to mine not just the A-list properties of the past, but increasingly its B- and C-listers as well, in the name of one generation’s nostalgia and another’s insatiable appetite for kitsch. Something tells me as well that we haven’t seen the last of Lemmings, but, as of this writing anyway, the revival still hasn’t arrived.

As matters currently stand, then, the brief-lived but frenzied craze for Lemmings has gone down in history, alongside contemporaries like Tetris and The Incredible Machine, as one more precursor of the casual revolution in gaming that was still to come, with its very different demographics and aesthetics. But in addition to that, it gave us two games that are brilliant in their own right, that remain as vexing but oh-so-rewarding as they were in their heyday. Long may they march on.

One other surviving tribute to Dundee’s second most successful gaming franchise is this little monument at the entrance to the city’s Seabraes Park, erected by local artist Alyson Conway in 2013. Lemmings and Grand Theft Auto… not bad for a city of only 150,000 souls.

(Sources: the book Grand Thieves and Tomb Raiders by Magnus Anderson and Rebecca Levene; Compute! of January 1992; Amiga Format of May 1993 and the special 1992 annual; Retro Gamer 39; The One of November 1993; Computer Gaming World of July 1993.

Lemmings 2 has never gotten a digital re-release. I therefore make it available for download here, packaged to be as easy as possible to get running under DOSBox on your modern computer.)

 
 


The 68000 Wars, Part 6: The Unraveling

Commodore International’s roots are in manufacturing, not computing. They’re used to making and selling all kinds of things, from calculators to watches to office furniture. Computers just happen to be a profitable sideline they stumbled into. Commodore International isn’t really a computer company; they’re a company that happens to make computers. They have no grand vision of computing in the future; Commodore International merely wants to make products that sell. Right now, the products they’re set up to sell happen to be computers and video games. Next year, they might be bicycles or fax machines.

The top execs at Commodore International don’t really understand why so many people love the Amiga. You see, to them, it’s as if customers were falling in love with a dishwasher or a brand of paper towels. Why would anybody love a computer? It’s just a chunk of hardware that we sell, fer cryin’ out loud.

— Amiga columnist “The Bandito,” Amazing Computing, January 1994

Commodore has never been known for their ability to arouse public support. They have never been noted for their ability to turn lemons into lemonade. But, they have been known to take a bad situation and create a disaster. At least there is some satisfaction in noting their consistency.

— Don Hicks, managing editor, Amazing Computing, July 1994

In the summer of 1992, loyal users of the Commodore Amiga finally heard a piece of news they had been anxiously awaiting for half a decade. At long last, Commodore was about to release new Amiga models sporting a whole new set of custom chips. The new models would, in other words, finally do more than tinker at the edges of the aging technology which had been created by the employees of little Amiga, Incorporated between 1982 and 1985. It was all way overdue, but, as they say, better late than never.

The story of just how this new chipset managed to arrive so astonishingly late is a classic Commodore tale of managerial incompetence, neglect, and greed, against which the company’s overtaxed, understaffed engineering teams could only hope to fight a rearguard action.



Commodore’s management had not woken up to the need to significantly improve the Amiga until the summer of 1988, after much of its technological lead over its contemporaries running MS-DOS and MacOS had already been squandered. Nevertheless, the engineers began with high hopes for what they called the “Advanced Amiga Architecture,” or the “AAA” chipset. It was to push the machine’s maximum screen resolution from 640 X 400 to 1024 X 768, while pushing its palette of available colors from 4096 to 16.8 million. The blitter and other pieces of custom hardware which made the machine so adept at 2D animation would be retained and vastly improved, even as the current implementation of planar graphics would be joined by “chunky-graphics” modes which were more suitable for 3D animation. Further, the new chips would, at the user’s discretion, do away with the flicker-prone interlaced video signal that the current Amiga used for vertical resolutions above 200 pixels, which made the machine ideal for desktop-video applications but annoyed the heck out of anyone trying to do anything else with it. And it would now boast eight instead of four separate sound channels, each of which would now offer 16-bit resolution — i.e., the same quality as an audio CD.
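
A quick illustration of why those “chunky-graphics” modes mattered so much for 3D: in a chunky display, all of a pixel’s color bits sit together in a single byte, while in the Amiga’s planar modes they are scattered across separate bitplanes, one bit per plane. The little C sketch below plots one pixel both ways, assuming a hypothetical 256-color screen; the buffer layout and names are mine, chosen purely for illustration, not anything from the AAA design documents.

```c
#include <stdint.h>
#include <stdio.h>

#define WIDTH  320
#define HEIGHT 200
#define DEPTH  8            /* 8 bitplanes = 256 colors */

/* Chunky: one byte per pixel, all of its color bits together. */
static uint8_t chunky[WIDTH * HEIGHT];

/* Planar: DEPTH separate bitplanes, one bit per pixel in each. */
static uint8_t planes[DEPTH][WIDTH / 8 * HEIGHT];

/* Plotting a pixel in a chunky buffer is a single store... */
static void plot_chunky(int x, int y, uint8_t color)
{
    chunky[y * WIDTH + x] = color;
}

/* ...while in a planar buffer the same pixel's color index is scattered
 * across DEPTH bitplanes: one read-modify-write per plane. */
static void plot_planar(int x, int y, uint8_t color)
{
    int byte = y * (WIDTH / 8) + x / 8;
    uint8_t mask = 0x80 >> (x & 7);   /* bitplanes are MSB-first */
    int p;

    for (p = 0; p < DEPTH; p++) {
        if (color & (1 << p))
            planes[p][byte] |= mask;
        else
            planes[p][byte] &= ~mask;
    }
}

int main(void)
{
    plot_chunky(100, 50, 42);
    plot_planar(100, 50, 42);
    printf("chunky byte: %d, plane 1 bit set: %d\n",
           chunky[50 * WIDTH + 100],
           (planes[1][50 * (WIDTH / 8) + 100 / 8] >> (7 - (100 & 7))) & 1);
    return 0;
}
```

A texture-mapped 3D engine wants to splat whole pixels as quickly as possible, which is why the single store of the chunky case wins so decisively over the eight read-modify-write cycles of the planar one.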

All of this was to be ready to go in a new Amiga model by the end of 1990. Had Commodore been able to meet that timetable and release said model at a reasonable price point, it would have marked almost as dramatic an advance over the current state of the art in multimedia personal computing as had the original Amiga 1000 back in 1985.

Sadly, though, no plan at Commodore could long survive contact with management’s fundamental cluelessness. By this point, the research-and-development budget had been slashed to barely half what it had been when the company’s only products were simple 8-bit computers like the Commodore 64. Often only one engineer at a time was assigned to each of the three core AAA chips, and said engineer was more often than not young and inexperienced, because who else would work 80-hour weeks at the salaries Commodore paid? Throw in a complete lack of day-to-day oversight or management coordination, and you had a recipe for endless wheel-spinning. AAA fell behind schedule, then fell further behind, then fell behind some more.

Some fifteen months after the AAA project had begun, Commodore started a second chip-set project, which they initially called “AA.” The designation was a baseball metaphor rather than an acronym; the “AAA” league in American baseball is the top division short of the major leagues, while the “AA” league is one rung further down. As the name would indicate, then, the AA chipset was envisioned as a more modest evolution of the Amiga’s architecture, an intermediate step between the original chipset and AAA. Like the latter, AA would offer 16.8 million colors — of which 256 could be onscreen at once without restrictions, more with some trade-offs —  but only at a maximum non-interlaced resolution of 640 X 480, or 800 X 600 in interlace mode. Meanwhile the current sound system would be left entirely alone. On paper, even these improvements moved the Amiga some distance beyond the existing Wintel VGA standard — but then again, that world of technology wasn’t standing still either. Much depended on getting the AA chips out quickly.

But “quick” was an adjective which seldom applied to Commodore. First planned for release on roughly the same time scale that had once been envisioned for the AAA chipset, AA too fell badly behind schedule, not least because the tiny engineering team was now forced to split their energies between the two projects. It wasn’t until the fall of 1992 that AA, now renamed the “Advanced Graphics Architecture,” or “AGA,” made its belated appearance. That is to say, the stopgap solution to the Amiga’s encroaching obsolescence arrived fully two years after the comprehensive solution to the problem ought to have shipped. Such was life at Commodore.

Rather than putting the Amiga out in front of the competition, AGA at this late date could only move it into a position of rough parity with the majority of the so-called “Super VGA” graphics cards which had become fairly commonplace in the Wintel world over the preceding year or so. And with graphics technology evolving quickly in the consumer space to meet the demands of CD-ROM and full-motion video, even the Amiga’s parity wouldn’t last for long. The Amiga, the erstwhile pioneer of multimedia computing, was now undeniably playing catch-up against the rest of the industry.

The Amiga 4000

AGA arrived inside two new models which evoked immediate memories of the Amiga 2000 and 500 from 1987, the most successful products in the platform’s history. Like the old Amiga 2000, the new Amiga 4000 was the “professional” machine, shipping in a big case full of expansion slots, with 4 MB of standard memory, a large hard drive, and a 68040 processor running at 25 MHz, all for a street price of around $2600. Like the old Amiga 500, the Amiga 1200 was the “home” model, shipping in an all-in-one-case form factor without room for internal expansion, with 2 MB of standard memory, a floppy drive only, and a 68020 processor running at 14 MHz, all for a price of about $600.

The two models were concrete manifestations of what a geographically bifurcated computing platform the Amiga had become by the early 1990s. In effect, the Amiga 4000 was to be the new face of Amiga computing in North America; ditto the Amiga 1200 in Europe. Commodore would make only scattered, desultory attempts to sell each model outside of its natural market.



Although the Amiga 500 had once enjoyed some measure of success in the United States as a games machine and general-purpose home computer, those days were long gone by 1992. That year, MS-DOS and Windows accounted for 87 percent of all American computer-game sales and the Macintosh for 9 percent, while the Amiga was lumped rudely into the 4 percent labeled simply “other.” Small wonder that very few American games publishers still gave any consideration to the platform at all; what games were still available for the Amiga in North America usually had to be acquired by mail order, often as pricey imports from foreign climes. Then, too, most of the other areas where the Amiga had once been a player, and as often as not a pioneer — computer-based art and animation, 3D modeling, music production, etc. — had also fallen by the wayside, with most of the slack for such artsy endeavors being picked up by the Macintosh.

The story of Eric Graham was depressingly typical of the trend. Back in 1986, Graham had created a stunning ray-traced 3D animation called The Juggler on the Amiga; it became a staple of shop windows, filling at least some of the gap left by Commodore’s inept marketing. User demand had then led him to create Sculpt 3D, one of the first two practical 3D modeling applications for a consumer-class personal computer, and release it through the publisher Byte by Byte in mid-1987. (The other claimant to the status of absolute first of the breed ran on the Amiga as well; it was called Videoscape 3D, and was released virtually simultaneously with Sculpt 3D). But by 1989 the latest Macintosh models had also become powerful enough to support Graham’s software. Therefore Byte by Byte and Graham decided to jump to that platform, which already boasted a much larger user base who tended to be willing to pay higher prices for their software. Orphaned on the Amiga, the Sculpt 3D line continued on the Mac until 1996. Thanks to it and many other products, the Mac took over the lead in the burgeoning field of 3D modeling. And as went 3D modeling, so went a dozen other arenas of digital creativity.

The one place where the Amiga’s toehold did prove unshakeable was desktop video, where its otherwise loathed interlaced graphics modes were loved for the way they let the machine sync up with the analog video-production equipment typical of the time: televisions, VCRs, camcorders, etc. From the very beginning, both professionals and a fair number of dedicated amateurs used Amigas for titling, special effects, color correction, fades and wipes, and other forms of post-production work. Amigas were used by countless television stations to display programming information and do titling overlays, and found their way onto the sets of such television and film productions as Amazing Stories, Max Headroom, and Robocop 2. Even as the Amiga was fading in many other areas, video production on the platform got an enormous boost in December of 1990, when an innovative little Kansan company called NewTek released the Video Toaster, a combination of hardware and software which NewTek advertised, with less hyperbole than you might imagine, as an “all-in-one broadcast studio in a box” — just add one Amiga. Now the Amiga’s production credits got more impressive still: Babylon 5, seaQuest DSV, Quantum Leap, Jurassic Park, sports-arena Jumbotrons all over the country. Amiga models dating from the 1980s would remain fixtures in countless local television stations until well after the millennium, when the transition from analog to digital transmission finally forced their retirement.

Ironically, this whole usage scenario stemmed from what was essentially an accidental artifact of the Amiga’s design; Jay Miner, the original machine’s lead hardware designer, had envisioned its ability to mix and match with other video sources not as a means of inexpensive video post-production but rather as a way of overlaying interactive game graphics onto the output from an attached laser-disc accessory, a technique that was briefly en vogue in videogame arcades. Nonetheless, the capability was truly a godsend, the only thing keeping the platform alive at all in North America.

On the other hand, though, it was hard not to lament a straitening of the platform’s old spirit of expansive, experimental creativity across many fields. As far as the market was concerned, the Amiga was steadily morphing from a general-purpose computer into a piece of niche technology for a vertical market. By the early 1990s, most of the remaining North American Amiga magazines had become all but indistinguishable from any other dryly technical trade journal serving a rigidly specialized readership. In a telling sign of the times, it was almost universally agreed that early sales of the Amiga 4000, which were rather disappointing even by Commodore’s recent standards, were hampered by the fact that it initially didn’t work with the Video Toaster. (An updated “Video Toaster 4000” wouldn’t finally arrive until a year after the Amiga 4000 itself.) Many users now considered the Amiga little more than a necessary piece of plumbing for the Video Toaster. NewTek and others sold turnkey systems that barely even mentioned the name “Commodore Amiga.” Some store owners whispered that they could actually sell a lot more Video Toaster systems that way. After all, Commodore was known to most of their customers only as the company that had made those “toy computers” back in the 1980s; in the world of professional video and film production, that sort of name recognition was worth less than no name recognition at all.

In Europe, meanwhile, the nature of the baggage that came attached to the Commodore name perhaps wasn’t all that different in the broad strokes, but it did carry more positive overtones. In sheer number of units sold, the Amiga had always been vastly more successful in Europe than in North America, and that trend accelerated dramatically in the 1990s. Across the pond, it enjoyed all the mass-market acceptance it lacked in its home country. It was particularly dominant in Britain and West Germany, two of the three biggest economies in Europe. Here, the Amiga was nothing more nor less than a great games machine. The sturdy all-in-one-case design of the Amiga 500 and, now, the Amiga 1200 placed it on a continuum with the likes of the Sinclair Spectrum and the Commodore 64. There was a lot more money to be made selling computers to millions of eager European teenagers than to thousands of sober American professionals, whatever the wildly different price tags of the individual machines. Europe was accounting for as much as 88 percent of Commodore’s annual revenues by the early 1990s.

And yet here as well the picture was less rosy than it might first appear. While Commodore sold almost 2 million Amigas in Europe during 1991 and 1992, sales were trending in the wrong direction by the end of that period. Nintendo and Sega were now moving into Europe with their newest console systems, and Microsoft Windows as well was fast gaining traction. Thus it comes as no surprise that the Amiga 1200, which first shipped in December of 1992, a few months after the Amiga 4000, was greeted with sighs of relief by Commodore’s European subsidiaries, followed by much nervous trepidation. Was the new model enough of an improvement to reverse the trend of declining sales and steal a march on the competition once again? Sega especially had now become a major player in Europe, selling videogame consoles which were both cheaper and easier to operate than an Amiga 1200, lacking as they did any pretense of being full-fledged computers.

If Commodore was facing a murky future on two continents, they could take some consolation in the fact that their old arch-rival Atari was, as usual, even worse off. The Atari Lynx, the handheld game console which Jack Tramiel had bilked Epyx out of for a song, was now the company’s one reasonably reliable source of revenue; it would sell almost 3 million units between 1989 and 1994. The Atari ST, the computer line which had caused such a stir when it had beaten the original Amiga into stores back in 1985 but had been playing second fiddle ever since, offered up its swansong in 1992 in the form of the Falcon, a would-be powerhouse which, as always, lagged just a little bit behind the latest Amiga models’ capabilities. Even in Europe, where Atari, like Commodore, had a much stronger brand than in North America, the Falcon sold hardly at all. The whole ST line would soon be discontinued, leaving it unclear how Atari intended to survive after the Lynx faded into history. Already in 1992, their annual sales fell to $127 million — barely a seventh of Commodore’s.

Still, everything was in flux, and it was an open question whether Commodore could continue to sell Amigas in Europe at the pace to which they had become accustomed. One persistent question dogging the Amiga 1200 was that of compatibility. Although the new chipset was designed to be as compatible as possible with the old, in practice many of the most popular old Amiga games didn’t work right on the new machine. This reality could give pause to any potential upgrader with a substantial library of existing games and no desire to keep two Amigas around the house. If your favorite old games weren’t going to work on the new machine anyway, why not try something completely different, like those much more robust and professional-looking Windows computers the local stores had just started selling, or one of those new-fangled videogame consoles?



The compatibility problems were emblematic of the way that the Amiga, while certainly not an antique like the 8-bit generation of computers, wasn’t entirely modern either. MacOS and Windows isolated their software environment from changing hardware by not allowing applications to have direct access to the bare metal of the computer; everything had to be passed through the operating system, which in turn relied on the drivers provided by hardware manufacturers to ensure that the same program worked the same way on a multitude of configurations. AmigaOS provided the same services, and its technical manuals as well asked applications to function this way — but, crucially, it didn’t require that they do so. European game programmers in particular had a habit of using AmigaOS only as a bootstrap. Doing so was more efficient than doing things the “correct” way; most of the audiovisually striking games which made the Amiga’s reputation would have been simply impossible using “proper” programming techniques. Yet it created a brittle software ecosystem which was ill-suited for the long term. Already before 1992 Amiga gamers had had to contend with software that didn’t work on machines with more than 512 K of memory, or with a hard drive attached, or with or without a certain minor revision of the custom chips, or any number of other vagaries of configuration. With the advent of the AGA chipset, such problems really came home to roost.
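
To make the distinction concrete, here is a minimal sketch, written for a classic Amiga C compiler such as SAS/C or vbcc, of the two ways a program could change a screen color. The “proper” route asks graphics.library to do it; the shortcut pokes the custom chips’ COLOR00 register directly, which keeps working only as long as the hardware underneath behaves exactly like a 1985 Amiga. The OS calls and the register address reflect the real AmigaOS API as I understand it, but treat the program as an illustration rather than production code.

```c
/* A sketch for a classic Amiga C compiler (SAS/C, vbcc +aos68k, etc.). */
#include <exec/types.h>
#include <utility/tagitem.h>
#include <intuition/intuitionbase.h>
#include <intuition/screens.h>
#include <graphics/gfxbase.h>
#include <proto/exec.h>
#include <proto/dos.h>
#include <proto/intuition.h>
#include <proto/graphics.h>

struct IntuitionBase *IntuitionBase;
struct GfxBase *GfxBase;

/* COLOR00, the first color register of the custom chips, sits at the
 * same fixed address on every classic Amiga. */
#define COLOR00 (*(volatile UWORD *)0xDFF180)

int main(void)
{
    struct Screen *screen;

    IntuitionBase = (struct IntuitionBase *)OpenLibrary("intuition.library", 36);
    GfxBase = (struct GfxBase *)OpenLibrary("graphics.library", 36);
    if (!IntuitionBase || !GfxBase)
        return 20;

    screen = OpenScreenTags(NULL, SA_Depth, 4, TAG_DONE);
    if (screen) {
        /* The "correct" way: ask the OS, which owns the display, to set
         * color 0 of this screen's viewport to bright red. */
        SetRGB4(&screen->ViewPort, 0, 15, 0, 0);
        Delay(100);                 /* roughly two seconds */

        /* The shortcut many games took: write the hardware register
         * yourself.  Faster and simpler, but it assumes the chipset,
         * and the OS's ownership of it, never changes. */
        COLOR00 = 0x00F0;           /* 12-bit RGB: green */
        Delay(100);

        CloseScreen(screen);
    }

    CloseLibrary((struct Library *)GfxBase);
    CloseLibrary((struct Library *)IntuitionBase);
    return 0;
}
```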

An American Amiga 1200. In the context of American computing circa 1992, it looked like a bizarrely antediluvian gadget. “Real” computers just weren’t sold in this style of all-in-one case anymore, with peripherals dangling messily off the side. It looked like a chintzy toy to Americans, whatever the capabilities hidden inside. A telling detail: notice the two blank keys on the keyboard, which were stamped with characters only in some continental European markets that needed them. Rather than tool up to produce more than one physical keyboard layout, Commodore just let them sit there in all their pointlessness on the American machines. Can you imagine Apple or IBM, or any other reputable computer maker, doing this? To Americans, and to an increasing number of Europeans as well, the Amiga 1200 just seemed… cheap.

Back in 1985, AmigaOS had been the very first consumer-oriented operating system to boast full-fledged preemptive multitasking, something that neither MacOS nor Microsoft Windows could yet lay claim to even in 1992; they were still forced to rely on cooperative multitasking, which placed them at the mercy of individual applications’ willingness to voluntarily cede time to others. Yet the usefulness of AmigaOS’s multitasking was limited by its lack of memory protection. Thanks to this lack, any individual program on the system could write, intentionally or unintentionally, all over the memory allocated to another; system crashes were a sad fact of life for the Amiga power user. AmigaOS also lacked a virtual-memory system that would have allowed more applications to run than the physical memory could support. In these respects and others — most notably its graphical interface, which still evinced nothing like the usability of Windows, much less the smooth elegance of the Macintosh desktop — AmigaOS lagged behind its rivals.
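
The practical meaning of “no memory protection” is easy to show. On a classic Amiga nothing stands between a program and any address it cares to scribble on; the fragment below compiles and runs without complaint, quietly trashing memory it never allocated. The target address is arbitrary, chosen purely for illustration, and needless to say you shouldn’t actually run this on a machine or emulator state you care about.

```c
/* Illustrative only: classic AmigaOS will not stop a stray write like this. */
#include <exec/types.h>

int main(void)
{
    volatile ULONG *stray = (volatile ULONG *)0x00001000; /* somebody else's RAM */
    int i;

    for (i = 0; i < 256; i++)
        stray[i] = 0xDEADBEEF;  /* no fault, no warning... */

    return 0;                   /* ...just an inexplicable crash later on */
}
```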

It is true that MacOS, dating as it did from roughly the same period in the evolution of the personal computer, was struggling with similar issues: trying to implement some form of multitasking where none at all had existed originally, kludging in support for virtual memory and some form of rudimentary memory protection. The difference was that MacOS was evolving, however imperfectly. While AmigaOS 3.0, which debuted with the AGA machines, did offer some welcome improvements in terms of cosmetics, it did nothing to address the operating system’s core failings. It’s doubtful whether anyone in Commodore’s upper management even knew enough about computers to realize they existed.

This quality of having one foot in each of two different computing eras dogged the platform in yet more ways. The very architectural approach of the Amiga — that of an ultra-efficient machine built around a set of tightly coupled custom chips — had become passé as Wintel and to a large extent even the Macintosh had embraced modular architectures where almost everything could live on swappable cards, letting users mix and match capabilities and upgrade their machines piecemeal rather than all at once. One might even say that it was down almost as much to the Amiga’s architectural philosophy as it was to Commodore’s incompetence that the machine had had such a devil of a time getting itself upgraded.

And yet the problems involved in upgrading the custom chips were as nothing compared to the gravest of all the existential threats facing the Amiga. It was common knowledge in the industry by 1992 that Motorola was winding down further development of the 68000 line, the CPUs at the heart of all Amigas. Indeed, Apple, whose Macintosh used the same CPUs, had seen the writing on the wall as early as the beginning of 1990, and had started projects to port MacOS to other architectures, using software emulation as a way to retain compatibility with legacy applications. In 1991, they settled on the PowerPC, a CPU developed by an unlikely consortium of Apple, IBM, and Motorola, as the future of the Macintosh. The first of the so-called “Power Macs” would debut in March of 1994. The whole transition would come to constitute one of the more remarkable sleights of hand in all computing history; the emulation layer combined with the ported version of MacOS would work so seamlessly that many users would never fully grasp what was really happening at all.

But Commodore, alas, was in no position to follow suit, even had they had the foresight to realize what a ticking time bomb the end of the 68000 line truly was. AmigaOS’s more easygoing attitude toward the software it enabled meant that any transition must be fraught with far more pain for the user, and Commodore had nothing like the resources Apple had to throw at the problem in any case. But of course, the thoroughgoing, eternal incompetence of Commodore’s management prevented them from even seeing the problem, much less doing anything about it. While everyone was obliviously rearranging the deck chairs on the S.S. Amiga, it was barreling down on an iceberg as wide as the horizon. The reality was that the Amiga as a computing platform now had a built-in expiration date. After the 68060, the planned swansong of the 68000 family, was released by Motorola in 1994, the Amiga would literally have nowhere left to go.

As it was, though, Commodore’s financial collapse became the more immediate cause that brought about the end of the Amiga as a vital computing platform. So, we should have a look at what happened to drive Commodore from a profitable enterprise, still flirting with annual revenues of $1 billion, to bankruptcy and dissolution in the span of less than two years.



During the early months of 1993, an initial batch of encouraging reports, stating that the Amiga 1200 had been well-received in Europe, was overshadowed by an indelibly Commodorian tale of making lemons out of lemonade. It turned out that they had announced the Amiga 1200 too early, then shipped it too late and in too small quantities the previous Christmas. Consumers had chosen to forgo the Amiga 500 and 600 — the latter being a recently introduced ultra-low-end model — out of the not-unreasonable belief that the generation of Amiga technology these models represented would soon be hopelessly obsolete. Finding the new model unavailable, they’d bought nothing at all — or, more likely, bought something from a competitor like Sega instead. The result was a disastrous Christmas season: Commodore didn’t yet have the new computers everybody wanted, and couldn’t sell the old computers they did have, which they’d inexplicably manufactured and stockpiled as if they’d had no inkling that the Amiga 1200 was coming. They lost $21.9 million in the quarter that was traditionally their strongest by far.

Scant supplies of the Amiga 1200 continued to devastate overall Amiga sales well after Christmas, leaving one to wonder why on earth Commodore hadn’t tooled up to manufacture sufficient quantities of a machine they had hoped and believed would be a big hit, for once with some justification. In the first quarter of 1993, Commodore lost a whopping $177.6 million on sales of just $120.9 million, thanks to a massive write-down of their inventory of older Amiga models piling up in warehouses. Unit sales of Amigas dropped by 25 percent from the same quarter of the previous year; Amiga revenues dropped by 45 percent, thanks to deep price cuts instituted to try to move all those moribund 500s and 600s. Commodore’s share price plunged to around $2.75, down from $11 a year before, $20 the year before that. Wall Street estimated that the whole company was now worth only $30 million after its liabilities had been subtracted — a drop of one full order of magnitude in the span of a year.

If anything, Wall Street’s valuation was generous. Commodore was now dragging behind them $115.3 million in debt. In light of this, combined with their longstanding reputation for being ethically challenged in all sorts of ways, the credit agencies considered them to be too poor a risk for any new loans. Already they were desperately pursuing “debt restructuring” with their major lenders, promising them, with little to back it up, that the next Christmas season would make everything right as rain again. Such hopes looked even more unfounded in light of the fact that Commodore was now making deep cuts to their engineering and marketing staffs — i.e., the only people who might be able to get them out of this mess. Certainly the AAA chipset, still officially an ongoing project, looked farther away than ever now, two and a half years after it was supposed to have hit the streets.

The Amiga CD32

It was for these reasons that the announcement of the Amiga CD32, the last major product introduction in Commodore’s long and checkered history, came as such a surprise to everyone in mid-1993. CD32 was perhaps the first comprehensively, unadulteratedly smart product Commodore had released since the Amiga 2000 and 500 back in 1987. It was as clever a leveraging of their dwindling assets as anyone could ask for. Rather than another computer, it was a games console, built around a double-speed CD-ROM drive. Commodore had tried something similar before, with the CDTV unit of two and a half years earlier, only to watch it go down in flames. For once, though, they had learned from their mistakes.

CDTV had been a member of an amorphously defined class of CD-based multimedia appliances for the living room — see also the Philips CD-I, the 3DO console, and the Tandy VIS — which had all been or would soon become dismal failures. Upon seeing such gadgets demonstrated, consumers, unimpressed by ponderous encyclopedias that were harder to use and less complete than the ones already on their bookshelves, trip-planning software that was less intuitive than the atlases already in their glove boxes, and grainy film clips that made their VCRs look high-fidelity, all gave vent to the same plaintive question: “But what good is it really?” The only CD-based console which was doing well was, not coincidentally, the only one which could give a clear, unequivocal answer to this question. The Sega Genesis with CD add-on was good for playing videogames, full stop.

Commodore followed Sega’s lead with the CD32. It too looked, acted, and played like a videogame console — the most impressive one on the market, with specifications far outshining the Sega CD. Whereas Commodore had deliberately obscured the CDTV’s technological connection to the Amiga, they trumpeted it with the CD32, capitalizing on the name’s association with superb games. At heart, the CD32 was just a repackaged Amiga 1200, in the same way that the CDTV had been a repackaged Amiga 500. Yet this wasn’t a problem at all. For all that it was a little underwhelming in the world of computers, the AGA chipset was audiovisually superior to anything the console world could offer up, while the 32-bit 68020 that served as the CD32’s brain gave it much more raw horsepower. Meanwhile the fact that at heart it was just an Amiga in a new form factor gave it a huge leg up with publishers and developers; almost any given Amiga game could be ported to the CD32 in a week or two. Throw in a price tag of less than $400 (about $200 less than the going price of a “real” Amiga 1200, if you could find one), and, for the first time in years, Commodore had a thoroughly compelling new product, with a measure of natural appeal to people who weren’t already members of the Amiga cult. Thanks to the walled-garden model of software distribution that was the norm in the console world, Commodore stood to make money not only on every CD32 sold but also from a licensing fee of $3 on every individual game sold for the console. If the CD32 really took off, it could turn into one heck of a cash cow. If only Commodore could have released it six months earlier, or have managed to remain financially solvent for six months longer, it might even have saved them.

As it was, the CD32 made a noble last stand for a company that had long made ignobility its calling card. Released in September of 1993 in Europe, it generated some real excitement, thanks not least to a surprisingly large stable of launch titles, fruit of that ease of porting games from the “real” Amiga models. Commodore sold CD32s as fast as they could make them that Christmas — which was unfortunately nowhere near as fast as they might have liked, thanks to their current financial straits. Nevertheless, in those European stores where CD32s were on-hand to compete with the Sega CD, the former often outsold the latter by a margin of four to one. Over 50,000 CD32s were sold in December alone.

The Atari Jaguar. There was some mockery, perhaps justifiable, of its “Jetsons” design aesthetic.

Ironically, Atari’s last act took much the same form as Commodore’s. In November of 1993, following a horrific third quarter in which they had lost $17.6 million on sales of just $4.4 million, they released a game console of their own, called the Jaguar, in North America. In keeping with the tradition dating back to 1985, it was cheaper than Commodore’s take on the same concept — its street price was under $250 — but not quite as powerful, lacking a CD-ROM drive. Suffering from a poor selection of games, as well as reliability problems and outright hardware bugs, the Jaguar faced an uphill climb; Atari shipped less than 20,000 of them in 1993. Nevertheless, the Tramiel clan confidently predicted that they would sell 500,000 units in 1994, and at least some people bought into the hype, sending Atari’s stock soaring to almost $15 even as Commodore’s continued to plummet.

For the reality was, the rapid unraveling of all other facets of Commodore’s business had rendered the question of the CD32’s success moot. The remaining employees who worked at the sprawling campus in West Chester, Pennsylvania, purchased a decade before when the VIC-20 and 64 were flying off shelves and Jack “Business is War” Tramiel was stomping his rival home-computer makers into dust, felt like dwarfs wandering through the ancient ruins of giants. Once there had been more than 600 employees here; now there were about 50. There was 10,000 square feet of space per employee in a facility where it cost $8000 per day just to keep the lights on. You could wander for hours through the deserted warehouses, shuttered production lines, and empty research labs without seeing another living soul. Commodore was trying to lease some of it out for an attractive rent of $4 per square foot, but, as with most of their computers, nobody seemed all that interested. The executive staff, not wanting the stigma of having gone down with the ship on their resumes, were starting to jump for shore. Commodore’s chief financial officer threw up his hands and quit in the summer of 1993; the company’s president followed in the fall.

Apart from the CD32, for which they lacked the resources to manufacture enough units to meet demand, virtually none of the hardware piled up in Commodore’s European warehouses was selling at all anymore. In the third quarter of 1993, they lost $9.7 million, followed by $8 million in the fourth quarter, on sales of just $70 million. After a second disastrous Christmas in a row, it could only be a question of time.

In a way, it was the small things rather than the eye-popping financial figures which drove the point home. For example, the April 1994 edition of the New York World of Commodore show, for years already a shadow of its old vibrant self, was cancelled entirely due to lack of interest. And the Army and Air Force Exchange, which served as a storefront to American military personnel at bases all over the world, kicked Commodore off its list of suppliers because they weren’t paying their bills. It’s by a thousand little cuts like this one, each representing another sales opportunity lost, that a consumer-electronics company dies. At the Winter Consumer Electronics Show in January of 1994, at which Commodore did manage a tepid presence, their own head of marketing told people straight out that the Amiga had no future as a general-purpose computer; Commodore’s only remaining prospects, he said, lay with the American vertical market of video production and the European mass market of videogame consoles. But they didn’t have the money to continue building the hardware these markets were demanding, and no bank was willing to lend them any.

The proverbial straw which broke the camel’s back was a dodgy third-party patent relating to a commonplace programming technique used to keep a mouse pointer separate from the rest of the screen. Commodore had failed to pay the patent fee for years, the patent holder eventually sued, and in April of 1994 the court levied an injunction preventing Commodore from doing any more business at all in the United States until they paid up. The sum in question was a relatively modest $2.5 million, but Commodore simply didn’t have the money to give.

On April 29, 1994, in a brief, matter-of-fact press release, Commodore announced that they were going out of business: “The company plans to transfer its assets to unidentified trustees for the benefit of its creditors. This is the initial phase of an orderly voluntary liquidation.” And just like that, a company which had dominated consumer computing in the United States and much of Europe for a good part of the previous decade and a half was no more. The business press and the American public showed barely a flicker of interest; most of them had assumed that Commodore was already long out of business. European gamers reacted with shock and panic — few had realized how bad things had gotten for Commodore — but there was nothing to be done.

Thus it was that Atari, despite being chronically ill for a much longer period of time, managed to outlive Commodore in the end. Still, this isn’t to say that their own situation at the time of Commodore’s collapse was a terribly good one. When the reality hit home that the Jaguar probably wasn’t going to be a sustainable gaming platform at all, much less sell 500,000 units in 1994 alone, their stock plunged back down to less than $1 per share. In the aftermath, Atari limped on as little more than a patent troll, surviving by extracting judgments from other videogame makers, most notably Sega, for infringing on dubious intellectual property dating back to the 1970s. This proved to be an ironically more profitable endeavor for them than that of actually selling computers or game consoles. On July 30, 1996, the Tramiels finally cashed out, agreeing to merge the remnants of their company with JT Storage, a maker of hard disks, who saw some lingering value in the trademarks and the patents. It was a liquidation in all but name; only three Atari employees transitioned to the “merged” entity, which continued under the same old name of JT Storage.

And so disappeared the storied name of Atari and that of Tramiel simultaneously from the technology industry. Even as the trade magazines were publishing eulogies to the former, few were sorry to see the latter go, what with their long history of lawsuits, dirty dealing, and abundant bad faith. Jack Tramiel had purchased Atari in 1984 out of the belief that creating another phenomenon like the Commodore 64 — or for that matter the Atari VCS — would be easy. But the twelve years that followed were destined always to remain a footnote to his one extraordinary success, a cautionary tale about the dangers of conflating lucky timing and tactical opportunism with long-term strategic genius.

Even so, the fact does remain that the Commodore 64 brought affordable computing to millions of people all over the world. For that every one of those millions owes Jack Tramiel, who died in 2012, a certain debt of gratitude. Perhaps the kindest thing we can do for him is to end his eulogy there.



The story of the Amiga after the death of Commodore is long, confusing, and largely if not entirely dispiriting; for all these reasons, I’d rather not dwell on it at length here. Its most positive aspect is the surprisingly long commercial half-life the platform enjoyed in Europe, over the course of which game developers still found a receptive if slowly dwindling market ready to buy their wares. The last glossy newsstand magazine devoted to the Amiga, the British Amiga Active, didn’t publish its last issue until the rather astonishingly late date of November of 2001.

The Amiga technology itself first passed into the hands of a German PC maker known as Escom, who actually started manufacturing new Amiga 1200s for a time. In 1996, however, Escom themselves went bankrupt. The American PC maker Gateway 2000 became the last major company to bother with the aging technology when they bought it at the Escom bankruptcy auction. Afterward, though, they apparently had second thoughts; they did nothing whatsoever with it before selling it onward at a loss. From there, it passed into other, even less sure hands, selling always at a discount. There are still various projects bearing the Amiga name today, and I suspect they will continue until the generation who fell in love with the platform in its heyday have all expired. But these are little more than hobbyist endeavors, selling their products in minuscule numbers to customers motivated more by their nostalgic attachment to the Amiga name than by any practical need. It’s far from clear what the idea of an “Amiga computer” should even mean in 2020.

When the hardcore of the Amiga hardcore aren’t dreaming quixotically of the platform’s world-conquering return, they’re picking through the rubble of the past, trying to figure out where it all went wrong. Among a long roll call of petty incompetents in Commodore’s executive suites, two clear super-villains emerge: Irving Gould, Commodore’s chairman since the mid-1960s, and Mehdi Ali, his final hand-picked chief executive. Their mismanagement in the latter days of the company was so egregious that some have put it down to evil genius rather than idiocy. The typical hypothesis says that these two realized at some point in the very early 1990s that Commodore’s days were likely numbered, and that they could get more out of the company for themselves by running it into the ground than they could by trying to keep it alive. I usually have little time for such conspiracy theories; as far as I’m concerned, a good rule of thumb for life in general is never to attribute to evil intent what can just as easily be chalked up to good old human stupidity. In this case, though, there’s some circumstantial evidence lending at least a bit of weight to the theory.

The first and perhaps most telling piece of evidence is the two men’s ridiculously exorbitant salaries, even as their company was collapsing around them. In 1993, Mehdi Ali took home $2 million, making him the fourth highest-paid executive in the entire technology sector. Irving Gould earned $1.75 million that year — seventh on the list. Why were these men paying themselves as if they ran a thriving company when the reality was so very much the opposite? One can’t help but suspect that Gould at least, who owned 19 percent of Commodore’s stock, was trying to offset his losses in the one arena by raising his personal salary in the other.

And then there’s the way that Gould, an enormously rich man whose personal net worth was much higher than that of all of Commodore by the end, was so weirdly niggardly in helping his company out of its financial jam. While he did loan fully $17.4 million back to Commodore, the operative word here is indeed “loan”: he structured his cash injections to ensure that he would be first in line to get his money back if and when the company went bankrupt, and stopped throwing good money after bad as soon as his loans exceeded the value of the collateral the company could offer up in exchange. One can’t help but wonder what might have become of the CD32 if he’d been willing to go all-in to try to turn it into a success.

Of course, this is all rank speculation, which will quickly become libelous if I continue much further down this road. Suffice to say that questionable ethics were always an indelible part of Commodore. Born in scandal, the company would quite likely have ended in scandal as well if anyone in authority had been bothered enough by its anticlimactic bankruptcy to look further. I’d love to see what a savvy financial journalist could make of Commodore’s history. But, alas, I have neither the skill nor the resources for such a project, and the story is of little interest to the mainstream journalists of today. The era is past, the bodies are buried, and there are newer and bigger outrages to fill our newspapers.



Instead, then, I’ll conclude with two brief eulogies to mark the end of the Amiga’s role in this ongoing history. Rather than eulogizing in my own words, I’m going to use those of a true Amiga zealot: the anonymous figure known as “The Bandito,” whose “Roomers” columns in the magazine Amazing Computing were filled with cogent insights and nitty-gritty financial details every month. (For that reason, they’ve been invaluable sources for this series of articles.)

Jay Miner, the gentle genius, in 1990. In interviews like the one to which this photo was attached, he always seemed a little befuddled by the praise and love which Amiga users lavished upon him.

First, to Jay Miner, the canonical “father of the Amiga,” who died on June 20, 1994, at age 62, of the kidney disease he had been battling for most of his life. If a machine can reflect the personality of a man, the Amiga certainly reflected his:

Jay was not only the inventive genius who designed the custom chips behind the Atari 800 and the Amiga, he also designed many more electronic devices, including a new pacemaker that allows the user to set their own heart rate (which allows them to participate in strenuous activities once denied to them). Jay was not only a brilliant engineer, he was a kind, gentle, and unassuming man who won the hearts of Amiga fans everywhere he went. Jay was continually amazed and impressed at what people had done with his creations, and he loved more than anything to see the joy people obtained from the Amiga.

We love you, Jay, for all the gifts that you have given to us, and all the fruits of your genius that you have shared with us. Rest in peace.

And now a last word on the Amiga itself, from the very last “Roomers” column, written by someone who had been there from the beginning:

The Amiga has left an indelible mark on the history of computing. [It] stands as a shining example of excellent hardware design. Its capabilities foreshadowed the directions of the entire computer industry: thousands of colors, multiple screen resolutions, multitasking, high-quality sound, fast animation, video capability, and more. It was the beauty and elegance of the hardware that sold the Amiga to so many millions of people. The Amiga sold despite Commodore’s neglect, despite their bumbling and almost criminal marketing programs. Developers wrote brilliantly for this amazing piece of hardware, creating software that even amazed the creators of the hardware. The Amiga heralded the change that’s even now transforming the television industry, with inexpensive CGI and video editing making for a whole new type of television program.

Amiga game software also changed the face of entertainment software. Electronic Arts launched themselves headlong into 16-bit entertainment software with their Amiga software line, which helped propel them into the $500 million giant they are today. Cinemaware’s Defender of the Crown showed people what computer entertainment could look like: real pictures, not blocky collections of pixels. For a while, the Amiga was the entertainment-software machine to have.

In light of all these accomplishments, the story of the Amiga really isn’t the tragedy of missed opportunities and unrealized potential that it’s so often framed as. The very design that made it able to do so many incredible things at such an early date — its tightly coupled custom chips, its groundbreaking but lightweight operating system — made it hard for the platform to evolve in the same ways that the less imaginative, less efficient, but modular Wintel and MacOS architectures ultimately did. While it lasted, however, it gave the world a sneak preview of its future, inspiring thousands who would go on to do good work on other platforms. We are all more or less the heirs to the vision embodied in the original Amiga Lorraine, whether we ever used a real Amiga or not. The platform’s most long-lived and effective marketing slogan, “Only Amiga Makes it Possible,” is of course no longer true. It is true, though, that the Amiga made many things possible first. May it stand forever in the annals of computing history alongside the original Apple Macintosh as one of the two most visionary computers of its generation. For without these two computers — one of them, alas, more celebrated than the other — the digital world that we know today would be a very different place.

(Sources: the books Commodore: The Final Years by Brian Bagnall and my own The Future Was Here; Amazing Computing of November 1992, December 1992, January 1993, February 1993, March 1993, April 1993, May 1993, June 1993, July 1993, September 1993, October 1993, November 1993, December 1993, January 1994, February 1994, March 1994, April 1994, May 1994, June 1994, July 1994, August 1994, September 1994, October 1994, and February 1995; Byte of January 1993; Amiga User International of June 1988; Electronic Gaming Monthly of June 1995; Next Generation of December 1996. My thanks to Eric Graham for corresponding with me about The Juggler and Sculpt 3D years ago when I was writing my book on the Amiga.

Those wishing to read about the Commodore story from the perspective of the engineers in the trenches, who so often accomplished great things in less than ideal conditions, should turn to Brian Bagnall’s full “Commodore Trilogy”: A Company on the Edge, The Amiga Years, and The Final Years.)

 


Alone in the Dark

Most videogame stories are power fantasies. You spend your time getting ever stronger, ever tougher, ever more formidable as you accumulate experience points, gold, and equipment. Obstacles aren’t things to go around; they’re things you go through. If you can’t get past any given monster, the solution is to go kill some other monsters, then come back when you’re yet more powerful and slay the big beast at last. Life, these games tell us, is or ought to be one unadulterated ride up the escalator of success; a setback just means you haven’t yet risen high enough.

That dynamic held true in 1992 just as much as it usually does today. But during that year there came a well-nigh revolutionary game out of France that upended all of these traditional notions about what the medium of videogames can do and be. It cast you as a painfully ordinary, near-powerless individual adrift in a scary world, with no surefire panaceas in the form of experience points, gold, or portable rocket launchers to look forward to. It was just you and your wits, trapped in a haunted house full of creatures that were stronger than you and badly wanted to kill you. Despite its supernatural elements, this game’s scenario felt more disconcertingly close to real life than that of any of those other games. Here, you truly were alone in the dark. Aren’t we all from time to time?


Any story of how this shockingly innovative game came to be must begin with that of Frédérick Raynal, its mastermind. Born in the south-central French town of Brive-la-Gaillarde in 1966, Raynal was part of the first generation of European youths to have access to personal computers. In fact, right from the time his father first came home with a Sinclair ZX81, he was obsessed with them. He was also lucky: in a dream scenario for any budding hacker, his almost equally obsessed father soon added computers to the product line of the little videocassette-rental shop he owned, thus giving his son access to a wide variety of hardware. Raynal worked at the store during the day, renting out movies and watching them to kill the time — he was a particular fan of horror movies, a fact which would soon have a direct impact on his career — and helping customers with their computer problems. Then, with a nerdy young man’s total obliviousness to proportion, he hacked away most of the night on one or another of the machines he brought home with him. He programmed his very first released game, a platformer called Robix, in 1986 on an obscure home-grown French computer called the Exelvision which his father sold at the store. His father agreed to sell his son’s Exelvision game there as well, managing to shift about 80 units to customers desperate for software for the short-lived machine.

Raynal’s lifestyle was becoming so unbalanced that his family was beginning to worry about him. One day, he ran out of his room in a panic, telling them that all of the color had bled out of his vision. His mother bustled him off to an ophthalmologist, who told him he appeared to have disrupted the photoreceptors in his eyes by staring so long at a monitor screen. Thankfully, the condition persisted only a few hours. But then there came a day when he suddenly couldn’t understand anything that was said to him; he had apparently become so attuned to the language of computer code that he could no longer communicate with humans. That worrisome condition lasted several weeks.

Thus just about everyone around him took it as a good thing on the whole when he was called up for military service in 1988. Just before leaving, Raynal released his second game, this time for MS-DOS machines. Not knowing what else to do with it, he simply posted it online for free. Popcorn was a Breakout clone with many added bells and whistles, the latest entry in a sub-genre which was enjoying new popularity following the recent success of the Taito arcade game Arkanoid and its many ports to home computers and consoles. Raynal’s game could hold its head high in a crowded field, especially given its non-existent price tag. One magazine pronounced it one of the five best arcade games available for MS-DOS, whether commercial or free, and awarded it 21 points on a scale of 20.

Raynal was soon receiving letters at his military posting from all over the world. “Popcorn has made my life hell!” complained one player good-naturedly. Another wrote that “I caught acute Popcornitus. And, it being contagious, now my wife has it as well.” When Raynal completed his service in the summer of 1989, his reputation as the creator of Popcorn preceded him. Most of the companies in the French games industry were eager to offer him a job. His days working at his father’s computer store, it seemed, were behind him. The Lyon-based Infogrames, the most prominent French publisher of all, won the Raynal sweepstakes largely by virtue of its proximity to his hometown.

Yet Raynal quickly realized that the company he had elected to join was in a rather perilous state. An ambitious expansion into many European markets hadn’t paid off; in fact, it had very nearly bankrupted them. Bruno Bonnell, Infogrames’s co-founder and current chief executive, had almost sold the company to the American publisher Epyx, but that deal had fallen through as soon as the latter had gotten their first good look at the state of his books. It seemed that Infogrames would have to dig themselves out of the hole they’d made. Thus Bonnell had slashed costs and shed subsidiaries ruthlessly just to stay alive. Now, having staunched the worst of the bleeding, he knew that he needed as many talented programmers as he could get in order to rebuild his company — especially programmers like Raynal, who weren’t terribly assertive and were naive enough to work cheap. So, Raynal was hired as a programmer of ports, an unglamorous job but an absolutely essential one in a European market that had not yet consolidated around a single computer platform.

Bonnell, for his part, was the polar opposite of the shy computer obsessive he had just hired; he had a huge personality which put its stamp on every aspect of life at Infogrames. He believed his creativity to be the equal of anyone who worked for him, and wasn’t shy about tossing his staff ideas for games. He called one of them, which he first proposed when Raynal had been on the job for about a year, In the Dark. It was a typically high-concept French idea, whose title was meant to be taken literally. The player would wander through a pitch-dark environment, striking the occasional match from her limited supply, but otherwise relying entirely on sound cues for navigation. Bonnell and Raynal were far from bosom buddies, then or ever, but this idea struck a chord with the young programmer.

As Raynal saw it, the question that would make or break the idea was that of how to represent a contiguous environment with enough verisimilitude to give the player an embodied sense of really being there in the dark. Clearly, a conventional adventure-game presentation, with its pixel graphics and static views, wouldn’t do. Only one approach could get the job done: 3D polygonal graphics. Not coincidentally, 3D was much on Raynal’s mind when he took up Bonnell’s idea; he’d been spending his days of late porting an abstract 3D puzzle game known as Alpha Waves from the Atari ST to MS-DOS.

I’ve had occasion to discuss the advantages and disadvantages of this burgeoning new approach to game-making in previous articles, so I won’t rehash that material here. Suffice to say that the interest so many European programmers had in 3D reflected not least a disparity in the computing resources available to them in comparison to their American counterparts. American companies in this period were employing larger and larger teams, who were filling handfuls of floppy disks — and soon CD-ROMs — with beautiful hand-drawn art and even digitized snippets of real-world video. European companies had nothing like the resources to compete with the Americans on those terms. But procedurally-generated 3D graphics offered a viable alternative. At this stage in the evolution of computer technology, they couldn’t possibly be as impressively photorealistic as hand-drawn pixel art or full-motion video, but they could offer far more flexible, interactive, immersive environments, with — especially when paired with a French eye for aesthetics — a certain more abstracted allure of their own.

This, then, was the road Raynal now started down. It was a tall order for a single programmer. Not only was he trying to create a functional 3D engine from scratch, but the realities of the European market demanded that he make it run on an 80286-class machine, hardware the Americans by now saw as outdated. Even Bonnell seemed to have no confidence in Raynal’s ability to bring his brainstorm to fruition. He allowed Raynal to work on it only on nights and weekends, demanding that he spend his days porting SimCity to the Commodore CDTV.

An artist named Didier Chanfray was the closest thing to a partner and confidant which Raynal had at Infogrames during his first year of working on the engine. It was Chanfray who provided the rudimentary graphics used to test it. And it was also Chanfray who, in September of 1991, saw the full engine in action for the first time. A character roamed freely around a room under the control of Raynal, able to turn about and bend his body and limbs at least semi-realistically. The scene could be viewed from several angles, and it could be lit — or not — by whatever light sources Raynal elected to place in the room. Even shadows appeared; that of the character rippled eerily over the furniture in the room as he moved from place to place. Chanfray had never seen anything like it. He fairly danced around Raynal’s desk, pronouncing it a miracle, magic, alchemy.

In the meantime, Bruno Bonnell had negotiated and signed a new licensing deal — not exactly a blockbuster, but something commensurate with a rebuilding Infogrames’s circumstances.


Something tentacled and other-worldly, it seems, got into the water at Infogrames from the start: Didier Chanfray provided this very Lovecraftian concept drawing for Raynal’s game long before the conscious decision was made to turn it into a Lovecraft pastiche. Raynal kept the sketch tacked on the wall beside his desk throughout the project as a reminder of the atmosphere he was going for.

The American horror writer H.P. Lovecraft, who died in 1937, well before the advent of the computer age, was nowhere near as well-known in 1991 as he is today, but his so-called “Cthulhu Mythos” of extra-dimensional alien beings, terrifying by virtue of their sheer indifference to humanity and its petty morality, had already made appearances in games. The very first work of ludic Lovecraftia would appear to be the 1979 computer game Kadath, an odd sort of parser-less text adventure. Two years later, at the height of the American tabletop-RPG craze, a small company called Chaosium published Call of Cthulhu, a game which subverted the power fantasy of tabletop Dungeons & Dragons in much the same way that Raynal’s project would soon be subverting that of so many computer games. Still, although Call of Cthulhu was well-supported by Chaosium and remained reasonably popular by the standards of its niche industry throughout the 1980s and beyond, its success didn’t lead to any Lovecraftian onslaught in the realm of digital games. The most notable early example of the breed is Infocom’s very effective 1987 interactive fiction The Lurking Horror. But, being all text at a time when text adventures were becoming hard sells, it didn’t make much commercial impact.

Now, though, Bonnell believed the time had come for a more up-to-date Lovecraftian computer game; he believed such a thing could do well, both in France and elsewhere.

Lovecraft had long had a strong following in France. From the moment his books were first translated into the language in 1954, they had sold in considerable numbers. Indeed, in 1991 H.P. Lovecraft was about as popular in France as he was anywhere — arguably more popular on a per-capita basis than in his native land. Chaosium’s Call of Cthulhu, too, had long since been translated into French, giving a potential digital implementation of it as much natural appeal there as in its homeland. So, Bonnell approached Chaosium about licensing their Call of Cthulhu rules for computers, and the American company agreed.

When viewed retrospectively, it seems a confusing deal to have made, one that really wasn’t necessary for what Infogrames would ultimately choose to do with Lovecraft. When Lovecraft died in obscurity and poverty, he left his literary estate in such a shambles that no one has ever definitively sorted out its confusing tangle of copyright claimants; his writing has been for all intents and purposes in the public domain ever since his death, despite numerous parties making claims to the contrary. Prior to publishing their Lovecraft tabletop RPG, Chaosium had nevertheless negotiated a deal with Arkham House, the publisher that has long been the most strident of Lovecraft’s copyright claimants. With that deal secured, Chaosium had promptly trademarked certain catchphrases, including “Call of Cthulhu” itself, in the context of games. Yet as it turned out Infogrames would use none of them; nor would they draw any plots directly from any of Lovecraft’s published stories. Like the countless makers of Lovecraftian games and stories that would follow them, they would instead draw from the author’s spirit and style of horror, whilst including just a few of his more indelible props, such as the forbidden book of occult lore known as the Necronomicon.

The first Lovecraftian game Infogrames would make would, of course, be the very game that Frédérick Raynal had now spent the last year or so prototyping during his free time. By the time news of his work reached Bonnell, most of Infogrames’s staff were already talking about it like the second coming. While the idea that had inspired it had been wonderfully innovative, it seemed absurd even to the original source of said idea to devote the best 3D engine anyone had ever seen to a game that literally wouldn’t let you see what it could do most of the time. It made perfect sense, on the other hand, to apply its creepy visual aesthetic to the Lovecraft license. The sense of dread and near-powerlessness that was so consciously designed into the tabletop RPG seemed a natural space for the computer game to occupy as well. It was true that it would have to be Call of Cthulhu in concept only: the kinetic, embodied, real-time engine Raynal had created wasn’t suitable for the turn-based rules of the tabletop RPG. For that matter, Raynal didn’t even like the Chaosium game all that much; he considered it too complicated to be fun.

Still, Bonnell, who couldn’t fail to recognize the potential of Raynal’s project, put whatever resources he could spare from his still-rebuilding company at the mild-mannered programmer’s disposal: four more artists to join Chanfray, a sound designer, a second programmer and project manager. When the team’s first attempts at writing an authentic-feeling Lovecraftian scenario proved hopelessly inadequate, Bonnell hired for the task Hubert Chardot, a screenwriter from 20th Century Fox’s French division, a fellow who loved Lovecraft so much that he had turned his first trip to the United States into a tour of his dead hero’s New England haunts. One of Chardot’s first suggestions was to add the word “alone” to the title of the game. He pointed out, correctly, that it would convey the sense of existential loneliness that was such an integral part of Lovecraftian horror — even, one might say, the very thing that sets it apart from more conventional takes on horror.

You can choose to enter the mansion as either of two characters.

The game takes place in the 1920s, the era of Lovecraft himself and of most of his stories (and thus the default era as well for Chaosium’s Call of Cthulhu game). It begins as you arrive in the deserted Louisiana mansion known as Derceto, whose owner Jeremy Hartwood has recently hanged himself. You play either as Edward Carnby, a relic hunter on the trail of a valuable piano owned by the deceased, or as Emily Hartwood, the deceased’s niece, eager to clear up the strange rumors that have dogged her uncle’s reputation and to figure out what really went down on his final night of life. The direction in which the investigation leads you will surprise no one familiar with Lovecraft’s oeuvre or Chaosium’s RPG: occult practices, forbidden books, “things man was never meant to know,” etc. But, even as Chardot’s script treads over this ground that was well-worn already in the early 1990s, it does so with considerable flair, slowly revealing its horrifying backstory via the books and journals you find hidden about the mansion as you explore. (There is no in-game dialog and no real foreground story whatsoever, only monsters and traps to defeat or avoid.) Like most ludic adaptations of Lovecraft, the game differs markedly from its source material only in that there is a victory state; the protagonist isn’t absolutely guaranteed to die or become a gibbering lunatic at the end.

One of the in-game journals, which nails the spirit and style of Lovecraft perfectly. As I noted in an earlier article about the writer, the emotion he does better than any other is disgust.

Yet Chaosium wasn’t at all pleased when Infogrames sent them an early build of the game for their stamp of approval. It seems that the American company had believed they were licensing not just their trademarks to their French colleagues, nor even the idea of a Lovecraft game in the abstract, but rather the actual Call of Cthulhu rules, which they had expected to see faithfully implemented. And, indeed, this may have been Bonnell’s intention when he was making the deal — until Raynal’s 3D engine had changed everything. Chaosium, who had evidently been looking forward to an equivalent of sorts to the Gold Box line of licensed Dungeons & Dragons CRPGs, felt betrayed. After some tense negotiation, they agreed to let Alone in the Dark continue without the Call of Cthulhu name on the box; some editions would include a note saying the game had been “inspired by the works of H.P. Lovecraft,” while others wouldn’t even go that far. In return for Chaosium’s largess on this front, Infogrames agreed to make a more conventional adventure game that would make explicit use of the Call of Cthulhu trademarks.

Call of Cthulhu: Shadow of the Comet, the fruit of that negotiation, would prove a serviceable game, albeit one that still didn’t make much direct use of the tabletop rules. But, whatever its merits, it would come and go without leaving much of a mark on an industry filled to bursting with graphical adventures much like it in terms of implementation. Alone in the Dark, on the other hand, would soon be taking the world by storm — and Chaosium could have had their name on it, a form of advertisement which could hardly have failed to increase their commercial profile dramatically. Chalk it up as just one more poor decision in the life of a company that had a strange talent for surviving — Chaosium is still around to this day — without ever quite managing to become really successful.

Infogrames got their first preview of just what an impact Alone in the Dark was poised to make in the spring of 1992, when Dany Boolauck, a journalist from the French videogame magazine Tilt, arrived to write a rather typical industry puff piece, a set of capsule previews of some of the company’s current works-in-progress. He never got any further than Alone in the Dark. After just a few minutes with it, he declared it “the best game of the last five years!” and asked for permission to turn the capsule blurb about it into a feature-length article, complete with a fawning interview with Raynal. (He described him in thoroughly overwrought terms: as a reincarnation of The Little Prince from Antoine de Saint-Exupéry’s beloved novella of the same name.) In a “review” published in the summer of 1992, still a couple of months before Infogrames anticipated releasing the game, he gave it 19 of 20 stars, gushing over its “exceptional staging” and “almost perfect character movement,” calling it “a revolution in the field of play” that “people must buy!”

Bruno Bonnell was pleased with the positive press coverage, but less thrilled by Boolauck’s portrayal of Raynal as the game’s genius auteur. He called in his introverted young programmer, who seemed a bit befuddled by all the attention, and told him to scrub the words “a Frédérick Raynal creation” from the end credits. Alone in the Dark, he said, was an Infogrames creation, full stop. Raynal agreed, but a grievance began to fester in his heart.

Thanks to Bonnell’s policy of not advertising the individuals behind Infogrames’s games, Raynal’s name didn’t spread quite so far and wide as that of such other celebrated gaming auteurs as Éric Chahi, the mastermind of Another World, France’s standout game from the previous year. Nevertheless, upon its European release in September of 1992, Raynal’s game stood out on its own terms as something special — as an artistic creation that was not just fun or scary but important to its medium. As one would expect, the buzz started in France. “We review many games,” wrote one magazine there. “Some are terrible, some mediocre, some excellent. And occasionally there comes along the game that will revolutionize the world of microcomputers, one that causes sleepless nights, one that you cannot tear yourself away from, can only marvel at. We bid welcome now to the latest member of this exclusive club: Alone in the Dark.” By the end of 1992, the game was a hit not only in France but across most of Europe. Now for America.

Bonnell closed a deal with the American publisher Interplay for distribution of the game there. Interplay had also published Another World, which had turned into a big success Stateside, and the company’s head Brian Fargo was sure he saw similar potential in Alone in the Dark. He thus put the game through his company’s internal testing wringer, just as he had Another World; the French studios had their strengths, but such detail work didn’t tend to be among them. Raynal’s game became a much cleaner, much more polished experience thanks to Interplay’s QA team. Yet Bonnell still had big international ambitions for Infogrames, and he wasn’t willing to let such a remarkable game as this one share with Another World the fate of becoming known to American players simply as an Interplay title. Instead he convinced Fargo to accept a unique arrangement. Interplay and Infogrames each took a stake in a new shared American subsidiary known as I-Motion, under which imprint they published Alone in the Dark.

The game took North America by storm in early 1993, just as it had Europe a few months earlier. It was that rarest of things in games, a genuine paradigm shift; no one had ever seen one that played quite like this. Worldwide, it sold at least 400,000 copies, putting Infogrames on the map in the United States and other non-European countries in the process. Indeed, amidst the international avalanche of praise and punditry, perhaps the most gratifying press notice of all reached Frédérick Raynal’s ears from all the way off in Japan. Shigeru Miyamoto, the designer of Super Mario Bros. and many other iconic Nintendo classics, proclaimed Alone in the Dark to be, more so than any other game, the one he wished he could have come up with.


Arguably the creepiest visual in the game is the weird mannequin’s head of your own character. Its crudely painted expression rather smacks of Chucky the doll from the Child’s Play horror films.

Seen from the perspective of a modern player, however, the verdict on Alone in the Dark must be more mixed. Some historically important games transcend that status to remain vital experiences even today, still every bit as fun and playable as the day they were made. But others — and please forgive me the hoary old reviewer’s cliché! — haven’t aged as well. This game, alas, belongs to the latter category.

Today, in an era when 3D graphics have long since ceased to impress us simply for existing at all, those of Alone in the Dark are pretty painful to look at, all jagged pixels sticking out everywhere from grotesquely octagonal creatures. Textures simply don’t exist, leaving everything to be rendered out of broad swatches of single colors. And the engine isn’t even holistically 3D: the 3D characters move across pasted-on pre-rendered backgrounds, which looks decidedly awkward in many situations. (On the other hand, it could have been worse: Raynal first tried to build the backgrounds out of digitized photographs of a real spooky mansion, a truly unholy union that he finally had to give up on.) Needless to say, a comparison with the lovingly hand-drawn pixel art in the adventure games being put out by companies like LucasArts and Sierra during this period does the crude graphics found here no favors whatsoever. Some of the visuals verge on the unintentionally comical; one of the first monsters you meet was evidently meant to be a fierce dragon-like creature, but actually looks more like a sort of carnivorous chicken. (Shades of the dragon ducks from the original Atari Adventure…)
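For what it’s worth, the hybrid scheme described above is simple to sketch in outline: the room is a single flat image matched to one fixed camera, and only the moving figures are live 3D geometry composited on top of it each frame. The fragment below is purely my own schematic reconstruction of that idea in C; every name, type, and constant in it is an illustrative assumption rather than anything taken from Infogrames’s engine.

```c
/* Schematic sketch of fixed-camera compositing: copy the static room
   image first, then rasterize the characters over it. Not real engine
   code; the polygon pass is reduced to a placeholder. */
#include <stdint.h>
#include <string.h>

#define SCREEN_W 320
#define SCREEN_H 200

/* Stand-in for the real polygon rasterizer; the only point being made
   is that the figure is drawn after, and therefore on top of, the
   painted background for this camera angle. */
static void draw_character_polygons(uint8_t *fb, int camera_index)
{
    (void)camera_index;                 /* would select the projection  */
    for (int y = 90; y < 140; y++)      /* crude placeholder "figure"   */
        for (int x = 150; x < 170; x++)
            fb[y * SCREEN_W + x] = 15;
}

/* One frame: blit the painted room, then add the live 3D characters. */
void render_frame(uint8_t *framebuffer, const uint8_t *room_background,
                  int camera_index)
{
    memcpy(framebuffer, room_background, SCREEN_W * SCREEN_H);
    draw_character_polygons(framebuffer, camera_index);
}
```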

Dead again! Killed by… Prince during his Purple Rain period?

Then, too, the keyboard-only controls are clunky and unintuitive, and they aren’t made any less awkward by a fixed camera that’s constantly shifting about to new arbitrary locations as you move through the environment; some individual rooms have as many as nine separate camera angles. This is confusing as all get-out when you’re just trying to get a sense of the space, and quickly becomes infuriating when you’re being chased by a monster and really, really don’t have time to stop and adjust your thinking to a new perspective.

The more abstract design choices also leave something to be desired. Sudden deaths abound. The very first room of the game kills you when you step on a certain floorboard, and every book is either a source of backstory and clues or an instant game-ender; the only way to find out which is to save your game before you open it. Some of the puzzles are clever, some less so, but even those that are otherwise worthy too often depend on you standing in just the right position; if you aren’t, you get no feedback whatsoever on what you’re doing wrong, and are thus likely to go off on some other track entirely, never realizing how close you were to the solution. This fiddliness and lack of attention to the else in the “if, then, else” dynamic of puzzle design is a clear sign of a game that never got sufficiently tested for playability and solubility. At times, the game’s uncommunicativeness verges on the passive-aggressive. You’ll quickly grow to loathe the weirdly stilted message, “There is a mechanism which can be triggered here,” which the game is constantly spitting out at you as you gaze upon the latest pixelated whatsit. Is it a button? A knob? A keyhole? Who knows… in the end, the only viable course of action is to try every object in your inventory on it, then go back and start trying all the other objects you had to leave lying around the house thanks to your character’s rather brutal inventory limit.

Fighting is a strange, bloodless pantomime.

Yes, one might be able to write some of the game’s issues off as an aesthetic choice — as merely more ways to make the environment feel unsettling. Franck de Girolami, the second programmer on the development team as well as its project leader, has acknowledged using the disorienting camera consciously for just that purpose: “We realized that the camera angles in which the player was the most helpless were the best to bring in a monster. Players would instantly run for a view in which they felt comfortable.” While one does have to admire the team’s absolute commitment to the core concept of the game, the line between aesthetic choice and poor implementation is, at best, blurred in cases like this one.

And yet the fact remains that it was almost entirely thanks to that same commitment to its core concept that Alone in the Dark became one of the most important games of its era. Not a patch on a contemporary like Ultima Underworld as a demonstration of the full power and flexibility of 3D graphics — to be fair, it ran on an 80286 processor with just 640 K of memory while its texture-mapped, fully 3D rival demanded at least an 80386 with 2 MB — it remained conceptually unlike anything that had come before in daring to cast you as an ordinary mortal, weak and scared and alone, for whom any aspirations toward glory quickly turn into nothing more than a desperate desire to just escape the mansion. For all that it threw the Call of Cthulhu rules completely overboard, it retained this most fundamental aspect of its inspiration, bringing Chaosium’s greatest innovation to a digital medium for the first time. It’s not always impossible to kill the monsters in Alone in the Dark — often it’s even necessary to do so — but, with weapons and ammunition scarce and your health bar all too short, doing so never fails to feel like the literal death struggle it ought to. When you do win a fight, you feel more relieved than triumphant. And you’re always left with that nagging doubt in the back of the mind as you count your depleted ammo and drag your battered self toward the next room: was it worth it?


The legacy of this brave and important game is as rich as that of any that was released in its year, running along at least three separate tracks. We’ll begin with the subsequent career of Frédérick Raynal, its original mastermind.

The seeds of that career were actually planted a couple of weeks before the release of Alone in the Dark, when Raynal and others from Infogrames brought a late build of it to the European Computer Trade Show in London. There he met the journalist Dany Boolauck once again, learning in the process that Boolauck had switched gigs: he had left his magazine and now worked for Delphine Software, one of Infogrames’s French competitors. Delphine had recently lost the services of their biggest star: Éric Chahi, the auteur behind the international hit Another World. As his first assignment in his own new job, Boolauck had been given the task of replacing Chahi with a similarly towering talent. Raynal struck him as the perfect choice; he rather resembled Chahi in many respects, what with his very French aesthetic sensibility, his undeniable technical gifts, and his obsessive commitment to his work. Boolauck called in Paul de Senneville, the well-known composer who had launched Delphine Software as a spinoff from his record label of the same name, to add his dulcet voice to the mix. “We wish to place you in a setting where you will be able to create, where you will not be bullied, where we can make you a star,” said the distinguished older gentleman. “We want to give free rein to the fabulous talent you showed in Alone in the Dark.” When Raynal returned to Lyon to a reprimand from Bruno Bonnell for letting his game’s planned release date slip by a week, the contrast between his old boss and the possible new one who was courting him was painted all too clearly.

Much to Raynal’s dismay, Bonnell was already pushing him and the rest of the team that had made the first Alone in the Dark to make a sequel as quickly as possible using the exact same engine. One Friday just before the new year, Bonnell threw his charges a party to celebrate what he now believed would go down in history as the year when his struggling company turned the corner, thanks not least to Raynal’s game. On the following Monday morning, Raynal knocked on Bonnell’s office door along with three other members of the newly christened Alone in the Dark 2 team, including his most longstanding partner Didier Chanfray. They were all quitting, going to work for Delphine, Raynal said quietly. Much to their surprise, Bonnell offered to match Delphine’s offer, the first overt sign he’d ever given that he understood how talented and valuable they really were. But his counteroffer only prompted Delphine to raise the stakes again. Just after New Year’s Day, Bonnell bowed out of the bidding in a huff: “You want to leave? Goodbye!”

A couple of weeks later, the videogame magazine Génération 4 held an awards ceremony for the previous year’s top titles at Disneyland Paris. Everyone who had been involved with Alone in the Dark, both those who still worked at Infogrames and those who didn’t, was invited. When, as expected, it took the prize for top adventure game, Bruno Bonnell walked onto the stage to accept the award on behalf of his company. The departure of Raynal and crew being the talk of the industry, the room held its collective breath to see what would happen next. “My name is Bruno Bonnell,” he said from behind the rostrum. “I’d like to thank God, my dog, my grandmother, and of course the whole team at Infogrames for a beautiful project.” And with that he stumped offstage again.

It hadn’t been a particularly gracious acceptance speech, but Raynal and his colleagues nonetheless had much to feel good about. Dany Boolauck and Paul de Senneville were true to their word: they set Raynal up with a little auteur’s studio all his own, known as Adeline Software. They even allowed him to run it from Lyon rather than joining the rest of Delphine in Paris.

Naturally, all of the Alone in the Dark technology, along with the name itself and the Chaosium license (whatever that was worth), stayed with Infogrames. Raynal and his colleagues were thus forced to develop a new engine in the style of the old and to devise a fresh game idea for it to execute. Instead of going dark again, they went light. Released in 1994, Little Big Adventure (known as Relentless: Twinsen’s Adventure in North America) was a poetic action-adventure set in a whimsical world of cartoon Impressionism, consciously conceived by Raynal as an antidote to the ultra-violent Doom mania that was sweeping the culture of gaming at the time. He followed it up in 1997 with Little Big Adventure 2 (known as Twinsen’s Odyssey in North America). Although both games were and remain lovely to look at, Raynal still struggled to find the right balance between the art and the science of game design; both games are as absurdly punishing to play as they are charming to watch, with a paucity of save points between the countless places where they demand pin-point maneuvering and split-second timing. This sort of thing was, alas, something of a theme with the French games industry for far too many years.

This, then, is one legacy of Alone in the Dark. Another followed on even more directly, taking the form of the two sequels which Infogrames published in 1993 and 1994. Both used the same engine, as Bruno Bonnell had demanded in the name of efficiency, and both continued the story of the first game, with Edward Carnby still in the role of protagonist. (Poor Emily Hartwood fell by the wayside.) But, although Hubert Chardot once again provided their scripts, much of the spirit of the first game got lost, as the development team began letting the player get away with much more head-to-head combat. Neither sequel garnered as many positive reviews or sales as the original game, and Infogrames left the property alone for quite some time thereafter. A few post-millennial attempts to revive the old magic, still without the involvement of Raynal, have likewise yielded mixed results at best.

But it’s with Alone in the Dark‘s third legacy, its most important by far, that we should close. For several years, few games — not even its own sequels — did much to build upon the nerve-wracking style of play it had pioneered. But then, in 1996, the Japanese company Capcom published a zombie nightmare known as Resident Evil for the Sony PlayStation console. “When I first played Resident Evil,” remembers Infogrames programmer Franck de Girolami, “I honestly thought it was plagiarism. I could recognize entire rooms from Alone in the Dark.” Nevertheless, Resident Evil sold in huge numbers on the consoles, reaching a mass market the likes of which Alone in the Dark, being available only on computers and the 3DO multimedia appliance, could never have dreamed. In doing so, it well and truly cemented the new genre that became known as survival-horror, which had gradually filtered its way up from the obscure works of a poverty-stricken writer to a niche tabletop RPG to a very successful computer game to a mainstream ludic blockbuster. Culture does move in mysterious ways sometimes, doesn’t it?

(Sources: the books La Saga des Jeux Vidéo by Daniel Ichbiah, Designers & Dragons: A History of the Roleplaying Game Industry, Volume 1 by Shannon Appelcline, and Alone in the Dark: The Official Strategy Guide by Johan Robson; Todd David Spaulding’s PhD thesis “H.P. Lovecraft & The French Connection: Translations, Pulps, and Literary History”; Computer Gaming World of February 1993; Amiga Format of June 1991; Edge of November 1994; Retro Gamer 98. Online sources include Adventure Europe‘s interview with Frédérick Raynal, Just Adventure‘s interview with Hubert Chardot, and the video of Frédérick Raynal’s Alone in the Dark postmortem at the 2012 Game Developers Conference. Note that many of the direct quotations in this article were translated by me into English from their French originals.

The original Alone in the Dark trilogy is available as a package download at GOG.com.)

 
 


Life Off the Grid, Part 1: Making Ultima Underworld

The 1980s was the era of the specialist in game development, when many of the most successful studios did just one or two things, but did them very, very well. For Infocom, that meant text adventures; for Sierra, graphic adventures; for MicroProse, military simulations; for SSI, strategic wargames and Dungeons & Dragons; for Epyx, joystick-twiddling sports and action games; for Origin, Ultima. When such specialists stepped outside of their comfort zones, the results were occasionally a triumph, but more often merely served to reemphasize their core competencies.

The most respected studios of the 1990s, however, tended toward more eclecticism. Developers like Dynamix and Westwood may have had their roots in the previous decade, but they really came into their own in this one, and did so with games of very diverse types. Westwood, for example, was happily making CRPGs, graphic adventures, real-time-strategy games, and Monopoly, for Pete’s sake, all virtually at the same time. Even the holdover specialists from the 1980s — those who were still standing — aggressively tried to diversify in the 1990s: Sierra moved into strategy games, MicroProse into CRPGs and graphic adventures, Origin into Wing Commander.

Still, if we look harder at many 1990s developers, we can find themes that bind together their output. In the case of Dynamix, we might posit that to be an interest in dynamic simulation, even when working in traditionally static genres like the graphic adventure. In that of Westwood, we can identify an even more pronounced interest in bringing the excitement of real time to traditionally turn-based genres like the CRPG and the wargame. And in the case of the studio we’ll be meeting for the first time today — Looking Glass Technologies, arguably the most respected and beloved 1990s studio of all — the binding thread is crystal clear. From beginning to end, they used the flexibility of 3D graphics to bring virtual environments to life in unprecedentedly immersive ways. Whether making a CRPG or a flight simulator, a first-person shooter or a first-person sneaker, this was their constant.


3D graphics were, one might say, baked right into Looking Glass’s DNA. Paul Neurath and Ned Lerner, its two eventual founders, met one another in 1978 in a computer-science course at Wesleyan University, where Neurath was studying environmental science, Lerner physics. For the course’s final project, they teamed up to make a 3D space game rendered in ASCII text. They got a B-minus on it only because their professor considered games to be beneath his course’s dignity.

After university, the two New England boys remained friends as they started their professional careers. When the home-computer craze got rolling in earnest, each bought an Apple II. They started experimenting, together and apart, on games much like the one they had written for that computer-science class, only implemented in real bitmap graphics, with a real joystick as a controller. These efforts culminated in a joint game known as Deep Space: Operation Copernicus, which they sold in 1985 to the publisher Sir-Tech, purveyors of the Wizardry CRPG series. Sir-Tech didn’t seem to know quite what to do with Neurath and Lerner’s very different sort of game, and it never escaped Wizardry‘s long shadow. Nevertheless, the experience of making a game and getting paid for it — however modestly — lit a fire in both partners. Each went off to pursue his own agenda, but they remained in touch, keeping one another updated on their progress and often sharing code and technical tricks.

Initially, it was Ned Lerner who made the most determined effort to become a real commercial game developer. He formed a little company called Lerner Research, and started gathering like-minded souls around him. As fixated as ever on 3D graphics, he decided that an at least semi-realistic flight simulator would be a good application for the technology. The leading product of that type on the market, subLOGIC’s generically titled Flight Simulator, he considered akin to a “textbook lesson”; he envisioned a flight simulator of his own that would be more accessible and fun. He hired an aerodynamic engineer to design a flight model for his game, which would focus on high-performance military aircraft like the legendary SR-71 Blackbird rather than the little Cessna that was forever tooling around from airport to airport in subLOGIC’s simulator. In fact, his game would let you fly any of fourteen different airplanes, in contrast to its rival’s one, and would concentrate on goal-oriented activities — “Flight Instruction,” “Test Flight,” “Formation Flying,” or “Airplane Racing” — instead of just expecting you to choose a starting airport and do whatever tickled your fancy.

Chuck Yeager and Ned Lerner discuss the vagaries of aerodynamics.

Electronic Arts, who lacked a competitive flight simulator and were eager to get in on one of the industry’s fastest-growing segments, signed on as publisher. Unlike Sir-Tech, they knew the appeal of snazzy packaging and celebrity endorsements. They convinced Chuck Yeager to put his name on the product. This was quite the coup; Yeager, a World War II fighter ace and the first man to break the sound barrier, was by far the most famous pilot in the country, after having been brought to indelible life by the actor Sam Shepard in the recent hit movie The Right Stuff. It was a decidedly nervous group of nerds and businessmen who met this aerospace legend for the first time in March of 1987. Lerner:

As we were sitting there in the office, listening to the rain outside, Rich Hilleman, associate producer at EA, was first to spot the Blazer entering the parking lot (license plate “BELL X1”). A few minutes later, we heard the unmistakable West Virginia drawl outside the door, as pure and easygoing as the man on TV who sells spark plugs with a shotgun. For a brief second, I remembered the opening scene of Patton where George C. Scott steps forward, dressed to the teeth in full military regalia. The door suddenly opened, and there he was: wearing cowboy boots, blue jeans, and a polo shirt under his racing-style jacket. General Yeager had a trim figure, and his face was tan, well-weathered, as if he had spent a lot of time outdoors. The general stepped forward, shaking hands with the members of the group, but I sensed a certain degree of reservation in his actions.

To get past this awkward beginning, we loaded in the current version of Advanced Flight Trainer. I flew the simulator for a while, then offered to let General Yeager take over. “I never fooled with these things,” he said. “That’s because, you know, the damned things are so…” — he searched for the word — “…insignificant. If you want to really scorch something, hell, you can program the X-31 in there, the aerospace plane. Now, see, you got some kid who can say, ‘Man, this thing is smoking along at mach 25.'”

The ice had finally been broken, and we all began contributing to the conversation. After discussing the subjects of liquid-oxygen fuel and the current type of aircraft that are touching the edge of space, the day was practically over. “This thing’s pretty dang realistic,” he told us. “You’ve got a lot of goodies in there.”

Released about six months later with much publicity, Chuck Yeager’s Advanced Flight Trainer became by far EA’s biggest hit of the year, and one of their biggest of the whole decade. With that push to get them off and running, Lerner Research continued their work on the frontiers of 3D graphics, giving EA a substantially revised version 2.0 of their flagship game in 1989.

Even as Ned Lerner was hobnobbing with famous test pilots, Paul Neurath was making his own inroads with the games industry. Shortly after finishing Deep Space, he had heard that Origin Systems of Ultima fame was located in New Hampshire, not all that far from him at all. On a lark, he drove down one day to introduce himself and take the temperature of the place. He hit it off immediately with Richard Garriott and the rest of the crew there. While he never became a full-fledged employee, he did become a regular around the Origin offices, contributing play-testing, design ideas, and occasional bits of code to their games on a contract basis.

In early 1987, Richard Garriott, who loathed New England with every fiber of his being, packed up and moved back to Austin, Texas, with most of Origin’s technical and creative staff. He left behind his older brother and business manager Robert, along with the latter’s support staff of accountants, secretaries, and marketers. A few developers who for one reason or another didn’t want to make the move also stayed behind. Neurath was among this group.

At about this same time, Neurath got the green light to make a game all his own for Origin. Space Rogue began as another 3D space shooter — another Deep Space, enhanced with some of the latest graphics technology from his friends at Lerner Research. To this template Neurath grafted a trading economy, a customizable spaceship, and a real plot. The player was even able to exit her spaceship and wander around the space stations she visited, talking to others and taking on quests. There was a surprising amount of ambition in this fusion of Deep Space, Elite, and Ultima, especially considering that Neurath designed, wrote, and programmed it all almost single-handedly from New Hampshire while most of his friends at Origin pursued other projects down in Austin. Although its disparate parts don’t ever gel quite well enough to make it a true classic, it’s remarkable that it works as well as it does.

Space Rogue sold in moderate numbers upon its release in 1989. More importantly in terms of gaming history, Chris Roberts of Origin spent a lot of time with it. Its melding of a space shooter with an adventure-game-like plot became one of the inspirations behind Wing Commander, the first Origin game to fully escape the shadow of Ultima — and, indeed, the beginning of one of the blockbuster franchises of the 1990s.

Space Rogue‘s hilarious cover art, with its artfully pouting male model who looks better suited to a Harlequin-romance cover. Paul Neurath remembers Origin’s marketing department asking him about his packaging preferences for his game. He said he would prefer a “non-representational” cover picture. Naturally, the marketers delivered about the most representational thing imaginable.

By the time of Space Rogue‘s release, Paul Neurath was a lonelier game developer than ever. In January of 1989, the last remnants of Origin’s New Hampshire operation had moved to Austin, leaving Neurath stranded in what Richard Garriott liked to call “the frozen wastes of New England.” For him, this was a crossroads of life if ever there was one. Did he want to continue to make games, and, if so, how? Sure, he could probably move down to Austin and get a job with Origin, but, truth be told, he had no more desire to live in Texas than Garriott had to live in New England. But how else could he stay in games?

At last, Neurath decided to take a page from Ned Lerner’s book. He would put together his own little company and try to establish it as an independent studio; after all, it had worked out pretty well for Ned so far. He registered his company under the name of Blue Sky Productions.

Neurath had always loved the CRPG genre, ever since Wizardry had become one of the first games he bought for his new Apple II. That love had once led him to publish Deep Space through Sir-Tech, and sent him out to Origin’s New Hampshire offices for that fateful visit. Now, he dreamed of taking the first-person dungeon crawl beyond the turn-based Wizardry, even beyond the real-time but still grid-based Dungeon Master, the state of the art in the genre as the 1980s expired. On a visit to Lerner Research, he saw the technology that he believed would make the genre’s next step possible — the foundation, one might even say, for everything he and his fabled studio Looking Glass would do in the 1990s. What he saw was the first 3D texture mapper that was suitable for use in an ordinary computer game.

3D graphics were hardly unknown on personal computers of the 1980s, as can be seen not least through the early careers of Ned Lerner and Paul Neurath. Yet, being enormously taxing to implement in the context of an interactive game, they demanded a lot of aesthetic compromise. Some early 3D games, such as Elite and the first versions of subLogic’s Flight Simulator, didn’t draw in the surfaces of their polygons at all, settling for wire frames. With the arrival of more powerful 16-bit computers in the mid-1980s, filled surfaces became more common in 3D games, but each side of a polygon was drawn in a single color. Combine this fact with the low polygon count that was still necessitated by the hardware of the time — resulting in big, fairly crude polygons — and you had a recipe for blotchy landscapes made up of garishly clashing primary colors.

A few clever developers were able to turn the limitations of 3D graphics into an aesthetic statement in its own right. But most of those who used them — among them makers of flight simulators and space shooters, such as Lerner and Neurath — suffered with their limitations because there just wasn’t any practical alternative for the sorts of games they were making. For an out-the-cockpit view from an airplane, the aesthetic compromises necessitated by going 3D were just about acceptable, given the way the distant landscape below tends to blur into hazy abstractions of color even in real life. But for a more personal, embodied experience, such as a first-person dungeon crawl, real-time 3D graphics were just too crude, too ugly. You couldn’t render the grain of a wooden door or the patina of a stone wall as one uniform splotch of color and expect to get away with it — not with the way that gamers’ audiovisual expectations were increasing every year.

A screenshot from Dungeon Master, the state of the art in dungeon crawls at the end of the 1980s. Notice how the walls, floor, and ceiling are shown in aesthetically pleasing detail. This was possible because movement in Dungeon Master was still based on a grid, which meant that each view could be assembled from pre-rendered component parts rather than needing to be rendered from scratch. A free-scrolling, real-time-rendered 3D version would have had to replace all of this detail with great uniform slabs of gray in order to run at an acceptable speed. The result, needless to say, would not have been pretty.

None of these problems were unknown to academic computer-graphics researchers; they’d been wrestling with them since well before the first personal computer hit the market. And they’d long since come up with a solution: texture mapping. The texture in question takes the form of an ordinary image file, which might be drawn by hand or digitized from a real-world photograph. A texture suitable for a wooden door, for example, could be an extreme closeup of any slab of wood. The texture is “glued” onto a polygon’s face in lieu of a solid color. Just like that, you suddenly have doors that look like real doors, slimy dungeon walls that look like real slimy dungeon walls.
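In code terms, the change is confined to the innermost loop of the renderer: instead of writing one constant color across a polygon's scanline, each pixel looks up its color in the image. Here's a minimal sketch in C; the texture size and the function names are invented purely for illustration and aren't taken from any particular engine.

#include <stdint.h>

#define TEX_SIZE 64

/* A 64x64 texture: an ordinary image, perhaps digitized from a photograph of wood. */
static uint8_t texture[TEX_SIZE][TEX_SIZE];

/* Flat shading: every pixel in the polygon's scanline gets the same color. */
void draw_span_flat(uint8_t *dest, int width, uint8_t color)
{
    for (int x = 0; x < width; x++)
        dest[x] = color;
}

/* Texture mapping: each pixel instead looks up its color in the image,
   stepping through texture coordinates (u, v) as it goes.  Coordinates are
   assumed to be non-negative here for simplicity. */
void draw_span_textured(uint8_t *dest, int width,
                        float u, float v, float du, float dv)
{
    for (int x = 0; x < width; x++) {
        dest[x] = texture[(int)v % TEX_SIZE][(int)u % TEX_SIZE];
        u += du;
        v += dv;
    }
}

What this sketch leaves out is all the math needed to decide which (u, v) each pixel should actually sample, and that, as the next paragraphs explain, is where the real cost lies.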

The problem with texture mapping from the perspective of game development was the same one that haunted the whole field of 3D graphics: the problem of performance. Simple though the basic concept is, a lot of tricky math comes into play when one introduces textures; figuring out how they should wrap and fit together with one another over so many irregular polygonal surfaces is much more complicated than the lay observer might initially believe. At a time when just managing to paint the sides of your polygons in solid colors while maintaining a respectable frame rate was a real achievement, texture mapping was hopeless. Maybe it could be used in another decade or so, said the conventional wisdom, when Moore’s Law put a supercomputer on every desk.

But one recent arrival at Lerner Research wasn’t so sure that texture mapping was impossible using extant PC hardware. Chris Green had considerable experience with interactive 3D graphics, having spent several years at subLogic working on products like Flight Simulator and Jet. He arrived at Lerner Research knowing that texture mapping couldn’t be done on the likes of an 8-bit Apple II, the computer on which Neurath and Lerner among so many others had gotten their start. On the latest 16- and 32-bit MS-DOS hardware, however… he suspected that, with the right compromises, he could make it work there.

There was doubtless much efficient code in the texture mapper Green created, but it was indeed an unabashed compromise that made it feasible to attempt at all. The vertices of the polygons in a 3D graphics system are defined with an X, a Y, and a Z coordinate; it’s this last, of course, that makes such a system a 3D system at all. And it’s also the Z coordinate that is the source of all of the complications relating to 3D graphics in general. Nowhere is this more true than in the case of texture mapping. To do it correctly, textures have to be scaled and transformed to account for their position in relation to the viewing location, as largely defined by their Z coordinate. But Green didn’t bother to do texture mapping correctly; he effectively threw away the Z coordinate and glued his textures onto their polygons as if they were in a 2D space. This technique would come to be known inside the industry as “affine texture mapping.” It yielded an enormous increase in rendering speed, balanced by a degree of distortion that was almost unnoticeable in some situations, very noticeable indeed in others. Still, an imperfect texture mapper, Green decided, was better than no texture mapper at all.

The video clip above, from the finished game of Ultima Underworld, shows some of the spatial distortion that results from affine texture mapping, especially when viewing things from a very short distance. Moving through the game’s virtual space can look and feel a bit like moving through real space after having drunk one beer too many. Nonetheless, the environment is far more realistic, attractive, and immersive than any first-person 3D environment to appear in any game before this one.
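For the technically curious, the difference between Green's shortcut and the textbook approach is easy to show in a few lines. The following sketch in C uses made-up endpoint values and is emphatically not Chris Green's mapper; it simply interpolates a single texture coordinate along one scanline both ways: the affine way, which ignores depth, and the perspective-correct way, which interpolates u/z and 1/z and then divides per pixel, the step that was so expensive on the hardware of the day.

#include <stdio.h>

int main(void)
{
    /* Texture coordinate u and depth z at the two ends of one scanline. */
    float u0 = 0.0f, z0 = 1.0f;   /* near end of the wall */
    float u1 = 1.0f, z1 = 4.0f;   /* far end of the wall  */
    int steps = 8;

    for (int i = 0; i <= steps; i++) {
        float t = (float)i / steps;

        /* Affine: interpolate u directly in screen space, ignoring depth.
           Cheap (one addition per pixel), but it distorts the texture. */
        float u_affine = u0 + t * (u1 - u0);

        /* Perspective-correct: interpolate u/z and 1/z, then divide.
           Accurate, but that per-pixel divide was ruinously slow on the
           386- and 486-class machines of the early 1990s. */
        float inv_z     = 1.0f / z0 + t * (1.0f / z1 - 1.0f / z0);
        float u_over_z  = u0 / z0   + t * (u1 / z1 - u0 / z0);
        float u_correct = u_over_z / inv_z;

        printf("t=%.2f  affine u=%.3f  correct u=%.3f\n", t, u_affine, u_correct);
    }
    return 0;
}

The two columns agree at the endpoints but drift apart in between; that drift is the swimming distortion visible in the clip above, the price paid for skipping the per-pixel divide.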

Ned Lerner had recently signed a contract with EA to make a driving game bearing the name of Car and Driver magazine. Knowing the technology’s limitations, he planned to use Chris Green’s texture mapper in a somewhat constrained fashion therein, to draw onto the faces of billboards and the like. Yet he wasn’t averse to sharing it with Paul Neurath, who as soon as he saw it wanted to use it to take the next step beyond Dungeon Master.

To do so, however, he’d need more programmers, not to mention artists and all the rest; if there was one thing the two years or so he had spent making Space Rogue had taught him, it was that the days of the one-man development team were just about over. Luckily, a friend of his had a nephew who had a friend who was, as Neurath would be the first to admit, a far better programmer than he would ever be.

Doug Church was an MIT undergraduate who had let himself get so consumed by the fun going on inside the university’s computer labs that it had all but derailed his official education. He and his buddies spent hours every day hacking on games and playing them. Their favorite was a 3D tank game called Xtank, written by one of their number, a fellow student named Terry Donahue. They tinkered with its code endlessly, producing variations that departed radically from the original concept, such as a Frisbee simulator. When not coding or playing, they talked about what kinds of games they would like to make, if only they had infinite amounts of money and time and no hardware limitations whatsoever. They envisioned all sorts of little simulated worlds, rendered, naturally, in photo-realistic 3D graphics. Thus when Neurath introduced himself to Church in early 1990 and asked if he’d like to work on a free-scrolling, texture-mapped 3D dungeon crawl running in real time, Church dropped his classes and rushed to get in on the chance. (Terry Donahue would doubtless have been another strong candidate to become lead programmer on the project, but he felt another calling; he would go on to become a priest.)

Neurath also found himself an artist, a fellow named Doug Wike who had worked on various projects for Origin in New Hampshire before those offices had been shuttered. Together the three men put together a crude non-interactive demo in a matter of weeks, showing the “player” moving up a texture-mapped dungeon corridor and bumping into a monster at the end of it. At the beginning of June, they took the demo to the Summer Consumer Electronics Show, where, behind all of the public-facing hype, many of the games industry’s most important deals got made.

As Neurath tells the story, the response from publishers was far from overwhelming. The demo was undeniably crude, and most were highly skeptical that this unproven new company could get from it to a real, interactive game. It turned out that the only publisher willing to give the project any serious consideration at all was none other than Neurath’s old friends from Origin.

That Neurath hadn’t taken his idea to Origin straight away was down to his awareness of a couple of strategic decisions that had recently been made there, part of a whole collection of changes intended to greet the new decade’s challenges. Origin had, first of all, decided to stop giving contracts to outside developers, taking all development in-house so as to have complete control over the products they released. And secondly, they had decided, for the time being anyway, to make all of their output fit into one of two big franchises, Ultima and Wing Commander. Both of these decisions would seem to exclude Blue Sky’s proposed dungeon crawler, which they were calling simply Underworld, from becoming an Origin product. Nor did it help that a sexy public demonstration of the first Wing Commander game[1] had become the hit of the show, making it difficult for Origin to focus on anything else; they could practically smell the money they were about to make from their new franchise.

Luckily, Blue Sky and Underworld found a champion at Origin even amidst all the distractions. Warren Spector was a relatively recent arrival at the company, but Neurath knew him pretty well; as his very first task for Origin, Spector had spent about a month expanding and polishing the text in Space Rogue just before its release. Now, looking at Underworld, he was sure he saw not just a game with real commercial potential but a technologically and aesthetically important one. “I was blown away,” he says today. “I remember thinking as I watched that demo that the world had just changed.” Spector convinced his colleagues to take a chance, to violate their rule of in-house development and sign a contract with Blue Sky, giving them a modest advance of $30,000. If the game worked out, they might be in on the ground floor of something major. It might also be something they could brand with the Ultima name, make into the beginning of a whole new sub-series in the franchise — a revival of the first-person (albeit turn-based) dungeons that had been in every Ultima through Ultima V. And if it didn’t work out, the $30,000 they’d lose on the flier was far from a fortune. The deal was done.

With that mission accomplished, Neurath’s little team returned to the office space he’d rented for them in New Hampshire. They spent almost a year there trying to understand the new set of technical affordances which Chris Green’s texture mapper had put at their disposal. They didn’t invent anything fundamentally new in terms of 3D graphics technology during that time. Like the texture mapper which spawned the project, everything they put into Underworld could be found in any number of books and journals at the MIT library, many of them dating well back into the 1970s and even 1960s. It was just a matter of adapting it all to the MS-DOS architecture. As it happened, the hardware they had to work with was about equal to the cutting-edge research workstations of a decade earlier, so the old journal articles they pored over were actually a pretty good fit for it.

They kept coming back to the theme of embodiment, what Neurath called “a feeling of presence beyond what other games give you.” None of the earlier dungeon crawlers — not even those in the Dungeon Master tradition that ran in real time — had been able to deliver this. They could be exciting, stressful, even terrifying, but they never gave you the feeling of being physically embodied in their environments. It was the difference between reading a book or watching a movie and really being someplace.

It went without saying that Underworld must place you in control of just one character rather than the usual party of them. You needed to be able to sense the position of “your” body and limbs in the virtual space. Neurath:

We wanted to get a feeling that you were really in this dungeon. What would you expect to do in a dungeon? You might need to jump across a narrow chasm. You might expect to batter down a wooden door. You might expect to look up if there was a precipice above you. All these sorts of physical activities. And we tried to achieve, at least to a reasonable degree, that kind of freedom of motion and freedom of action. That really extended the R&D stage. It was about nine months, even a year, before we had all the underlying technology in place that allowed us to visualize this fantasy universe in a manner that we felt was appropriate and would work well and would allow the player the freedom to maneuver around and perform different kinds of actions.

Over the course of this time, Neurath hired only one more programmer, one Jonathan “J.D.” Arnold, who had previously worked on Infocom’s Z-Machine technology in that company’s twilight years. But finally, in the late spring of 1991, with the basic interface and the basic technical architecture all in place, Neurath decided it was time to hire some more people and make a real game out of it all. Doug Church immediately thought of his old friends back at MIT, and Neurath had no objections to recruiting from that pool; they were smart and passionate and, just as importantly, they were all happy to work for peanuts. Given the time of year it was, Church’s old buddies were all either graduating or finishing up their semester of coursework, leaving them free to come to Blue Sky.

None of these people had ever worked on a commercial computer game before. In fact, most of them hadn’t even played any commercial computer games recently, having been ensconced for the last several years inside the ivory tower of MIT, where the nature of gaming was markedly different, being a culture of creation rather than strictly one of consumption. And yet, far from being a disadvantage, the team’s sheer naivete proved to be the opposite, making them oblivious to the conventional wisdom about what was possible. Doug Church:

I had actually played Space Rogue because one of my friends had a Mac, but the clusters [at MIT] were all Unix boxes so I ran X-Trek and NetHack and things, but I hadn’t played a PC game in five years or something. So we just said, “Let’s do a really cool dungeon game in 3D, let’s go.” It’s interesting because a lot of people talk about how we were doing such a Dungeon Master game, but as far as I know none of us had ever played Dungeon Master. We didn’t have any idea we were doing anything that wasn’t just obvious in some sense because we had no context and the last time any of us had played a [commercial] game was back when we were fourteen. We played games in college, but they were very different; you’re playing networked X-Trek or something, it doesn’t feel like a home-computer game.

At first, the new arrivals all crowded into the New Hampshire office Neurath was renting. But most of them were actually living together in a rambling old three-story house in Cambridge, Massachusetts, and it struck them as silly to make the drive out to New Hampshire every day. They soon convinced Neurath to let them work on the game from home. From dawn until night, seven days a week, they ate, drank, slept, and breathed Underworld there.

At a time when most studios had begun to systematize the process of game development, dividing their employees into rigid tiers of specialists — programmers, artists, designers, writers — Blue Sky made a virtue of their complete lack of formal organization. It was an org-chart-wielding middle manager’s nightmare; just about everybody wound up doing a little bit of everything. There was nothing like a designer giving instructions to a technical team. Instead, Blue Sky’s method of working was more akin to the way that things got done among the hackers at MIT — a crowd of equals pulling together (and occasionally pulling apart) to work toward a common goal. Anyone could contribute absolutely anywhere, knowing his ideas would be judged only on their intrinsic worth.

When it became clear that it was time to start making the actual dungeon the Underworld player would have to explore, the team divided up this design work in the most democratic manner imaginable: everybody made one level, and the levels were then combined to make the eight-level final dungeon. Dan Schmidt, who had officially been hired for the role of “AI programmer,” agreed to take on the mantle of “writer,” which really meant coordinating with everyone to merge the levels into a seamless whole.

For most of the time the game was in development, Origin’s role and overall interest — or, rather, lack thereof — was a consistent sore spot. It often seemed to Blue Sky that the folks in Austin had entirely forgotten their existence way off in the frozen wastes of New England. This was good in the sense that they got to make exactly the game they wanted to make, but it didn’t do much for their confidence that a committed publisher would be ready and eager to market it properly when they were done. Warren Spector was busy with Wing Commander and, later, with an Ultima spinoff called Martian Dreams, so Origin initially assigned Jeff Johannigman to Blue Sky in the role of producer. Communication with him was nothing short of terrible. After going two full months without hearing a peep from him, Neurath tried to call him down in Austin, only to be told that he had left the company. A second producer was finally selected, but he wasn’t much more engaged. Blue Sky believed they were making a great, groundbreaking game, but it seemed that Origin really couldn’t care less.

In many ways, Underworld was at odds with the prevailing trends inside Origin, not to mention in much of the games industry at large. Following the huge success of the first Wing Commander, Origin was banking heavily on cinematic games with big, set-piece storylines. The company’s org chart reflected the new impetus, with film-making terminology — producer, director, screenwriter — shoehorned in absolutely everywhere. Blue Sky, on the other hand, was making something very different, an immersive, emergent, non-linear experience without cut scenes or chapter breaks. Yes, there was a plot of sorts — the player got cast into a dungeon to rescue a princess or die trying — along with puzzles to be solved, quests to be fulfilled, and other characters to be spoken to, but it was all driven by the player, not by any relentlessly unspooling Hollywood-style script. Origin, it seemed, wasn’t quite sure what to make of it, wasn’t quite sure where it fit. And certainly it’s easy enough, given Blue Sky’s unorthodox working methods, to understand why so many at Origin were skeptical of their ability to deliver a finished game at all.

The danger of Blue Sky’s approach was that they would keep iterating endlessly as they kept having better and better ideas. This tendency among hackers to never be able to finish something and walk away from it had already derailed more than one promising games studio — not least among them FTL, the makers of the storied Dungeon Master, who had yet to release a proper followup after some four years. (Dungeon Master II wouldn’t finally arrive until 1995.) The need to finish games on a timetable was, one might say, the reason that industry executives had begun to impose the very organizational structures that Blue Sky was now so happily eschewing. Doug Church remembers creating “four movement systems and three combat systems because we’d just write something: ‘Oh, this seems cool, go for it.'” Would they just continue chasing whatever shiny objects struck their fancy until the money ran out? That wouldn’t take much longer, given that Paul Neurath was largely financing the whole effort out of his pocket, with some help from his ever-loyal friend Ned Lerner, whose success with his Chuck Yeager flight simulators had left him with a bit of money to spare.

Thus they were all fortunate that Warren Spector, their once and future savior, suddenly returned to the scene late in 1991. Virtually alone among his colleagues down in Austin, Spector had been watching Blue Sky’s progress with intense interest. Now, having finished up Martian Dreams, he got himself assigned as Underworld‘s third producer. He had considerable clout inside the bigger company; as soon as he started to press the issue there, things started to happen on Origin’s side to reassure Blue Sky that their game would in fact be released if they could only deliver it.

Indeed, after almost eighteen months of uncertainty on the question, Origin finally made it official that, yes, Underworld would be released as an Ultima game. As usual, the star would be the Avatar, who was becoming quite a busy fellow between this game, the mainline Ultima games, and the recent pair of Worlds of Ultima spinoffs. The dungeon in question, meanwhile, would be none other than the Stygian Abyss, where the Avatar had found the Codex of Ultimate Wisdom at the end of Ultima IV. Underworld‘s backstory would need to be bent and hammered enough to make this possible.

Blue Sky soon discovered that becoming an official Ultima game, while great for marketing purposes and for their own sense of legitimacy, was something of a double-edged sword. Origin demanded that they go back through all the text in the game to insert Ultima‘s trademark (and flagrantly misused) “thees” and “thous,” provoking much annoyance and mockery. And Origin themselves made a cinematic introduction for the game in Austin, featuring Richard Garriott, one of the industry’s worst voice actors of all time — and that, friends, is really saying something — in the leading role, bizarrely mispronouncing the word “Stygian.” It seems no one at Origin, much less at Blue Sky, dared to correct Lord British’s diction… (The British magazine PC Review‘s eventual reaction to the finished product is one for the ages: “I had to listen to it two or three times before I fully grasped what was going on because for the first couple of times I was falling about laughing at the badly dubbed Dick Van Dyke cockney accents that all these lovable Americans think we sound like. You know: ‘Awlright, Guv’noor, oop the happle un stairs!'”)

While Origin made the dodgy intro in Texas, Warren Spector got everybody in New England focused on the goal of a finished, shipped game. Doug Church:

Not only was he [Spector] great creatively to help us put finishing touches on it and clean it up and make it real, but he also knew how to finish projects and keep us motivated and on track. He had that ability to say, “Guys, guys, you’re focused in totally the wrong place.” He had that ability to help me and the rest of the guys reset, from the big-picture view of someone who has done it before and was really creative, but who also understood getting games done. It was a huge, huge win.

It’s very easy in hacker-driven game development to wind up with a sophisticated simulation that’s lots of fun for the programmers to create but less fun to actually play. Spector was there to head off this tendency as well at Blue Sky, as when he pared down an absurdly complex combat system to something simple and intuitive, or when he convinced the boys not to damage the player’s character every time he accidentally bumped into a wall. That, said Spector, “doesn’t sound like fun to me” as a player — and it was the player’s fun, he gently taught Blue Sky, that had to be the final arbiter.

At Spector’s behest, Neurath rented a second office near Boston — officially known as the “Finish Underworld Now” office — and insisted that everyone leave the house and come in to work there every day during the last two months of the project. The more businesslike atmosphere helped them all focus on getting to the end result, as did Spector himself, who spent pretty much all of those last two months in the office with the team.

Spector did much to make Blue Sky feel like a valued part of the Origin family, but the relationship still remained rocky at times — especially when the former learned that the latter intended to release Ultima Underworld just two weeks before Ultima VII, the long-awaited next title in the franchise’s main series. It seemed all but certain that their game would get buried under the hype for Ultima VII, would be utterly forgotten by Origin’s marketers. Certainly marketing’s initial feedback hadn’t been encouraging. They were, they said, having trouble figuring out how to advertise Ultima Underworld. Its graphics were spectacular when seen in motion, but in still screenshots they didn’t look like much at all compared to a Wing Commander II or an Ultima VII. Blue Sky seethed with frustration, certain this was just an excuse for an anemic, disinterested advertising campaign.

In Origin’s defense, the problem their marketers pointed to was a real one. And it wasn’t really clear what they could have done about the release-date issue either. The original plan had been, as they didn’t hesitate to remind Blue Sky, to release Ultima Underworld in time for the Christmas of 1991, but the protracted development had put paid to that idea. Now, Blue Sky themselves needed Ultima Underworld to come out as quickly as possible because they needed the royalties in order to survive; for them, delaying it was simply impossible. Meanwhile Origin, who had cash-flow concerns of their own, certainly wasn’t going to delay Ultima VII, quite possibly the most expensive computer game ever made to that point, for a mere spinoff title. The situation was what it was.

The balloons fly as Doug Church, Paul Neurath, and Warren Spector celebrate Ultima Underworld‘s release.

Whatever was to happen in terms of sales, Blue Sky’s young hackers did get the satisfaction in late March of 1992 of seeing their game as a boxed product on store shelves, something more than one of them has described as a downright surreal experience. Dan Schmidt:

We were a bunch of kids straight out of school. This was the first professional project we’d ever done. We felt lucky that anyone would see it at all. We’d go into a games store and see our game there on the shelf. Someone would walk up to it, and we’d want to say, “No! No! You don’t want to buy that! We just hacked that together. It’s not, like, a real game.”

In the beginning, sales went about as expected. A snapshot from Origin’s in-house newsletter dated July 31, 1992, shows 71,000 copies of Ultima VII shipped, just 41,000 copies of Ultima Underworld. But, thanks to ecstatic reviews and strong word of mouth — Origin may have struggled to see how groundbreaking the game really was, but gamers got it immediately — Ultima Underworld kept on selling, getting stronger every month. “It was the first game that ever gave me a sense of actually being in a real place,” wrote one buyer in a letter to Origin, clear evidence that Blue Sky had absolutely nailed their original design goal. Soon industry scuttlebutt had it outselling Ultima VII by two to one. Paul Neurath claims that Ultima Underworld eventually sold more than half a million copies worldwide, an extraordinary figure for the time, and considerably more than Ultima VII or, indeed, any previous Ultima had managed.

Shortly after Ultima Underworld‘s release, Paul Neurath and Ned Lerner finally did the obvious: they merged their two companies. They had recently discovered that another, slightly older company was already operating under the name of “Blue Sky Software,” making educational products. So, they named the merged entity Looking Glass Technologies. Their first release under the name would be Ultima Underworld II.

Two months after the first Ultima Underworld appeared, a tiny company out of Dallas, Texas, who called themselves id Software released Wolfenstein 3D, another first-person game set in a 3D environment. Their game, however, had none of the complexity of Ultima Underworld, with its quests and puzzles and magic spells and its character to develop and even feed. In id’s game, you ran through the environment and killed things — period.

For the remainder of the 1990s, 3D games would exist on a continuum between the cool, high-concept innovation of Looking Glass and the hot, visceral action of id, who were interested in innovation in the area of their graphics technology but somewhat less so in terms of their basic gameplay template. id would win the argument in terms of sales, but Looking Glass would make some of the most fascinating and forward-looking games of the decade. “We were thinking, ‘Why don’t we just run around and shoot?’” says Austin Grossman, another early Looking Glass employee. “But we were interested in simulation and depth. We were driven by this holy grail of simulated worlds that enabled choice and creativity of the player.”

We’ll be following the two companies’ artistic dialog for a long time to come as we continue with this history. First, though, we need to give Ultima Underworld a closer look, from the perspective of the player this time, to understand why it’s not just an example of groundbreaking technology but a superb example of pure game design as well.

(Sources: the books Game Design: Theory & Practice 2nd edition by Richard Rouse III, Ultima VII and Underworld: More Avatar Adventures by Caroline Spector, Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, and Principles of Three-Dimensional Computer Animation: Modeling, Rendering, and Animating with 3D Computer Graphics by Michael O’Rourke; Questbusters of February 1992 and September 1992; PC Review of June 1992; Game Developer of April/May 1995, June/July 1995, August/September 1995, December 1995/January 1996, and April/May 1996; Commodore Magazine of January 1988; Origin’s internal newsletter Point of Origin from January 17 1992, March 27 1992, May 8 1992, August 28 1992, and December 18 1992. Online sources include “Ahead of Its Time: A History of Looking Glass” on Polygon, an interview with Paul Neurath and Doug Church on the old Ultima Online site, Gambit Game Lab’s interviews with Paul Neurath and Dan Schmidt, and Matt Barton’s interview with Paul Neurath. My thanks to Dan Schmidt and Ned Lerner for making the time to talk with me personally about their careers.

Ultima Underworld and its sequel can be purchased from GOG.com.)

Footnotes
1 Wing Commander was actually still known as Wingleader at this time.
 
 


Star Control II

In this vaguely disturbing picture of Toys for Bob from 1994, Paul Reiche is at center and Fred Ford to the left. Ken Ford, who joined shortly after Star Control II was completed, is to the right.

There must have been something in the games industry’s water circa 1992 when it came to the subject of sequels. Instead of adhering to the traditional guidelines — more of the same, perhaps a little bigger — the sequels of that year had a habit of departing radically from their predecessors in form and spirit. For example, we’ve recently seen how Virgin Games released a Dune II from Westwood Studios that had absolutely nothing to do with the same year’s Dune I, from Cryo Interactive. But just as pronounced is the case of Accolade’s Star Control II, a sequel which came from the same creative team as Star Control I, yet which was so much more involved and ambitious as to relegate most of what its predecessor had to offer to the status of a mere minigame within its larger whole. In doing so, it made gaming history. While Star Control I is remembered today as little more than a footnote to its more illustrious successor, Star Control II remains as passionately loved as any game from its decade, a game which still turns up regularly on lists of the very best games ever made.



Like those of many other people, Paul Reiche III’s life was irrevocably altered by his first encounter with Dungeons & Dragons in the 1970s. “I was in high school,” he remembers, “and went into chemistry class, and there was this dude with glasses who had these strange fantasy illustrations in front of him in these booklets. It was sort of a Napoleon Dynamite moment. Am I repulsed or attracted to this? I went with attracted to it.”

In those days, when the entire published corpus of Dungeons & Dragons consisted of three slim, sketchy booklets, being a player all but demanded that one become a creator — a sort of co-designer, if you will — as well. Reiche and his friends around Berkeley, California, went yet one step further, becoming one of a considerable number of such folks who decided to self-publish their creative efforts. Their most popular product, typed out by Reiche’s mother on a Selectric typewriter and copied at Kinko’s, was a book of new spells called The Necromican.

That venture eventually crashed and burned when it ran afoul of that bane of all semi-amateur businesses, the Internal Revenue Service. It did, however, help to secure for Reiche what seemed the ultimate dream job to a young nerd like him: working for TSR itself, the creator of Dungeons & Dragons, in Lake Geneva, Wisconsin. He contributed to various products there, but soon grew disillusioned by the way that his own miserable pay contrasted with the rampant waste and mismanagement around him, which even a starry-eyed teenage RPG fanatic like him couldn’t fail to notice. The end came when he spoke up in a meeting to question the purchase of a Porsche as an executive’s company car. That got him “unemployed pretty dang fast,” he says.

So, he wound up back home, attending the University of California, Berkeley, as a geology major. But by now, it was the 1980s, and home computers — and computer games — were making their presence felt among the same sorts of people who tended to play Dungeons & Dragons. In fact, Reiche had been friends for some time already with one of the most prominent designers in the new field: Jon Freeman of Automated Simulations, designer of Temple of Apshai, the most sophisticated of the very early proto-CRPGs. Reiche got his first digital-game credit by designing The Keys of Acheron, an “expansion pack” for Temple of Apshai‘s sequel Hellfire Warrior, for Freeman and Automated. Not long after, Freeman had a falling-out with his partner and left Automated to form Free Fall Associates with his wife, programmer Anne Westfall. He soon asked Reiche to join them. It wasn’t a hard decision to make: compared to the tabletop industry, Reiche remembers, “there was about ten times the money in computer games and one-tenth the number of people.”

Freeman, Westfall, and Reiche made a big splash very quickly, when they were signed as one of the first group of “electronic artists” to join a new publisher known as Electronic Arts. Free Fall could count not one but two titles among EA’s debut portfolio in 1983: Archon, a chess-like game where the pieces fought it out with one another, arcade-style, under the players’ control; and Murder on the Zinderneuf, an innovative if not entirely satisfying procedurally-generated murder-mystery game. While the latter proved to be a slight commercial disappointment, the former more than made up for it by becoming a big hit, prompting the trio to make a somewhat less successful sequel in 1984.

After that, Reiche parted ways with Free Fall to become a sort of cleanup hitter of a designer for EA, working on whatever projects they felt needed some additional design input. With Evan and Nicky Robinson, he put together Mail Order Monsters, an evolution of an old Automated Simulations game of monster-movie mayhem, and World Tour Golf, an allegedly straight golf simulation to which the ever-whimsical Reiche couldn’t resist adding a real live dinosaur as the mother of all hazards on one of the courses. Betwixt and between these big projects, he also lent a helping hand to other games: helping to shape the editor in Adventure Construction Set, making some additional levels for Ultimate Wizard.

Another of these short-term consulting gigs took him to a little outfit called Binary Systems, whose Starflight, an insanely expansive game of interstellar adventure, had been in production for a couple of years already and showed no sign of being finished anytime soon. This meeting would, almost as much as his first encounter with Dungeons & Dragons, shape the future course of Reiche’s career, but its full import wouldn’t become clear until years later. For now, he spent two weeks immersed in the problems and promise of arguably the most ambitious computer game yet proposed, a unique game in EA’s portfolio in that it was being developed exclusively for the usually business-oriented MS-DOS platform rather than a more typical — and in many ways more limited — gaming computer. He bonded particularly with Starflight‘s scenario designer, an endlessly clever writer and artist named Greg Johnson, who was happily filling his galaxy with memorable and often hilarious aliens to meet, greet, and sometimes beat in battle.

Reiche’s assigned task was to help the Starflight team develop a workable conversation model for interacting with all these aliens. Still, he was thoroughly intrigued with all aspects of the project, so much so that he had to be fairly dragged away kicking and screaming by EA’s management when his allotted tenure with Binary Systems had expired. Even then, he kept tabs on the game right up until its release in 1986, and was as pleased as anyone when it became an industry landmark, a proof of what could be accomplished when designers and programmers had a bigger, more powerful computer at their disposal — and a proof that owners of said computers would actually buy games for them if they were compelling enough. In these respects, Starflight served as nothing less than a harbinger of computer gaming’s future. At the same time, though, it was so far out in front of said future that it would stand virtually alone for some years to come. Even its sequel, released in 1989, somehow failed to recapture the grandeur of its predecessor, despite running in the same engine and having been created by largely the same team (including Greg Johnson, and with Paul Reiche once again helping out as a special advisor).

Well before Starflight II‘s release, Reiche left EA. He was tired of working on other people’s ideas, ready to take full control of his own creative output for the first time since his independent tabletop work as a teenager a decade before. With a friend named Fred Ford, who was the excellent programmer Reiche most definitely wasn’t, he formed a tiny studio — more of a partnership, really — called Toys for Bob. The unusual name came courtesy of Reiche’s wife, a poet who knew the value of words. She said, correctly, that it couldn’t help but raise the sort of interesting questions that would make people want to look closer — like, for instance, the question of just who Bob was. When it was posed to him, Reiche liked to say that everyone who worked on a Toys for Bob game should have his own Bob in mind, serving as an ideal audience of one to be surprised and delighted.

Reiche and Ford planned to keep their company deliberately tiny, signing only short-term contracts with outsiders to do the work that they couldn’t manage on their own. “We’re just people getting a job done,” Reiche said. “There are no politics between [us]. Once you start having art departments and music departments and this department and that department, the organization gets a life of its own.” They would manage to maintain this approach for a long time to come, in defiance of all the winds of change blowing through the industry; as late as 1994, Toys for Bob would permanently employ only three people.

Yet Reiche and Ford balanced this small-is-beautiful philosophy with a determination to avoid the insularity that could all too easily result. They made it a policy to show Toys for Bob’s designs-in-progress to many others throughout their evolution, and to allow the contractors they hired to work on them the chance to make their own substantive creative inputs. For the first few years, Toys for Bob actually shared their offices with another little collective who called themselves Johnson-Voorsanger Productions. They included in their ranks Greg Johnson of Starflight fame and one Robert Leyland, whom Reiche had first met when he did the programming for Murder on the Zinderneuf — Anne Westfall had had her hands full with Archon — back in the Free Fall days. Toys for Bob and Johnson-Voorsanger, these two supposedly separate entities, cross-pollinated one another to such an extent that they might almost be better viewed as one. When the latter’s first game, the cult-classic Sega Genesis action-adventure ToeJam & Earl, was released in 1991, Reiche and Ford made the credits for “Invaluable Aid.” And the influence which Leyland and particularly Johnson would have on Toys for Bob’s games would be if anything even more pronounced.

Toys for Bob’s first game, which they developed for the publisher Accolade, was called Star Control. With it, Reiche looked all the way back to the very dawn of digital gaming — to the original Spacewar!, the canonical first full-fledged videogame ever, developed on a DEC PDP-1 at the Massachusetts Institute of Technology circa 1962. In Star Control as in Spacewar!, two players — ideally, two humans, but potentially one human and one computer player, or even two computer players if the “Cyborg Mode” is turned on — fight it out in an environment that simulates proper Newtonian physics, meaning objects in motion stay in motion until a counter-thrust is applied. Players also have to contend with the gravity wells of the planets around them — these in place of the single star which affects the players’ ships in Spacewar! — as they try to blow one another up. But Star Control adds to this formula a wide variety of ships with markedly differing weaponry, defensive systems, sizes, and maneuvering characteristics. In best rock-paper-scissors fashion, certain units have massive advantages over others and vice versa, meaning that a big part of the challenge is that of maneuvering the right units into battle against the enemy’s. As in real wars, most of the battles are won or lost before the shooting ever begins, being decided by the asymmetries of the forces the players manage to bring to bear against one another. Reiche:

It was important to us that each alien ship was highly differentiated. What it means is, unlike, say, Street Fighter, where your characters are supposedly balanced with one another, our ships weren’t balanced at all, one on one. One could be very weak, and one could be very strong, but the idea was, your fleet of ships, your selection of ships in total, was as strong as someone else’s, and then it came down to which match-up did you find. One game reviewer called it, “Rock, Scissors, Vapor,” which I thought was a great expression.

Of course, even the worst match-ups leave a sliver of hope that a brilliant, valorous performance on the field of battle can yet save the day.
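In implementation terms, the Newtonian flight model described above reduces to a very simple per-frame update: a ship's velocity carries over from frame to frame, and thrust and nearby gravity wells only nudge it. The following is a minimal sketch in C of one such update step; all the names and constants are invented for illustration, and it is of course not Toys for Bob's actual code.

#include <math.h>
#include <stdio.h>

typedef struct { float x, y, vx, vy; } Ship;

/* One frame of motion: velocity persists from frame to frame (Newton), while
   thrust and the planet's gravity well merely nudge it.  Gravity falls off
   with the square of the distance to the planet. */
void step(Ship *s, float thrust_x, float thrust_y,
          float planet_x, float planet_y, float gravity, float dt)
{
    float dx = planet_x - s->x, dy = planet_y - s->y;
    float dist2 = dx * dx + dy * dy;
    float dist  = sqrtf(dist2);
    float ax = thrust_x + gravity * dx / (dist2 * dist);
    float ay = thrust_y + gravity * dy / (dist2 * dist);

    s->vx += ax * dt;
    s->vy += ay * dt;
    s->x  += s->vx * dt;
    s->y  += s->vy * dt;
}

int main(void)
{
    /* A ship coasting past a planet at the origin, with no thrust applied. */
    Ship s = { -100.0f, 0.0f, 0.0f, 5.0f };
    for (int i = 0; i < 5; i++) {
        step(&s, 0.0f, 0.0f, 0.0f, 0.0f, 2000.0f, 1.0f);
        printf("pos=(%6.1f, %6.1f)  vel=(%5.2f, %5.2f)\n", s.x, s.y, s.vx, s.vy);
    }
    return 0;
}

With no thrust applied, the ship in this sketch simply curves around the planet as its momentum carries it along, which goes some way toward explaining why so many newcomers to the combat found themselves drifting helplessly past their targets.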

You can play Star Control in “Melee” mode as a straight-up free-for-all. Each player gets seven unique ships from the fourteen in the game, from which she gets to choose one for each battle. First player to destroy all of her opponent’s ships wins. But real strategy — that is to say, strategy beyond the logic of rock-paper-scissors match-ups — comes into play only with the full game, which takes the form of a collection of scenarios where each player must deploy her fleet over a galactic map. In the more complex scenarios, controlling more star systems means more resources at one’s disposal, which can be used to build more and better ships at a player’s home starbase; this part of the game draws heavily from the beloved old Atari 8-bit classic Star Raiders. A scenario editor is also included for players who get bored with the nine scenarios that come with the game.

Star Control strains nobly to accommodate many different play styles and preferences. Just as it’s possible to turn on Cyborg Mode in the strategy game and let the computer do the fighting, it’s also possible to turn on “Psytron Mode” and let the computer do the strategy while you concentrate on blowing stuff up.

Star Control in action. The red ship is the infamous Syreen Penetrator.

Yet the aspect of Star Control that most players seem to remember best has nothing to do with any of these efforts to be all things to all players. At some point in the development process, Reiche and Ford realized they needed a context for all this interstellar violence. They came up with an “Alliance of Free Stars” — which included Earthlings among its numbers — fighting a war against the evil “Ur-Quan Hierarchy.” Each group of allies/thralls conveniently consists of seven species, each with their own unique model of spaceship. Not being inclined to take any of this too seriously, Toys for Bob let their whimsy run wild in creating all these aliens, enlisting Greg Johnson — the creator of the similarly winsome and hilarious aliens who inhabit the galaxy of Starflight — to add his input as well. The rogue’s gallery of misfits, reprobates, and genetic oddities that resulted can’t help but make you smile, even if they are more fleshed out in the manual than on the screen.

Reiche on the origins of the Illwrath, a race of arachnid fundamentalists who “receive spiritual endorsement in the accomplishment of vicious surprise attacks”:

The name “Illwrath” comes from an envelope I saw at the post office, which was being sent to a Ms. McIlwrath in Glasgow, Scotland. I didn’t see the “Mc” at first, and I swear, my first thought was that they must be sending that envelope to an alien. I am sure that somewhere there is a nice little Scottish lady laughing and saying, “Oh, those crazy Americans! Here’s one now calling me an evil, giant, religiously-intolerant space spider — ha, ha, ha, how cute!” Hmm… on second thought, if I am ever found beaten with bagpipes or poisoned with haggis, please contact the authorities.

Around the office, Fred Ford liked to say that the Illwrath had become so darn evil by first becoming too darn righteous, wrapping right around the righteousness scale and yielding results akin to all those old computer games which suddenly started showing negative statistics if you built up your numbers too far. (Personally, I favor this idea greatly, and, indeed, even believe it might serve as an explanation for certain forces in current American politics.)

Reiche on the Mmrnmhrm, an “almost interesting robot race” who “fear vowels almost as much as they do a Dreadnought closing in at full bore”:

When I first named the Mmrnmhrm, they actually had a pronounceable name, with vowels and everything. Then, in a sketch for the captain’s window illustration, I forgot to give them a mouth. Later, someone saw the sketch and asked me how they talked, so I clamped my lips shut and said something like, “Mrrk nsss,” thereby instituting a taboo on vowels in anything related to the race. Though the Mmrnmhrm ended up looking more like Daleks than Humans, the name stuck.

Reiche on the Syreen, a group of “humanoid females” who embody — knowingly, one likes to believe — every cliché about troglodyte gamers and the fairer sex, right down to their bulbous breasts that look like they’re filled with sand (their origin story also involves the San Francisco earthquake of 1989):

It was an afternoon late last October in San Francisco when Fred Ford, Greg Johnson, and I sat around a monitor trying to name the latest ship design for our new game. The space vessel on the computer screen looked like a copper-plated cross between Tin Tin’s Destination Moon rocketship and a ribbed condom. Needless to say, we felt compelled to christen this ship carefully, with due consideration for our customers’ sensibilities as well as our artistic integrity. “How about the Syreen Penetrator?” Fred suggested without hesitation. Instantly, the ground did truly rise up and smite us! WHAM-rumble-rumble-WHAM! We were thrown around our office like the bridge crew of the starship Enterprise when under fire by the Klingons. I dimly remember standing in a doorframe, watching the room flex like a cheap cardboard box and shouting, “Maybe that’s not such a great name!” and “Gee, do you think San Francisco’s still standing?” Of course, once the earth stopped moving, we blithely ignored the dire portent, and the Syreen’s ship name, “The Penetrator,” was graven in code.

Since then, we haven’t had a single problem. I mean, everyone has a disk crash two nights before a program is final, right? And hey, accidents happen. Brake pads just don’t last forever! My limp is really not that bad, and Greg is almost speaking normally these days.

Star Control was released in 1990 to cautiously positive reviews and reasonable sales. For all its good humor, it proved a rather polarizing experience. The crazily fast-paced action game at its heart was something that about one-third of players seemed to take to and love, while the rest found it totally baffling, being left blinking and wondering what had just happened as the pieces of their exploded ship drifted off the screen about five seconds after a fight had begun. For these people, Star Control was a hard sell: the strategic game just wasn’t deep enough to stand on its own for long, and, while the aliens described in the manual were certainly entertaining, this was a computer game, not a Douglas Adams book.

Still, the game did sufficiently well that Accolade was willing to fund a sequel. And it was at this juncture that, as I noted at the beginning of this article, Reiche and Ford and their associates went kind of nuts. They threw out the less-than-entrancing strategy part of the first game, kept the action part and all those wonderful aliens, and stuck it all into a grand adventure in interstellar space that owed an awful lot to Starflight — more, one might even say, than it owed to Star Control I.

As in Starflight, you roam the galaxy in Star Control II: The Ur-Quan Masters to avert an apocalyptic threat, collecting precious resources and even more precious clues from the planets you land on, negotiating with the many aliens you meet and sometimes, when negotiations break down, blowing them away. The only substantial aspect of the older game that’s missing from its spiritual successor is the need to manage a bridge crew who come complete with CRPG-style statistics. Otherwise, Star Control II does everything Starflight does and more. The minigame of resource collection on planets’ surfaces, dodging earthquakes and lightning strikes and hostile lifeforms, is back, but now it’s faster paced, with a whole range of upgrades you can add to your landing craft in order to visit more dangerous planets. Ditto space combat, which is now of the arcade style from Star Control I — if, that is, you don’t have Cyborg Mode turned on, which is truly a godsend, the only thing that makes the game playable for many of us. You still need to upgrade your ship as you go along to fight bigger and badder enemies and range faster and farther across space, but now you also can collect a whole fleet of support ships to accompany you on your travels (thus preserving the rock-paper-scissors aspect of Star Control I). I’m not sure that any of these elements could quite carry a game alone, but together they’re dynamite. Much as I hate to employ a tired reviewer’s cliché like “more than the sum of its parts,” this game makes it all but unavoidable.

And yet the single most memorable part of the experience for many or most of us remains all those wonderful aliens, who have been imported from Star Control I and, even better, moved from the pages of the manual into the game proper. Arguably the most indelible of them all, the one group of aliens that absolutely no one ever seems to forget, are the Spathi, a race of “panicked mollusks” who have elevated self-preservation into a religious creed. Like most of their peers, they were present in the first Star Control but really come into their own here, being oddly lovable despite starting the game on the side of the evil Ur-Quan. The Spathi owe more than a little something to the Spemin, Starflight‘s requisite species of cowardly aliens, but are based at least as much, Reiche admits a little sheepishly, on his own aversion to physical danger. Their idea of the perfect life was taken almost verbatim from a conversation about same that Reiche and Ford once had over Chinese food at the office. Here, then, is Reiche and the Spathi’s version of the American Dream:

I knew that someday I would be vastly rich, wealthy enough to afford a large, well-fortified mansion. Surrounding my mansion would be vast tracts of land, through which I could slide at any time I wished! Of course, one can never be too sure that there aren’t monsters hiding just behind the next bush, so I would plant trees to climb at regular, easy-to-reach intervals. And being a Spathi of the world, I would know that some monsters climb trees, though often not well, so I would have my servants place in each tree a basket of perfect stones. Not too heavy, not too light — just the right size for throwing at monsters.

“Running away and throwing rocks,” explains Reiche, “extrapolated in all ways, has been one of my life strategies.”

The Shofixti, who breed like rabbits. Put the one remaining female in the galaxy together with the one remaining male, wait a couple of years… and poof, you have an army of fuzzy little warmongers on your side. They fight with the same enthusiasm they have for… no, we won’t go there.

My personal favorite aliens, however, are the bird-like Pkunk, a peaceful, benevolent, deeply philosophical race whose ships are nevertheless fueled by the insults they spew at their enemies during battle. They are, of course, merely endeavoring to make sure that their morality doesn’t wrap back around to zero and turn them evil like the Ilwrath. “Never be too good,” says Reiche. “Insults, pinching people when they aren’t looking… that’ll keep you safe.”

In light of the aliens Greg Johnson had already created for Starflight, not to mention the similarities between Starflight‘s Spemin and Star Control‘s Spathi, there’s been an occasional tendency to perhaps over-credit his contribution — valuable though it certainly was — to Toys for Bob’s own space epic. Yet one listen to Reiche and Ford in interviews should immediately disabuse anyone of the notion that the brilliantly original and funny aliens in Star Control II are there entirely thanks to Johnson. Listen to Reiche in particular for a few minutes, and it becomes blindingly obvious that his is the sense of humor behind the Spathi and so many others. Indeed, anyone who has played the game can get a sense of this just from reading some of his quotes in this very article.

There’s a rich vein of story and humor running through even the most practical aspects of Star Control II, as in this report from a planet’s surface. The two complement one another rather than clashing, perhaps because Toys for Bob is clever enough to understand that less is sometimes more. Who are the Liebermann triplets? Who knows? But the line makes you laugh, and that’s the important thing. When a different development team took the reins to make a Star Control III, Reiche’s first piece of advice to them was, “For God’s sake, don’t try to explain everything.” Many a lore-obsessed modern game could afford to take the same advice to heart.

Long after every other aspect of the game has faded from memory, its great good humor, embodied in all those crazy aliens, will remain. It may be about averting a deadly serious intergalactic apocalypse, but, for all that, Star Control II is as warm and fuzzy a space opera as you’ll ever see.

Which isn’t to say that it doesn’t go in for plot. In fact, the sequel’s plot is as elaborate as its predecessor’s was thin; the backstory alone takes up some twenty pages in the manual. The war which was depicted in Star Control I, it turns out, didn’t go so well for the good guys; the sequel begins with you entering our solar system in command of the last combat-worthy craft among a shattered and defeated Alliance of Free Stars. The Ur-Quan soon get wind of your ship’s existence and the last spark of defiance against their rule that it represents, and send a battlefleet toward Earth to snuff it out. And so the race is on to rebuild the Alliance and assemble a fleet of your own before the Ur-Quan arrive. How you do so is entirely up to you. Suffice to say that Earth’s old allies are out there. It’s up to you to find the aliens and convince them to join you in whatever sequence seems best, while finding the resources you need to fuel and upgrade your spaceship and juggling a whole lot of other problems at the same time. This game is as nonlinear as they come.

Star Control II takes itself seriously in the places where it’s important to do so, but never too seriously. Anyone bored with the self-consciously “dark” fictions that so often dominate in our current era of media will find much to appreciate here.

When asked to define what makes a good game, Paul Reiche once said that it “has to have a fun core, which is a one-sentence description of why it’s fun.” Ironically, Star Control II is an abject failure by this standard, pulling in so many directions as to defy any such holistic description. It’s a strategy game of ship and resource management; it’s an action game of ship-versus-ship combat; it’s an adventure game of puzzle-solving and clue-tracking. Few cross-genre games have ever been quite so cross-genre as this one. It really shouldn’t work, but, for the most part anyway, it does. If you’re a person whose ideal game lets you do many completely different things at every session, this might just be your dream game. It really is an experience of enormous richness and variety, truly a game like no other. Small wonder that it’s attracted a cult of players who will happily declare it to be nothing less than the best game ever made.

For my part, I have a few too many reservations to go quite that far. Before I get to them, though, I’d like to let Reiche speak one more time. Close to the time of Star Control II‘s release, he outlined his four guiding principles of game design. Star Control II conforms much better to these principles than it does to the standard of the “one-sentence description.”

First, [games should be] fun, with no excuses about how the game simulates the agony and dreariness of the real world (as though this was somehow good for you). Second, they [should] be challenging over a long period of time, preferably with a few ability “plateaus” that let me feel in control for a period of time, then blow me out of the water. Third, they [should] be attractive. I am a sucker for a nice illustration or a funky riff. Finally, I want my games to be conceptually interesting and thought-provoking, so one can discuss the game with an adult and not feel silly.

It’s in the intersection between Reiche’s first and second principles that I have my quibbles with Star Control II. It’s a rather complicated, difficult game by design, which is fair enough as long as it’s complex and difficult in a fun way. Some of its difficulty, however, really doesn’t strike me as being all that much fun at all. Those of you who’ve been reading this blog for a while know that I place enormous weight on fairness and solubility when it comes to the games I review, and don’t tend to cut much slack to those that can only be enjoyed and/or solved with a walkthrough or FAQ to hand. On this front, Star Control II is a bit problematic, due largely to one questionable design choice.

Star Control II, you see, has a deadline. You have about five years before Earth is wiped out by the Ur-Quan (more precisely, by the eviller of the two factions of the Ur-Quan, but we won’t get into that here). Fans will tell you, by no means entirely without justification, that this is an essential part of the game. One of the great attractions of Star Control II is its dynamic universe which just keeps evolving, with or without your intervention: alien spaceships travel around the galaxy just like yours is doing, alien races conquer others and are themselves conquered, etc.

All of this is undoubtedly impressive in a game of any vintage, let alone one as old and technologically limited as this one. And the feeling of inhabiting such a dynamic universe is bracing for anyone used to the more static norm, where things only happen when you push them to happen. Yet it also has its drawbacks, the most unfortunate of which is the crushing sense of futility that comes after putting dozens of hours into the game only to lose it irrevocably. The try-and-try-again approach can work in small, focused games that don’t take long to play and replay, such as the early mysteries of Infocom. In a sprawling epic like this, however… well, does anyone really want to put those dozens of hours in all over again, clicking through page after page of the same text?

Star Control II‘s interface felt like something of a throwback even in its own time. By 1992, computer games had almost universally moved to the mouse-driven point-and-click model. Yet this game relies entirely on multiple-choice menus, activated by the cursor keys and/or a joystick. Toys for Bob was clearly designing with possible console ports in mind. (Star Control was ported to the Sega Genesis, but, as it happened, Star Control II would never get the same honor, perhaps because its sales didn’t quite justify the expense and/or because its complexity was judged unsuited to the console market.) Still, for all that it’s a little odd, the interface is well thought-through, and you get used to it quickly.

There’s an undeniable tension between this rich galaxy, full of unusual sights and entertaining aliens to discover, and the need to stay relentlessly on-mission if you hope to win in the end. I submit that the failure to address this tension is, at bottom, a failure of game design. There’s much that could have been done. One solution might have been to tie the evolving galaxy to the player’s progress through the plot rather than the wall clock, a technique pioneered in Infocom’s Ballyhoo back in 1986 and used in countless narrative-oriented games since. It can convey the impression of rising danger and a skin-of-the-teeth victory every time without ever having to send the player back to square one. In the end, the player doesn’t care whether the exhilarating experience she’s just had is the result of a meticulous simulation coincidentally falling into place just so, or of a carefully manipulated sleight of hand. She just remembers the subjective experience.
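For the technically curious, here is a minimal sketch of that distinction, written in Python with purely hypothetical names and nothing drawn from the game’s actual code: one function escalates the threat as in-game months tick by, the other escalates it only as the player crosses story milestones.

from dataclasses import dataclass, field

@dataclass
class Galaxy:
    milestones_reached: set = field(default_factory=set)
    threat_level: int = 0

# Wall-clock escalation: the enemy fleet advances as months elapse, whether
# or not the player has accomplished anything. Miss the deadline and the
# game is lost outright.
def advance_by_clock(galaxy, months_elapsed, deadline):
    galaxy.threat_level = months_elapsed
    return months_elapsed < deadline  # False means Earth has fallen

# Milestone escalation: the fleet advances only when the player reaches a
# story beat, so the sense of rising danger tracks progress rather than the
# calendar, and a hopeless, unwinnable slog can never develop.
ESCALATION_ORDER = ["met_first_ally", "found_quasi_space", "rebuilt_alliance"]

def advance_by_milestone(galaxy, milestone):
    galaxy.milestones_reached.add(milestone)
    galaxy.threat_level = sum(
        1 for m in ESCALATION_ORDER if m in galaxy.milestones_reached
    )

if __name__ == "__main__":
    g = Galaxy()
    advance_by_milestone(g, "met_first_ally")
    advance_by_milestone(g, "found_quasi_space")
    print(g.threat_level)  # 2: danger rises with progress, not with time

The point of the sketch is only that the plot-keyed alternative requires very little machinery; the hard part, as always, is writing story beats that feel as organic as a ticking clock.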

But if such a step is judged too radical — too counter to the design ethos of the game — other remedies could have been employed. To name the most obvious, the time limit could have been made more generous; Starflight too has a theoretical time limit, but few ever come close to reaching it. Or the question of time could have been left to the player — seldom a bad strategy in game design — by letting her choose a generous, moderate, or challenging time limit before starting the game. (This approach was used to good effect by the CRPG The Magic Candle among plenty of other titles over the years.)

Instead of remedying the situation, however, Reiche and his associates seemed actively determined to make it worse with some of their other choices. To have any hope of finishing the game in time, you need to gain access to a new method of getting around the galaxy, known as “quasi-space,” as quickly as possible. Yet the method of learning about quasi-space is one of the more obscure puzzles in the game, mentioned only in passing by a couple of the aliens you meet, all too easy to overlook entirely. Without access to quasi-space, Star Control II soon starts to feel like a fundamentally broken, unbalanced game. You trundle around the galaxy in your truck of a spaceship, taking months to reach your destinations and months more to return to Earth, burning up all of the minerals you can mine just to feed your engines. And then your time runs out and you lose, never having figured out what you did wrong. This is not, needless to say, a very friendly way to design a game. Had a few clues early on shouted, “You need to get into quasi-space and you may be able to do so here!” just a little more loudly, I might not have felt the need to write any of the last several paragraphs.

I won’t belabor the point any more, lest the mob of Star Control II zealots I can sense lurking in the background, sharpening their pitchforks, should pounce. I’ll say only that this game is, for all its multifaceted brilliance, also a product of its time — a time when games were often hard in time-extending but not terribly satisfying ways, when serious discussions about what constituted fair and unfair treatment of the player were only just beginning to be had in some quarters of the industry.

Searching a planet’s surface for minerals, lifeforms, and clues. Anyone who has played Starflight will feel right at home with this part of the game in particular.

Certainly, whatever our opinion of the time limit and the game’s overall fairness, we have to recognize what a labor of love Star Control II was for Paul Reiche, Fred Ford, and everyone who helped bring it to fruition, from Greg Johnson and Robert Leyland to all of the other writers and artists and testers who lent it their talents. Unsurprisingly given its ambition, the project went way beyond the year or so Accolade had budgeted for it. When their publisher put their foot down and said no more money would be forthcoming, Reiche and Ford reached deep into their own pockets to carry it through the final six months.

As the project was being wrapped up, Reiche realized he still had no music, and only about $1500 left for acquiring some. His solution was classic Toys for Bob: he ran an online contest for catchy tunes, with prizes of $25, $50, and $100 — in addition to the opportunity to hear one’s music in (hopefully) a hit game, of course. The so-called “tracker” scene in Europe stepped up with music created on Commodore Amigas, a platform for which the game itself would never be released. “These guys in Europe [had] just built all these ricky-tink programs to play samples out,” says Reiche. “They just kept feeding samples, really amazing soundtracks, out into the net just for kicks. I can’t imagine any of these people were any older than twenty. It makes me feel like I’m part of a bigger place.”

Upon its release on November 30, 1992 — coincidentally, the very same day as Dune II, its companion in mislabeled sequels — Star Control II was greeted with excellent reviews, whose enthusiasm was blunted only by the game’s sheer unclassifiability. Questbusters called it “as funny a parody of science-fiction role-playing as it is a well-designed and fun-to-play RPG,” and named it “Best RPG of the Year” despite it not really being a CRPG at all by most people’s definitions. Computer Gaming World placed it on “this reviewer’s top-ten list of all time” as “one of the most enjoyable games to review all year,” and awarded it “Adventure Game of the Year” alongside Legend Entertainment’s far more traditional adventure Eric the Unready.

Sales too were solid, if not so enormous as Star Control II‘s staying power in gamers’ collective memory might suggest. Like Dune II, it was probably hurt by being billed as a sequel to a game likely to appeal most to an entirely different type of player, as well as by the seeming indifference of Accolade. In the eyes of Toys for Bob, the developer/publisher relationship was summed up by the sticker the latter started putting on the box after Star Control II had collected its awards: “Best Sports Game of 1992.” Accolade was putting almost all of their energy into sports games during this period, didn’t have stickers handy for anything else, and just couldn’t be bothered to print up some new ones.

Still, the game did well enough that Toys for Bob, after having been acquired by a new CD-ROM specialist of a publisher called Crystal Dynamics, ported it to the 3DO console in 1994. This version added some eight hours of spoken dialog, but cut a considerable amount of content that the voice-acting budget wouldn’t cover. Later, a third Star Control would get made — albeit not by Toys for Bob but by Legend Entertainment, through a series of intellectual-property convolutions we won’t go into in this article.

Toys for Bob themselves have continued to exist right up to the present day, a long run indeed in games-industry terms, albeit without ever managing to return to the Star Control universe. They’re no longer a two-man operation, but do still have Paul Reiche III and Fred Ford in control.

To this day, Star Control II remains as unique an experience as it was in 1992. You’ve never played a game quite like this one, no matter how many other games you’ve played in your time. Don’t even try to categorize it. Just play it, and see what’s possible when a talented design team throws out all the rules. But before you do, let me share just one piece of advice: when an alien mentions something about a strange stellar formation near the Chandrasekhar constellation, pay attention! Trust me, it will save you from a world of pain…

(Sources: Compute!’s Gazette of November 1984; Compute! of January 1992 and January 1993; Computer Gaming World of November 1990, December 1990, March 1993, and August 1993; InterActivity of November/December 1994; Questbusters of January 1993; Electronic Gaming Monthly of May 1991; Sega Visions of June 1992; Retro Gamer 14 and 15. Online sources include Ars Technica‘s video interview with Paul Reiche III and Fred Ford; Matt Barton’s interviews with the same pair in Matt Chat 95, 96, and 97; Grognardia‘s interview with Reiche; The Escapist‘s interview with Reiche; GameSpot‘s interview with Reiche.

Star Control I and II are available as a package purchase at GOG.com. Another option for experiencing Star Control II is The Ur-Quan Masters, a loving open-source re-creation based on Toys for Bob’s 3DO source code.)
