
Controlling the Spice, Part 3: Westwood’s Dune

Brett Sperry and Louis Castle

Louis Castle first became friends with Brett Sperry in 1982, when the two were barely out of high school. Castle was selling Apple computers at the time at a little store in his native Las Vegas, and Sperry asked him to print out a file for him. “I owned a printer, so I invited him over,” remembers Castle, “and he looked at some animation and programming I was working on.”

They found they had a lot in common. They were both Apple II fanatics, both talented programmers, and both go-getters accustomed to going above and beyond what was expected of them. Through Castle’s contacts at the store — the home-computer industry was quite a small place back then — they found work as contract programmers, porters who moved software from one platform to another. It wasn’t the most glamorous job in the industry, but, at a time when the PC marketplace was fragmented into close to a dozen incompatible platforms, it was certainly a vital one. Sperry and Castle eventually came to specialize in the non-trivial feat of moving slick action games such as Dragonfire and Impossible Mission from the Commodore 64 to the far less audiovisually capable Apple II without sacrificing all of their original appeal.

In March of 1985, they decided to give up working as independent contractors and form a real company, which they named Westwood Associates. The “Westwood” came from the trendy neighborhood of Los Angeles, around the UCLA campus, where they liked to hang out when they drove down from Las Vegas of a weekend. “We chose Westwood as the company name,” says Castle, “to capture some of the feeling of youthful energy and Hollywood business.” The “Associates,” meanwhile, was nicely non-specific, meaning they could easily pivot into other kinds of software development if the games work should dry up for some reason. (The company would become known as Westwood Studios in 1992, by which time it would be pretty clear that no such pivot would be necessary.)

The story of Westwood’s very first project is something of a harbinger of their future. Epyx hired them to port the hoary old classic Temple of Apshai to the sexy new Apple Macintosh, and Sperry and Castle got a bit carried away. They converted the game from a cerebral turn-based CRPG to a frenetic real-time action-adventure, only to be greeted with howls of protest from their employers. “Epyx felt,” remembers Castle with no small sense of irony, “that gamers would not want to make complicated tactical and strategic decisions under pressure.” More sensibly, Epyx noted that Westwood had delivered not so much a port as a different game entirely, one they couldn’t possibly sell as representing the same experience as the original. So, they had to begrudgingly switch it back to turn-based.

This blind alley really does have much to tell us about Westwood’s personality. Asked many years later what common thread binds together their dizzily eclectic catalog of games, Louis Castle hit upon real-time gameplay as the one reasonable answer. This love of immediacy would translate, as we’ll soon see, into the invention of a whole new genre known as real-time strategy, which would become one of the most popular of them all by the end of the 1990s.

But first, there were more games to be ported. Having cut their teeth making Commodore 64 games work within the constraints of the Apple II, they now found themselves moving them in the other direction: “up-porting” Commodore 64 hits like Super Cycle and California Games to the Atari ST and Commodore Amiga. Up-porting was in its way as difficult as down-porting; owners of those more expensive 16-bit machines expected their capabilities to be used to good effect, even by games that had originated on more humble platforms, and complained loudly at straight, vanilla ports that still looked like they were running on an 8-bit computer. Westwood became one of the best in the industry at a very tricky task, not so much porting their source games in any conventional sense as remaking them, with dramatically enhanced graphics and sound. They acquired a reputation for technical excellence, particularly when it came to their compression systems, which allowed them to pack their impressive audiovisuals into very little space and stream them in quickly from disk. And they made good use of the fact that the Atari ST and Amiga were both built around the same Motorola 68000 CPU by developing a library for the Amiga which translated calls to the ST’s operating system into their Amiga equivalents on the fly; thus they could program a game for the ST and get the same code running on the Amiga with very few changes. If you wanted an 8-to-16-bit port done efficiently and well, you knew you could count on Westwood.

Although they worked with quite a number of publishers, Westwood cultivated a particularly close relationship with SSI, a publisher of hardcore wargames who badly needed whatever pizazz Sperry and Castle's flashier aesthetic could provide. When SSI wanted to convince TSR to give them the hugely coveted Dungeons & Dragons license in 1987, they hired Westwood to create some of the graphics demos for their presentation. The pitch worked; staid little SSI shocked the industry by snatching the license right out from under the noses of heavier hitters like Electronic Arts. Westwood remained SSI's most trusted partner thereafter. They ported the "Gold Box" line of Dungeons & Dragons CRPGs to the Atari ST and Amiga with their usual flair, adding mouse support and improving the graphics, resulting in what many fans consider to be the best versions of all.

Unfortunately, Westwood’s technical excellence wasn’t always paired with equally good design sense when they occasionally got a chance to make an original game of their own. Early efforts like Mars Saga, Mines of Titan, Questron II, and BattleTech: The Crescent Hawk’s Inception all have a lot of ideas that aren’t fully worked through and never quite gel, along with third acts that fairly reek of, “We’re out of time and money, and now we just have to get ‘er done.” Ditto the first two original games they did for SSI under the Dungeons & Dragons license: the odd California Games/Gold Box mashup Hillsfar and the even odder dragon flight simulator Dragon Strike.

Still, Brett Sperry and Louis Castle were two very ambitious young men, and neither was willing to settle for the anonymous life of a strict porting house. Nor did such a life make good business sense: with the North American market at least slowly coalescing around MS-DOS machines, it looked like porting houses might soon have no reason to exist. The big chance came when Sperry and Castle convinced SSI to let them make a full-fledged Dungeons & Dragons CRPG of their own — albeit one that would be very different from the slow-paced, turn-based Gold Box line. Westwood’s take on the concept would run in — you guessed it — real time, borrowing much from FTL’s Dungeon Master, one of the biggest sensations of the late 1980s on the Atari ST and Amiga. The result was Eye of the Beholder.

At the time of the game’s release in February of 1991, FTL had yet to publish an MS-DOS port of Dungeon Master. Eye of the Beholder was thus the first real-time dungeon crawl worth its salt to become available on North America’s computer-gaming platform of choice, and this fact, combined with the Dungeons & Dragons logo on the box, yielded sales of 130,000 copies in the United States alone — a sales figure far greater than that of any previous original Westwood game, greater even than all but the first two of SSI’s flagship Gold Box line. The era of Westwood as primarily a porting house had passed.


Over at Virgin Games, the indefatigable Martin Alper, still looking to make a splash in the American market, liked what he saw in Westwood, this hot American developer who clearly knew how to make the sorts of games Americans wanted to buy. And yet they were also long-established experts at getting the most out of the Amiga, Europe’s biggest gaming computer; Westwood would do their own port of Eye of the Beholder to the Amiga, in which form it would sell in considerable numbers in Europe as well. Such a skill set made the little Las Vegas studio immensely attractive to this executive of Virgin, a company of truly global reach and vision.

Alper knew as soon as he saw Eye of the Beholder that he wanted to make Westwood a permanent part of the Virgin empire, but, not wanting to spook his target, he approached them initially only to ask them to develop a game for him. As far as Alper or anyone else outside Virgin’s French subsidiary knew at this point, the Cryo Dune game was dead. But Alper hadn’t gone to all the trouble of securing the license not to use it. In April of 1991 — just one month before the departure of Jean-Martial Lefranc from Virgin Loisirs, combined with a routine audit, would bring the French Dune conspiracy to light — Alper signed Westwood to make a Dune game of their own. It wasn’t hard to convince them to take it on; it turned out that Dune was Brett Sperry’s favorite novel of all time.

Even better, Westwood, perhaps influenced by their association with the turn-based wargame mavens at SSI, had already been playing around with ideas for a real-time (of course!) game of military conflict. “It was an intellectual puzzle for me,” says Sperry. “How can we take this really small wargame category, bring in some fresh ideas, and make it a fun game that more gamers can play?” The theme was originally to be fantasy. But, says Louis Castle, “when Virgin offered up the Dune license, that sealed our fate and pulled us away from a fantasy theme.”

Several months later, after Martin Alper reluctantly concluded that Cryo’s Dune had already cost too much money and had too much potential of its own to cancel, he found himself with quite a situation on his hands. Westwood’s Dune hadn’t been in development anywhere near as long as Cryo’s, but he was already loving what he had seen of it, and was equally unwilling to cancel that project. In an industry where the average game frankly wasn’t very good at all, having two potentially great ones might not seem like much of a problem. For Virgin’s marketers, however, it was a nightmare. Their solution, which pleased neither Cryo nor Westwood much at all, was to bill the latter’s game as a sequel to the former’s, naming it Dune II: The Building of a Dynasty.

Westwood especially had good reason to feel disgruntled. They were understandably concerned that saddling their fresh, innovative new game with the label of sequel would cause it to be overlooked. The fact was, the sequel billing made no sense whatsoever, no matter how you looked at it. While both games were, in whole or in part, strategy games that ran in real time, their personalities were otherwise about as different as it was possible for two games to be. By no means could one imagine a fan of Cryo’s plot-heavy, literary take on Dune automatically embracing Westwood’s action-heavy, militaristic effort. Nor did the one game follow on from the other in the sense of plot chronology; both games depict the very same events from the novel, albeit with radically different sensibilities.

The press too was shocked to learn that a sequel to Cryo’s Dune was due to be released the very same year as its predecessor. “This has got to be a new world record for the fastest ever followup,” wrote the British gaming magazine The One a few weeks after the first Dune’s release. “Unlike the more adventure-based original, Dune II is expected to be more of a managerial experience comparable to (if anything) the likes of SimCity, as the two warring houses of Atreides and Harkonnen attempt to mine as much spice as possible and blow each other up at the same time.”

The Westwood Studios team who made Dune II. On the front row are Ren Olsen and Dwight Okahara; on the middle row are Judith Peterson, Joe Bostic, Donna Bundy, and Aaron Powell; on the back row are Lisa Ballan and Scott Bowen. Of this group, Bostic and Powell were the game’s official designers, and thus probably deserve the most credit for inventing the genre of real-time strategy. Westwood’s co-founder Brett Sperry also played a critical — perhaps the critical — conceptual role.

It was, on the whole, about as good a description of Dune II as any that appeared in print at the time. Not only was the new game dramatically different from its predecessor, but it wasn’t quite like anything at all which anyone had ever seen before, and coming to grips with it wasn’t easy. Legend has it that Brett Sperry started describing Dune II in shorthand as “real-time strategy” very early on, thus providing a new genre with its name. If so, though, Virgin’s marketers didn’t get the memo. They would struggle mightily to describe the game, and what they ended up with took unwieldiness to new heights: a “strategy-based resource-management simulation with a heavy real-time combat element.” Whew! “Real-time strategy” does have a better ring to it, doesn’t it?

These issues of early taxonomy, if you will, are made intensely interesting by Dune II’s acknowledged status as the real-time-strategy urtext. That is to say that gaming histories generally claim, correctly on the whole in my opinion, that it was the first real-time strategy game ever.

Yet we do need to be careful with our semantics here. There were actually hundreds of computerized strategy games prior to Dune II which happened to be played in real time, not least among them Cryo’s Dune. The neologism of “real-time strategy” (“RTS”) — like, say, those of “interactive fiction” or even “CRPG” — has a specific meaning separate from the meanings of the individual words which comprise it. It has come to denote a very specific type of game — a game that, yes, runs in real time, but also one where players start with a largely blank slate, gather resources, and use them to build a variety of structures. These structures can in turn build military units who can carry out simple orders of the “attack there” or “defend this” stripe autonomously. The whole game plays on an accelerated time scale which yields bursts if not sustained plateaus of activity as frantic as any action game. This combination of qualities is what Westwood invented, not the abstract notion of a strategy game played in real time rather than turns.

Of course, all inventions stand on the shoulders of those that came before, and RTS is no exception. It can be challenging to trace the bits and pieces which would gel together to become Dune II only because there are so darn many of them.

Utopia

The earliest strategy game to replace turns with real time may have been Utopia, an abstract two-player game of global conquest designed and programmed by Don Daglow for the Intellivision console in 1982. The same year, Dan Bunten’s[1] science-fiction-themed Cytron Masters and Chris Crawford’s Roman-themed Legionnaire became the first computer-based strategy games to discard the comfortable round of turns for something more stressful and exciting. Two years later, Brøderbund’s very successful The Ancient Art of War exposed the approach to more players than ever before.

In 1989, journalists started talking about a new category of “god game” in the wake of Will Wright’s SimCity and Peter Molyneux’s Populous. The name derived from the way that these games cast you as a god able to control your people only indirectly, by altering their city’s infrastructure in SimCity or manipulating the terrain around them in Populous. This control was accomplished in real time. While, as we’ve seen, this in itself was hardly a new development, the other innovations of these landmark games were as important to the eventual RTS genre as real time itself. No player can possibly micromanage an army of dozens of units in real time — at least not if the clock is set to run at anything more than a snail’s pace. For the RTS genre as we’ve come to know it to function, units must have a degree of autonomous artificial intelligence, must be able to carry out fairly abstract orders and react to events on the ground in the course of doing so. SimCity and Populous demonstrated for the first time how this could work.

By 1990, then, god games had arrived at a place that already bore many similarities to the RTS games of today. The main things still lacking were resource collecting and building. And even these things had to some extent already been done in non-god games: a 1987 British obscurity called Nether Earth demanded that you build robots in your factory before sending them out against your enemy, although there was no way of building new structures beyond your starting factory. Indeed, even the multiplayer death matches that would come to dominate so much of the RTS genre a generation later had already been pioneered before 1990, perhaps most notably in Dan Bunten’s 1988 game Modem Wars.

Herzog Zwei

But the game most often cited as an example of a true RTS in form and spirit prior to Dune II, if such a thing is claimed to exist at all, is one called Herzog Zwei, created by the Japanese developer Technosoft and first published for the Sega Genesis console in Japan in 1989. And yet Herzog Zwei’s status as an alternative RTS urtext is, at the very least, debatable.

Players each start the game with a single main base, and an additional nine initially neutral “outposts” are scattered over the map. Players “purchase” units in the form of Transformers-like flying robots, which they then use to try to conquer outposts; controlling more of them yields more revenue, meaning one can buy more units more quickly. Units aren’t completely out of the player’s direct control, as in the case of SimCity and Populous, but are ordered about in a rather general way: stand and fight here, patrol this radius, retreat to this position or outpost. The details are then left to the unit-level artificial intelligence. For this reason alone, perhaps, Herzog Zwei subjectively feels more like an RTS than any game before it. But on the other hand, much that would come to mark the genre is still missing: resource collection is still abstracted away entirely, while there’s only one type of unit available to build, and no structures. In my opinion, Herzog Zwei is best seen as another of the RTS genre’s building blocks rather than an urtext.

The question of whether and to what extent Herzog Zwei influenced Dune II is a difficult one to answer with complete assurance. Brett Sperry and Louis Castle have claimed not to even have been aware of the Japanese game’s existence prior to making theirs. In fact, out of all of the widely acknowledged proto-RTS games I’ve just mentioned, they cite only Populous as a major influence. Their other three stated inspirations make for a rather counter-intuitive trio on the face of it: the 1984 Apple II game Rescue Raiders, a sort of Choplifter mated to a strategic wargame; the 1989 NEC TurboGrafx-16 game Military Madness, an abstract turn-based strategy game; and, later in the development process, Sid Meier’s 1991 masterpiece Civilization (in particular, the tech tree therein).

Muddying these waters, however, is an anecdote from Stephen Clarke-Willson, an executive in Virgin’s American offices during the early 1990s. He says that “everyone at the office was playing Herzog Zwei” circa April of 1991: “I was given the task of figuring out what to do with the Dune license since I’d read the book a number of times. I thought from a gaming point of view the real stress was the battle to control the spice, and that a resource-strategy game would be good.” Clarke-Willson further claims that from the outset “Westwood agreed to make a resource-strategy game based on Dune, and agreed to look at Herzog Zwei for design ideas.” Sperry and Castle, by contrast, describe a far more open-ended agreement that called for them simply to make something interesting out of the license, allowing the specifics of their eventual Dune to arise organically from the work they had already started on their fantasy-themed real-time wargame.

For what it’s worth, neither Sperry nor Castle has a reputation for dishonesty. Quite the opposite, in fact: Westwood throughout its life stood out as a bastion of responsibility and stability in an industry not much known for either. So, whatever the true facts may be, we’re better off ascribing these contradictory testimonies to the vagaries of memories than to disingenuousness. Certainly, regardless of the exact influences that went into it, Dune II has an excellent claim to the title of first RTS in the modern neologism’s sense. This really was the place where everything came together and a new genre was born.

In the novel of Dune, the spice is the key to everything. In the Westwood game, even in the absence of almost everything else that makes the novel memorable, the same thing is true. The spice was, notes Louis Castle, “very adaptable to this harvest, grow, build for war, attack gambit. That’s really how [Dune II] came about.” Thus was set up the gameplay loop that still defines the RTS genre to this day — all stemming from a novel published in 1965.

The overarching structure of Dune II is also far more typical of the games of today than those of its peers in the early 1990s. You play a “campaign” consisting of nine scenarios, linked by snippets of narrative, that grow progressively more difficult. There are three of these campaigns to choose from, depicting the war for Arrakis from the standpoint of House Atreides, House Harkonnen, and House Ordos — the last being a cartel of smugglers who don’t appear in the novel at all, having been invented for a non-canonical 1984 source book known as The Dune Encyclopedia. In addition to a different narrative, each faction has a slightly different slate of structures and units at its command.

There’s the suggestion of a more high-level strategic layer joining the scenarios together: between scenarios, the game lets you choose your next target for attack by clicking on a territory on a Risk-like map of the planet. Nothing you do here can change the fixed sequence of scenario goals and opposing enemy forces the game presents, but it does change the terrain on which the subsequent scenario takes place, thus adding a bit more replayability for the true completionists.

You begin a scenario with a single construction yard, a handful of pre-built units, and a sharply limited initial store of spice, that precious resource from which everything else stems. Fog of war is implemented; in the beginning, you can see only the territory that immediately surrounds your starting encampment. You’ll thus want to send out scouts immediately, to find deposits of spice ripe for harvesting and to learn where the enemy is.

While your scouts go about their business, you’ll want to get an economy of sorts rolling back at home. The construction yard with which you begin can build any structure available in a given scenario, although it’s advisable to first build a “concrete slab” to serve as its foundation atop the shifting sands of Arrakis. The first real structure you’re likely to build is a “wind trap” to provide power to those that follow. Then you’ll want a “spice refinery,” which comes complete with a unit known as a “harvester,” able to collect spice from the surrounding territory and return it to the refinery to become the stuff of subsequent building efforts. Next you’ll probably want an “outpost,” which not only lets you see much farther into the territory around your base without having to deploy units there but is a prerequisite for building any new units at all. After your outpost is in place, building each type of unit requires its own kind of structure, from a “barracks” for light infantry (read: cannon fodder) to a “high tech factory” for the ultimate weapon of airpower. Naturally, more powerful units are more expensive, both in terms of the spice required to build the structures that produce them and that required to build the units themselves afterward.
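The chain of prerequisites described above amounts to a small dependency graph. As a purely illustrative sketch (this is not Westwood's code, and the relationships below are simplified from the paragraph above rather than taken from the game's exact tech tree), it might look like this:

```python
# Hypothetical, simplified prerequisite table for the early-game build
# order described above; a depth-first walk yields a valid construction order.
PREREQS = {
    "wind trap": ["construction yard"],
    "spice refinery": ["wind trap"],
    "outpost": ["spice refinery"],
    "barracks": ["outpost"],
    "high tech factory": ["outpost"],
}

def build_order(target, prereqs):
    """Return a construction order that satisfies every prerequisite of `target`."""
    order = []
    def visit(name):
        for dep in prereqs.get(name, []):
            visit(dep)
        if name not in order:
            order.append(name)
    visit(target)
    return order

print(build_order("high tech factory", PREREQS))
# ['construction yard', 'wind trap', 'spice refinery', 'outpost', 'high tech factory']
```

The point of the sketch is simply that the player is solving a dependency graph under time pressure, with spice as the cost of every node.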

Your real goal, of course, is to attack and overwhelm the enemy — or, in some later scenarios, enemies — before he or they have the chance to do the same to you. There’s a balancing act here that one could describe as the central dilemma of the game. Just how long do you concentrate on building up your infrastructure and military before you throw your units into battle? Wait too long and the enemy could get overwhelmingly powerful before you cut him down to size; attack too soon and you could be defeated and left exposed to counterattack, having squandered the units you now need for defense. The amount of spice on the map is another stress point. The spice deposits are finite; once they’re gone, they’re gone, and it’s up to whatever units are left to battle it out. Do you stake your claim to that juicy spice deposit just over the horizon right now? Or do you try to eliminate that nearby enemy base first?

If you’ve played any more recent RTS games at all, all of this will sound thoroughly familiar. And, more so than anything else I could write here, it’s this sense of familiarity, clinging as it does to almost every aspect of Dune II, which crystallizes the game’s influence and importance. The only substantial piece of the RTS puzzle that’s entirely missing here is the multiplayer death match; this game is single-player only, lacking the element that for many is the most appealing of all about the RTS genre. Otherwise, though, the difference between this and more modern RTS games is in the details rather than the fundamentals. This anointed first example of an RTS is a remarkably complete example of the breed. All the pieces are here, and all the pieces fit together as we’ve come to expect them to.

So much for hindsight. As for foresight…

Upon its release in the fall of 1992, Dune II was greeted, like its predecessor from Cryo, with positive reviews, but with none of the fanfare one might expect for a game destined to go down in history as such a revolutionary genre-spawner. Computer Gaming World called it merely “a gratifying experience,” while The One was at least a bit more effusive, with the reviewer pronouncing it “one of the most absorbing games I’ve come across.” Yet everyone regarded it as just another fun game at bottom; no one had an inkling that it would in time birth a veritable new gaming subculture. It sold well enough to justify its development, but — very probably thanks in part to its billing as a sequel to a game with a completely different personality, which had itself only been on the market a few months — it never threatened Eye of the Beholder for the crown of Westwood’s biggest hit to date.

Nor did it prompt an immediate flood of games in the same mold, whether from Westwood or anyone else. The next notable example of the budding genre, Blizzard’s Warcraft, wouldn’t appear until late 1994. That title would be roundly mocked by the gaming intelligentsia for its similarities to Dune II (Computer Gaming World would call it “a perfect bit of creative larceny”), but it would sell much, much better, well and truly setting the flame to the RTS torch. To many Warcraft fans, Westwood would seem like the bandwagon jumpers when they belatedly returned to the genre they had invented with 1995’s Command & Conquer.

By the time that happened, Westwood would be a very different place. Just as they were finishing up Dune II, Louis Castle got a call from Richard Branson himself. “Hello, Louis, this is Richard. I’d like to buy your company.”

“I didn’t know it was for sale,” replied Castle.

“In my experience, everything is for sale!”

And, indeed, notwithstanding their unhappiness about Dune II’s sequel billing, Brett Sperry and Louis Castle sold out to Virgin, with the understanding that their new parent company would stay out of their hair and let them make the games they wanted to make, holding them accountable only on the basis of the sales they generated. Unlike so many merger-and-acquisition horror stories, Westwood would have a wonderful relationship with Virgin and Martin Alper, who provided the investment they needed to thrive in the emerging new era of CD-ROM-based, multimedia-heavy gaming. We’ll doubtless be meeting Sperry, Castle, and Alper again in future articles.


Looked upon from the perspective of today, the two Dune games of 1992 make for an endlessly intriguing pairing, almost like an experiment in psychology or sociology. Not only did two development teams set out to make a game based on the same subject matter, but they each wound up with a strategy game running in real time. And yet the two games could hardly be more different.

In terms of historical importance, there’s no contest between the two Dunes. While Cryo’s Dune had no discernible impact on the course of gaming writ large, Westwood’s is one of the most influential games of the 1990s. A direct line can be traced from it to games played by tens if not hundreds of millions of people all over the world today. “He who controls the spice, controls the universe,” ran the blurb on the front cover of millions of Dune paperbacks and movie posters. Replace “spice” with the resource of any given game’s choice, and the same could be stated as the guiding tenet of the gaming genre Dune birthed.

And yet I’m going to make the perhaps-surprising claim that the less-heralded first Dune is the more enjoyable of the two to play today. Its fusion of narrative and strategy still feels bracing and unique. I’ve never seen another game which plays quite like this one, and I’ve never seen another ludic adaptation that does a better job of capturing the essential themes and moods of its inspiration.

Dune II, by contrast, can hardly be judged under that criterion at all, given that it’s just not much interested in capturing any of the subtleties of Herbert’s novel; it’s content to stop at “he who controls the spice controls the universe.” Judged on its own terms, meanwhile, strictly as a game rather than an adaptation, it’s become the ironic victim of its own immense influence. I noted earlier that all of the pieces of the RTS genre, with the exception only of the multiplayer death match, came together here for the first time, that later games would be left to worry only about the details. Yet it should also be understood that those details are important. The ability to give orders to groups of units; the ability to give more complex orders to units; ways to get around the map more quickly and easily; higher-resolution screens able to show more of the map at one time; a bigger variety of unit types, with greater variance between opposing factions; more varied and interesting scenarios and terrains; user-selectable difficulty levels (Dune II often seems to be stuck on “Brutal”)… later games would do all of this, and so much more besides. Again, these things do matter. Playing Dune II today is like playing your favorite RTS game stripped down to its most basic foundation. For a historian or a student of game design, that’s kind of fascinating. For someone who just wants to play a fun game, it’s harder to justify.

Still, none of this should detract from the creativity and sheer technical chops that went into realizing Dune II in its own time. Most gaming genres require some iteration to work out the kinks and hone the experience. The RTS genre in particular has been so honed by such a plethora of titles, all working within such a sharply demarcated set of genre markers, that Dune II is bound to seem like a blunt instrument indeed when we revisit it today.

So, there you have it: two disparate Dune games, both inspired and worthy, but in dramatically different ways. Dune as evocative storytelling experience or Dune as straightforward interactive ultra-violence? Take your pick. The choice seems appropriate for a novel that’s been pulled back and forth along much the same axis ever since its first publication in 1965. Does it have a claim to the mantle of High Literature or is it “just” an example of a well-crafted genre novel? Take your pick. The same tension shows itself in the troubled history of Dune as movie, in the way it could attract both filmmakers who pursued — or at least believed themselves to be pursuing — a higher artistic calling, like Alejandro Jodorowsky, and purveyors of the massiest of mass-market entertainments, like Arthur P. Jacobs. Dune as art film or Dune as blockbuster? Take your pick — but please, choose one or the other. Dino and Raffaella De Laurentiis, the first people to get an actual Dune film made, tried to split the difference, making it through a mainstream Hollywood studio with a blockbuster-sized budget, but putting all those resources in the hands of a director of art films. As we’ve seen, the result of that collision of sensibilities was unsatisfying to patrons of multiplexes and art-house theaters alike.

In that light, perhaps it really was for the best that Virgin wound up accidentally releasing two Dune games. Cryo’s Dune locked down the artsier side of Dune‘s split media personality, while Westwood’s was just good fun, satisfying the timeless urge of gamers to blow stuff up in entertaining ways. Thanks to a colossal bureaucratic cock-up at Virgin, there is, one might say, a Dune game for every Dune reader. Which one really is “better” is an impossible question to answer in the end. I’ve stated my opinion, but I have no doubt that plenty of you readers could make an equally compelling case in the other direction. So, vive la différence! With all due apologies to Frank Herbert, variety is the real spice of life.

(Sources: Computer Gaming World of April 1993, August 1993, and January 1995; Game Developer of June 2001; The One of October 1992, January 1993, and July 1993; Retro Gamer 90; Westwood Studios’s customer newsletter dated Fall 1992. Online sources include Louis Castle’s interview for Soren Johnson’s Designer Notes podcast, “Retro Throwback: Dune 2” by Cole Machin on CGM, “Build, gather, brawl, repeat: The history of real-time strategy games” by Richard Moss on Ars Technica, “A New Dawn: Westwood Studios 15th Anniversary” by Geoff Keighly with Amer Ajami on GameSpot, and “The Origin of Realtime Strategy Games on the PC” by Stephen Clarke-Willson on his blog Random Blts.

Feel free to download Dune II from right here, packaged so as to make it as easy as possible to get running using your chosen platform’s version of DOSBox.)

Footnotes

1 Dan Bunten died in 1998 as the woman Danielle Bunten Berry. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.

Controlling the Spice, Part 2: Cryo’s Dune

Philippe Ulrich

To hear him tell the story at any rate, Philippe Ulrich had always been destined to make a computer game out of Dune. On July 21, 1980, he was a starving young musician living in an attic closet in Paris without heat or electricity, having just been dropped by his tiny record label after his first album had stiffed. Threading his way through the tourists packing the Champs-Élysées that scorching summer day, he saw an odd little gadget called a Sinclair ZX80 in the window of an electronics shop. The name of the shop? Dune. His destiny was calling.

But a busy decade still lay between Ulrich and his Dune game. For now, he fell in love at first sight with the first personal computer he had ever seen. His only goal became to scrape together enough money to buy it. Through means fair or foul, he did so, and within a year he had sold his first game, a BASIC implementation of the board game Othello, to Sinclair’s French distributor. He soon partnered up with one Emmanuel Viau, a medical student eager to drop out of university and pursue his real love of programming games. The two pumped out arcade clones and educational drills to raise cash, and officially incorporated their own little software studio, ERE Informatique, on April 28, 1983.

Rémi Herbulot

ERE moved up from the ranks of regional developers and arcade-clone-makers to score their first big international hit thanks to one Rémi Herbulot, a financial controller at the automotive supplier Valeo who had learned BASIC to save his company money on accounting software, only to get himself hopelessly hooked on the drug that was programming to personalities like his. Without ever having seen the American Bill Budge’s landmark Pinball Construction Set, Herbulot wrote a program along the same lines: one that let you build your own pinball table from a box of interchangeable parts and then play and share it with your friends. As soon as Herbulot showed his pinball game to Ulrich, the latter knew that it had far more potential than anything ERE had made so far, and didn’t waste any time hiring the creator and publishing his creation. Upon its release in 1985, Macadam Bumper topped sales charts in both France and Britain, selling almost 100,000 copies in all. It was even picked up by the American publisher Accolade, who released it as Pinball Wizard and saw it get as high as number 5 on the American charts despite the competition from Pinball Construction Set. Just like that, ERE Informatique had made it onto the international stage. For a second act, Rémi Herbulot soon provided the action-adventure Crafton & Xunk — released as Get Dexter! in some places — and it too became a hit across Europe.

Yet none of the free spirits who made up ERE Informatique was much of a businessman — least of all Philippe Ulrich — and the little collective lived constantly on the ragged edge of insolvency. Hoping to secure the funding needed to make more ambitious games to suit the new 16-bit computers entering the market, Ulrich and Viau sold their company to the Lyon-based Infogrames, the largest games publisher in France, in June of 1987. The plan was for ERE to continue making their games, still under their old company name, while Infogrames quietly took care of the accounting and the publishing.

For the past year already, much of ERE’s energy had been absorbed by Captain Blood, a game designed by Ulrich himself and a newer arrival named Didier Bouchon, a student of biology, interior design, film, and painting whom Ulrich liked to describe as his company’s very own “mad scientist.” And, indeed, Captain Blood was something of a Frankenstein’s monster of a game, combining a fractal-based space-flight simulator with a conversation engine that had you talking with the aliens you met in an invented symbolic language. With its Giger-inspired tangles of onscreen organics and technology and a color palette dominated by neon blues and deep purples, it was all extremely strange stuff, looking and playing more like a conceptual-art installation than a videogame. Not least strange was the plot, which cast the player as a programmer who got sucked into an alternate dimension inside his computer, then saw his identity fractured into six by a “hyperspace accident.” Now he must scour the galaxy to find and destroy his clones and reconstitute his full identity. In a major publicity coup, Ulrich managed to convince the famous composer and keyboardist Jean-Michel Jarre to license to ERE the piece of music that became the game’s main theme. Such a collaboration matched perfectly with the company’s public persona, which depicted their games not so much as commercial entertainments as an emerging artistic movement, in line with, as Ulrich liked to say, Impressionism, Dadaism, or surrealism: “Why should it not be the same with software?”

Captain Blood

Released for the Atari ST in France just in time for the Christmas of 1987, Captain Blood certainly was, whatever else you could say about it, a bold artistic gambit. The French gaming magazine SVM talked it up if anything even more than Ulrich himself, declaring it “a masterpiece,” “the most beautiful game in the world,” the herald of a new generation of games “where narrative sense and programming talent are at the service of a new art.” This sort of stilted grandiosity — sounding, at least when translated into English, a bit like some of the symbolic dialogs you had with the aliens in Captain Blood — would become one of the international hallmarks of a French gaming culture that was just beginning to break out beyond the country’s borders. Captain Blood became the first poster child for what Philippe Ulrich himself would later dub “the French Touch”: “Our games didn’t have the excellent gameplay of original English-language games, but graphically, their aesthetics were superior.”

It took some time to realize that, underneath its undeniably haunting beauty, Captain Blood wasn’t really much of a game. Playing it meant flying around to random planets, going through the same tedious flight-simulator bits again and again, and then — if you were lucky and the planet you’d arrived at wasn’t entirely empty — having baffling conversations with all-too-loquacious aliens, never knowing what was just gibberish for the sake of it and what was some sort of vital clue. As Ulrich’s own words above would indicate, he and some other French developers really did seem to believe that making beautiful and conceptually original games like Captain Blood should absolve them from the hard work of testing, tweaking, and balancing them. And perhaps he had a point, at least momentarily. What with owners of slick new 16-bit machines like the Atari ST and Commodore Amiga eager to see them put through their audiovisual paces, gameplay really could fall by the wayside with few obvious consequences. Captain Blood sold more than 100,000 copies worldwide despite its faults. For ERE Informatique, it felt like a validation of their new direction.

So, on June 12, 1988, they announced the formation of a new sub-label for artsy games like Captain Blood in an elaborate “happening” at the storied Maison de la Radio in Paris. The master of ceremonies was none other than Alejandro Jodorowsky, the Chilean filmmaker who had spent $2 million in an abortive attempt to make a Dune movie back in the 1970s. The name of the sub-label, Exxos, was derived from the Greek prefix meaning “outward.” The conceit had it that Exxos was literally the god in the machines at ERE Informatique, the real mastermind of all their games. After Jodorowsky’s introduction, Ulrich stepped up to say his piece:

Ladies and gentlemen, the decision was not easy, but still, we have agreed to reveal to you the secret of our dynamism and creativity, which makes ERE Informatique a success. If there are sensitive people in the room, I ask them to be strong. They have nothing to fear if their vibrations are positive; the telluric forces will save them.

My friends, the inspiration does not fall from the sky, genius is not by chance. The inspiration and genius which designed Macadam Bumper is not the fabulous Rémi Herbulot. The inspiration and genius which led to Captain Blood is not the unquenchable Didier Bouchon nor your servant here.

It is Him! He who has lived hidden in our offices for months. He who comes from outside the Universe. He that we reveal today to the world, because the hour has come. I name Exxos. I ask you to say after me a few magic words to remind Him of His homeland: ata ata hoglo hulu, ata ata hoglo hulu…

A group chant followed, more worthy of an occult ceremony than a business presentation.

Some months later, Rémi Herbulot’s Purple Saturn Day became the first big game to premiere on the Exxos label. It was a sort of avant-garde take on the Epyx Games sports series, if you can imagine such a thing. “O Exxos, you who showed us the path to the global success of Captain Blood, you who inspired those fabulous colorful swirls of spacetime!” prayed Philippe Ulrich before a bemused crowd of ordinary trade-show attendees. “Today it is the turn of Rémi Herbulot and Purple Saturn Day. Exxos, thank you!”

The shtick got old quickly. When ERE promoted the next Exxos game, a poorly designed point-and-click adventure called Kult, by dismembering a life-sized latex alien in the name of their god and distributing the pieces to assembled journalists, you could almost see the collective shrug that followed even in the French gaming press. Neither Purple Saturn Day nor Kult (the latter of which was published under the name of Chamber of the Sci-Mutant Priestess in North America) sold in anything like the numbers of Captain Blood.

Meanwhile Infogrames, ERE’s parent company, had gotten into serious financial trouble through over-expansion and over-investment. After a near-acquisition by the American publisher Epyx fell through at the last minute, Infogrames stopped paying the bills at ERE Informatique. Thanks no doubt to such ruthless cost-cutting, Infogrames would escape by the skin of their teeth, and in time would recover sufficiently to become one of the biggest games publishers in the world. ERE, however, was finished. Philippe Ulrich and his little band of followers had been cast adrift along with their god. But never fear; their second act would prove almost as surprising as their first. For Ulrich and company were about to meet Dune.



Given the enormous popularity of the novel, one might have expected a Dune computer game long before this point. Yet, thanks to the high-profile but failed Dune film, the rights had been in limbo for the past five years.

As we saw in my previous article, the Dino De Laurentiis Corporation licensed the media rights to Dune — which included game rights — from Frank Herbert in 1982. About six months prior to the film’s release in December of 1984, they made a deal with Parker Brothers — best known as the maker of such evergreen family board games as Monopoly, Clue, and Risk — for a Dune videogame. But said game never materialized; the failure of the film, coupled with a troubled American home-computer marketplace and an all but annihilated post-Great Videogame Crash console marketplace, apparently made them think better of the idea. The Dino De Laurentiis Corporation went bankrupt in 1985, and Frank Herbert died the following year. Despite the inevitable flurry of litigation which followed these events, no one seemed to be quite sure for a long time just where the game rights now resided. The person who would at last break this logjam at decade’s end was a dapper 47-year-old Briton named Martin Alper.

Martin Alper with a display rack of cheap games. These were to be found in all sorts of unlikely places in Britain, from corner shops to booksellers, during Mastertronic’s heyday.

Alper had gotten his start in software in 1983, when, already an established businessman and entrepreneur, he had invested in a tape-duplication facility. At this time, British computer games were distributed almost exclusively on cassette tapes. “I asked the guy how much it cost to duplicate a tape,” Alper later remembered. “He said about 30p. Then I asked him how much they sold the games for. About eight or nine pounds. I couldn’t understand the massive difference.” In his confusion he detected the scent of Opportunity. The result would be Mastertronic, the most internationally successful budget label of the 1980s.

Alper and two others launched Mastertronic in April of 1984 with several games priced at £1.99, about half the lowest price point typical in Britain at the time. The figure was no accident: a survey had revealed that £2 was the average amount of weekly pocket money given to boys of twelve years old or so by British parents. Thus, while the typical kid might have to save up for several weeks to buy a game from the competition, he could buy a new one every single weekend from Mastertronic if he was sufficiently dedicated. And dedicated the kids of Britain proved to be, to the tune of 130,000 Mastertronic games shipped in the first month.

The established powers in the British games industry, however, were less enthusiastic. Claiming that selling games at such prices would set everyone on the road to ruin, distributors flatly refused to handle Mastertronic’s products. Unfazed, Alper and his partners simply went around them, setting up their own distribution pipeline with the likes of the bookstore chain W.H. Smith and even supermarkets and convenience stores, who were advised to place the freestanding pillars of Mastertronic games, with “£1.99!” emblazoned in big digits across the top, right where parents and children passed by on their way to the cash register with their groceries. “The problem with the conventional retail outlets,” said Alper, “is [that] they don’t encourage the impulse purchase. Supermarkets are much better at that.”

Mastertronic’s simple action games weren’t great, but for the most part they weren’t as horrible as the rest of the industry liked to claim either. If they lacked the staying power of many of their higher-priced rivals, that could be rationalized away in light of the fact that a kid could buy a new one every week or two. And Alper proved hugely talented at tempting his target demographic in all sorts of ways that didn’t depend directly on the quality of the games themselves. One of Mastertronic’s biggest early hits was a knock-off of Michael Jackson’s extended “Thriller” video, renamed to Chiller. (Predictably enough, they were hauled into court by Jackson’s management company and wound up having to pay a settlement, but they still came out well-ahead financially.) Another game, Clumsy Colin Action Biker, starred the mascot from a popular brand of crisps, and was advertised right on the packages of said junk food. (“They showed us how they were made. It’s revolting. You know those little plastic chips you get in packing materials? They’re exactly the same, with added flavoring.”)

It was all pretty lowbrow stuff — about as far as you could get from the high-toned pretensions of ERE Informatique across the English Channel — but Mastertronic’s games-as-commodities business model proved very successful. Within eighteen months of their launch, Mastertronic alone owned 20 percent of the British computer-games market, was expanding aggressively across the rest of Europe, and had become the first British software house to launch a successful line in the United States. In fact, Martin Alper had already moved to California, the better to steer operations there.

But Mastertronic’s glory days of huge profits off cheap games were short-lived. Just like Infogrames in France, they tried to do too much too soon. Losing sight of their core competencies, they funded a line of coin-operated arcade games that went nowhere and acquired the prestigious but troubled British/Australian publisher Melbourne House for way too much money. At the same time, the army of lone-wolf bedroom coders who provided their games proved ill-equipped to take full advantage of the newer 16-bit machines that began to capture many gamers’ hearts and wallets as the 1980s wore on. Already by 1987, Mastertronic’s bottom line had turned from black to red.

Meanwhile Virgin Games, one of the smaller subsidiaries of Richard Branson’s globe-spanning media empire, had been quietly releasing games in Britain since 1982. Now, though, Branson was eager to get into the games market in a more concentrated way. Mastertronic, possessed of excellent worldwide distribution and proven marketing savvy despite their current financial difficulties, seemed a great way to do that. In early 1988, Virgin bought Mastertronic.

Initially, the new subsidiary took the name of Virgin Mastertronic and simply continued on with business as usual. But as Martin Alper looked upon a changing industry, he saw those more powerful 16-bit platforms continuing to take over from the simple 8-bit machines that had fueled Mastertronic’s success, and he saw older demographics with more disposable income beginning to take an interest in more sophisticated, upmarket computer games. In short, he felt that he had already hit a ceiling with his cheap little games; what had been so right for 1984 was no longer such a great fit for 1988. And so Alper, a man of enormous charisma and energy, maneuvered himself into the leading role at Virgin Games proper, overseeing its worldwide operations from California, the entertainment capital of the world. After having fallen into exactly the decline Alper had foreseen, Virgin Mastertronic would be sold off in 1991 to the Japanese console maker Sega, with whom they had a longstanding distribution agreement.

Alper loved Dune, connecting with its mythical — mystical? — qualities on a deep-seated level: “It presents a parallel with Christianity or Judaism, including the idea of the messiah who comes to save a strange planet. Dune begs questions about other civilizations that could exist: will they have the same beliefs, worship the same supernatural beings?” He had always dreamed of publishing a Dune computer game, but had known it just wasn’t practical on a Mastertronic budget. Now, though, with the more prestigious name and deeper pockets of Virgin behind him, he started pursuing the license in earnest. Beginning in 1988, he worked through a long, fraught process of first identifying the proper holder of the media rights — as far as could be determined from all of the previous litigation and bankruptcies, they seemed to have reverted to Universal Pictures, the distributor of the film — and then of prying them away for Virgin. Alper saw a Dune game as announcing Virgin’s — and his own — arrival on the scene as a major industry player in an artistic as well as commercial sense, making games far removed from the budgetware of the Mastertronic years.

Even as Alper was trying to secure the Dune rights, Philippe Ulrich and his friends were trying to free themselves from their entanglements with Infogrames and continue making games elsewhere. They found a welcome supporter in Jean-Martial Lefranc, the head of Virgin Loisirs, Virgin Games’s French arm. Manifesting a touch of Gallic pride, he wanted to set up a homegrown studio, made up of French developers creating ambitious and innovative games which would be distributed all over the world under the Virgin label. And certainly no one could accuse Ulrich and friends of lacking either ambition or a spirit of innovation. Lefranc helped to negotiate a concrete exit agreement between the former ERE Informatique and Infogrames, and thereafter signed them up to become the basis of a new Virgin Loisirs subsidiary.

Ulrich and company named their new studio Cryo Interactive, a play on cryogenic chambers and the computer-assisted dreams people would presumably have in them in the future. They announced their existence with all the grandiosity the world had come to expect from this bunch, saying that their purpose would be to “open the way to the next generation of software designers, artists, programmers, and so on,” who would “create expanding horizons for our imagination in tomorrow’s fascinating technology world.” “Infinite travel, magic, beauty, technology, adventure, and mystery” were in the offing.

In August of 1989, Rémi Herbulot flew to California to have a more prosaic conversation with Martin Alper about potential Cryo projects that might be suitable for the international market. Alper told him then that he was trying to secure the rights to make a Dune game, a project for which he saw Cryo as the perfect development team, without elaborating as to why. “But,” he said, “there seems to be little chance of actually getting the rights.”

Herbulot wasn’t sure what to make of the whole exchange, but when he told his colleagues about it back in Paris, Ulrich, who loved the novel unconditionally, was convinced that the project had been ordained by fate. Not only had he bought his first computer in a shop called Dune, but the hotel in Las Vegas where they had all stayed during the last Winter Consumer Electronics Show had had the same name. And then there was his friendship with Alejandro Jodorowsky, the would-be Dune film director of yore. What another might have seen as a series of tangential coincidences, Ulrich saw as the mysterious workings of destiny. It was “obvious,” he said, that Cryo would end up making Dune into a computer game — and, indeed, he was proven correct. Three weeks after Herbulot’s return from California, Ulrich got a call at home from Jean-Martial Lefranc. Martin Alper had managed to secure the Dune license after all, said Virgin Loisir’s chief executive, and he wanted Cryo to start thinking immediately about what kind of game they could make out of it. Ulrich remembers running out of his apartment building and doing several laps around the block, feeling like he was levitating.

But his ecstasy would be short-lived. Virgin assigned as Dune‘s producer David Bishop, a veteran British games journalist, designer, and executive. The language barrier and the distance separating London from Paris were just the beginning of the difficulties that ensued. In the eyes of his French charges, Bishop seemed to view himself as Dune‘s appointed designer, Cryo as the mere technical team assigned to implement his vision. Given the artistic aspirations of people like Philippe Ulrich and Rémi Herbulot, who so forthrightly described themselves as the vanguard of nothing less than a new artistic movement, this was bound to cause problems. Meanwhile Bishop, for his part, was convinced that Cryo was being deliberately obtuse and oh so inscrutably Gallic just to mess with him. The cross-Channel working relationship started out strained and just kept getting more so.

Following what was, for better or for worse, becoming an accepted industry practice, Virgin told Cryo that they had to storyboard the game on paper and get that approved before they could even begin to implement anything on a computer. Cryo worked this way for months on end, abandoning their computers for pencil and paper.

Adapting a story as complex as that of Dune to another medium must be, as David Lynch among others had already learned, a daunting endeavor under any circumstances. “We reread the book several times, got hold of everything we could find on the subject, and watched the movie over and over again,” says Philippe Ulrich. “Whenever we came across somebody who had read the book, we asked them what had impressed them most and what their strongest memories were.” The centerpiece of the book and the movie, the struggle for control of Arrakis between House Atreides and House Harkonnen, must obviously be the centerpiece of the game as well. Yet Cryo didn’t want to lose all of the other textures of the story. How could they best capture the spirit of Dune? To boil it all down to yet another game of military strategy in an industry already flooded with such things didn’t seem right, but neither did a point-and-click adventure game. After much struggle, they decided to do both — to combine a strategic view of the battle for Arrakis with the embodied, first-person role of Paul Atreides.

David Bishop hated it. All of it. “The interface is too complex,” he said. “A mix of adventure and strategy is not desirable.” Others in Virgin’s British and American offices also piled on. Cryo’s design lacked “unity,” they said; it would require “fifty disks” to hold it; it had “too many cinematic sequences, at the risk of boring the player”; the time required to develop it would “exceed the average lifespan of a programmer.” One particular question was raised endlessly, if understandably in light of Cryo’s history: would this be a game that mainstream American gamers would want to play, or would it be all, well, French? And yes, it was a valid enough concern on the face of it. But equally valid was the counterpoint raised by Ulrich: if you didn’t want a French Dune, why did you hire arguably the most French of all French studios to make it? Or did Bishop feel that that decision had been a mistake? Certainly Cryo had long since begun to suspect that his real goal was to kill the project by any means necessary.

Matters came to a head in the summer of 1990. In what may very well still stand as an industry record, Dune had now been officially “in production” for almost a year without a single line of code getting written. Virgin invited the whole of Cryo to join them at their offices in London to try to hash the whole thing out. The meeting was marked by bursts of bickering over trivialities, interspersed with long, sullen silences. At last, Philippe Ulrich stood up to make a final impassioned speech. He said that Cryo was trying their level best to make a game that evoked all of the major themes of a book they loved (never mind for the moment that the license Virgin had acquired could more accurately be described as a license to the movie). The transformation of boy to messiah was in there; the all-importance of the spice was in there; even the ecological themes were in there. David Bishop just snorted in response; Virgin wanted a commercial computer game that was fun to play, he groused, not a work of fine literary art. Nothing got resolved.

Or perhaps in a way it did. On September 19, 1990, Cryo got a fax from London: “We do not believe that the Dune proposal is strong enough to publish under the Virgin Games label. Consequently, we do not wish that more work be undertaken on this title.”

And then, at this fraught juncture, a rather extraordinary thing happened. Ulrich went directly to Jean-Martial Lefranc of Virgin Loisirs to plead his case one final time, whereupon Lefranc told him to just go ahead and make his Dune his way — to forget about storyboards and David Bishop and all the rest of it. Virgin Loisirs was doing pretty well at the moment; he’d find some money in some hidden corner of his budget to keep the lights on at Cryo. If they made the Dune game a great one, he was sure he could smooth it all over with his superiors after the fact, when he had in his hands a fait accompli in the form of an amazing game that just had to be published. And so Ulrich took a second lap or two around the block and then buckled down to work.

For some six months, Cryo beavered away at their Dune in secrecy. Then, suddenly, the jig was up. Lefranc — who, as his actions in relation to Dune would indicate, didn’t have an overly high opinion of Virgin Games’s international management — left to join the movie-making arm of the Virgin empire. His replacement, Christian Brécheteau, was an unknown quantity for Cryo. At about the same time, a routine global audit of the empire’s books sent word back to London about a significant sum being paid to Cryo every month for reasons that were obscure at best. Brécheteau called Ulrich: “Take the first plane to London and make your own case. I can’t do anything for you.”

As it happened, Martin Alper was in London at that time. If Ulrich hoped for a sympathetic reception from that quarter, however, he was disappointed. After pointedly leaving him to cool his heels in a barren waiting room most of the day, Alper and other executives, including Cryo’s arch-nemesis David Bishop, invited Ulrich in. The mood was decidedly chilly as he set up his presentation. “This is not a game!” scoffed Alper almost immediately, as soon as he saw the first, heavily scripted scenes. Yet as Ulrich demonstrated further he could sense the mood — even the mood of Bishop — slowly changing to one of grudging interest. Alper even pronounced some of what he saw “remarkable.”

Ulrich was ushered out of the room while the jury considered his fate. When he was called back in, Alper pronounced their judgment: “You have five weeks to send me something more polished. If that doesn’t please me, I never want to hear about it again, and you can consider yourself fired.” A more formal statement of his position was faxed to Paris the next day:

Our opinion of the game has not changed. The graphics and aesthetic presentation are impressive, but the overall design is still too confusing, especially if one takes into account the tastes of the American public. We are willing to support your work until July 15 [1991], by which date we expect to receive a playable version of the game in England and the United States. If the earlier concerns expressed by David Bishop prove unfounded, we will be happy to support your efforts to realize the finished game. However, we wish to point out that it will not under any circumstances be possible to transfer the Dune license to another publisher, and that no game of Frank Herbert’s novel will be published without our consent.1

1 Virgin’s concern here was likely related to the fact that they had technically purchased the rights to the Dune movie. The question of whether separate rights to the novel existed and could be licensed had never really been resolved. They wanted to head off the nightmare scenario of Cryo/Virgin Loisirs truly going rogue by acquiring the novel rights and releasing the game under that license through another publisher.

Cryo bit their tongues and made the changes Virgin requested — changes designed to make the game more streamlined, more understandable, and more playable. On July 15, they packaged up what they had and sent it off. Three days later, they got a call from a junior executive in Virgin’s California office. His tone was completely different from that of the fax of five and a half weeks earlier: “What you have done is fantastic. Productivity has collapsed around here because people are all playing your game!”

Cryo originally planned to use this picture of Sting in their Dune game, but the rock star refused permission to use his likeness.

So, Feyd-Rautha, Sting’s character in the movie, had to get some plastic surgery for the game.

Work continued on the game for another nine months or so. Relations between Cryo and Virgin remained strained at times over that period, but cancellation was never again on the cards. At Virgin’s insistence, Cryo spent considerable time making the game look more like the movie, rather than their possibly idiosyncratic image of the book. Most of the characters, with the exception of only a few whose actors refused permission to have their likenesses reproduced — Sting and Patrick Stewart were among them — were redrawn to match the film. The media-savvy Martin Alper was well aware that Kyle MacLachlan, the star of the film, was currently starring in David Lynch’s much-talked-about television series Twin Peaks. He made sure that MacLachlan graced the front of the box as Paul Atreides.

The game of Dune‘s cover art was a still from the movie.

Cryo’s Dune finally shipped worldwide in May of 1992, to positive reviews and healthy sales; one report claims that it sold 20,000 copies in its first week in the United States alone, a very impressive performance for the time. It did if anything even better in Europe; Cryo had been smart enough to develop and release it simultaneously for MS-DOS, the overwhelmingly dominant computer-game platform in North America, and for the Commodore Amiga, the almost-as-popular computer-gaming platform of choice in much of Europe. The game was successful enough that Virgin funded expanded MS-DOS and Sega Genesis CD-based versions, which appeared in 1993, complete with voice acting and additional animation sequences.



And what can we say about Cryo’s Dune today? I will admit that I didn’t have high hopes coming in. As must be all too clear by now, I’m not generally a fan of this so-called French Touch in games. While I love beauty as much as the next person and love to be moved by games, I do insist that a game work first and foremost as a game. This isn’t a standard that Philippe Ulrich’s teams tended to meet very often, before or after they made Dune. The combination of Ulrich’s love of weirdness with the famously weird filmmaker David Lynch would seem a toxic brew indeed, one that could only result in a profoundly awful game. Inscrutability can work at times in the non-interactive medium of movies; in games, where the player needs to have some idea what’s expected from her, not so much.

But, rather amazingly, Cryo’s Dune defies any knee-jerk prejudices that might be engendered by knowledge of Philippe Ulrich’s earlier or later output. While it’s every bit as unique a design concept as you might expect given its place of origin, in this case the concept works. For all that they spent the better part of three years at one another’s throats more often than not, Dune nevertheless wound up being a true meeting in the middle between the passionate digital artistes of Cryo and the more practical craftsmen in Virgin’s Anglosphere offices. For once, an exemplar of the French Touch has a depth worthy of its striking surface. Dune plays like a dispatch from an alternate reality in which Cryo cared as much about making good games in a design sense as they did about making beautiful and meaningful ones in an aesthetic and thematic sense — thus proving, should anyone have doubted it, that these things need not be mutually exclusive.

The game leads you by the nose a bit at the beginning, but it later opens up. The early stages function very well as a tutorial for the strategy game. Thanks to this fact and the simple, intuitive interface, the Dune player has little need for the manual.

You play the game of Dune as Paul Atreides, just arrived on Arrakis with his father and mother and the rest of House Atreides. From his embodied perspective, you fly around the planet in your ornithopter, recruiting the various Fremen clans to your cause, then directing them to mine the precious spice, to train in military maneuvers, to spy on House Harkonnen, and eventually to go to war against them. As you’re doing so, another form of plot engine is also ticking along, unfolding the experiences which transform the boy Paul Atreides physically and spiritually into his new planet’s messiah. This “adventurey” side of the game is extremely assertive at first, to the point of leading you by the nose through the strategy side: go here and do this; now go there and do that. In time, however, it eases up and your goals become more abstract, giving much more scope for you to manage the war your way.

The fusion isn’t always perfect; it is possible to break the adventure side of the game if you obstinately pursue your own agenda in the strategy side. But it’s certainly one of the most interesting and successful hybrid designs I’ve ever seen. As the character you play is transformed by his experiences, so is the strategy game you’re playing; as Paul’s psychic powers grow, you no longer have to hop around the planet as much in your physical form, but can communicate with your followers over long distances using extra-sensory perception. Eventually your powers will expand enough to let you ride the fearsome sandworms into the final series of battles against the Harkonnen.

Dune is a strategy game inside an embodied adventure game.

Cryo’s Dune provides other ludic adaptations from non-interactive media with a worthy benchmark to strive for; it doesn’t always fuss overly much about the details of its source material, but it really does do a superb job of capturing its spirit. As an impassioned Philippe Ulrich noted at that pivotal meeting in London, there’s no theme in the book that isn’t echoed, however faintly, in the game. Even the ecological element of the book that made it such a favorite of the environmental movement is remembered, as you reclaim mined-out desert lands to begin a “greening” of Arrakis later in the game. Ditto that wind of utter alienness that blows through the book and, now, the game. This game looks and feels and, perhaps most of all, sounds like no other; its synthesized soundtrack has passed into gaming legend as one of the very best of its breed, so good that Cryo actually released it as a standalone audio CD.

An in-game encyclopedia is available for newcomers, but in truth it’s hardly needed. The game conveys everything you really need to know almost subliminally as you play.

The game manages to be so evocative of its source material while remaining as enjoyable for those who haven’t read the novel or seen the film as for those who have. It does a great job of getting newcomers up to speed, even as its dynamic, emergent strategy element ensures that it never becomes a dull exercise in walking through a plot those who have read the book already know. Its interface is an intuitive breeze, and the difficulty as well is perfectly pitched for what the game wants to be, being difficult enough to keep you on your toes but reasonable enough that you have a good chance of winning on your first try; after all, who wants to play through a story-oriented game like this twice? I love to see innovative approaches to gameplay that defy the strict boundaries of genre, and love it even more when said approaches work as well as they do here. This game still has plenty to teach the designers of today.

The big picture…

Sadly, though, Cryo’s Dune, despite its considerable commercial success, has gone down in history as something of a curiosity rather than a harbinger of design trends to come, a one-off that had little influence on the games that came later — not even the later games that came out of Cryo, which quite uniformly failed to approach the design standard set here. Cryo would survive for the balance of the 1990s, churning out what veteran games journalist John Walker calls, in his succinct and hilarious summing up of their legacy, “always awful but ever so sincere productions.” They would become known for, as Walker puts it, “deadpan adventure games set in wholly ludicrous reinterpretations of out-of-copyright works of literature, in which nothing made sense, and all puzzles were unfathomable guesswork.” The biggest mystery surrounding them is just how the hell they managed to stay in business for a full decade. Just who was buying all these terrible games that all of the magazines ripped to shreds and no one you talked to would ever admit to even playing, much less enjoying?

Nor did anyone else emerge to take up the torch of games that were designed to match the themes, plots, and settings of their fictions rather than to slot into some arbitrary box of ludic genre. Instead, the lines of genre would only continue to harden as time went on. Interesting hybrids like Cryo’s Dune became a more and more difficult sell to publishers, for dismaying if understandable reasons: said publishers were continuing to look on as their customers segregated themselves into discrete pools, each of whom only played a certain kind of game to the exclusion of all others. And so Cryo’s Dune passed into history, just one more briefly popular, now obscure gem ripe for rediscovery…

But wait, you might be saying: I claimed at the end of the first article in this series that Dune left a “profound mark” on gaming. Well, as it happens, that is true of Dune in general — but not true of this particular Dune game. Those months during which Cryo and Virgin Loisirs took their Dune underground — months during which the rest of Virgin Games had no idea what their French arm was doing — had yet more ramifications than those I’ve already described. For, during the time when he believed the Cryo Dune to be dead, Martin Alper launched a new project to make another, very different sort of Dune game, using developers much closer to his home base in California. This other Dune would be far less inspiring than Cryo’s as an adaptation of Frank Herbert’s novel or even of David Lynch’s film, but its influence on the world of gaming in general would be far more pronounced.

(Sources: the book La Saga des Jeux Vidéo by Daniel Ichbiah; Home Computer of June 1984; CU Amiga of July 1991 and June 1992; Amiga Format of March 1990; Computer and Video Games of August 1985, November 1985, and April 1986; New Computer Express of February 3 1990; Amstrad Action of March 1986 and April 1986; Retro Gamer 90; The One of May 1991 and June 1992; Game Players PC Entertainment Vol. 5 No. 5; PC Review of June 1992; Aktueller Software Markt of August 1994; Home Computing Weekly of May 8 1984, July 17 1984, and September 18 1984; Popular Computing Weekly of July 19 1984; Sinclair User of January 1986; The Games Machine of October 1987; Your Computer of January 1986. Online sources include “I Kind of Miss Dreadful Adventure Developer Cryo” by John Walker on Rock Paper Shotgun and “How ‘French Touch’ Gave Early Videogames Art, Brains” by Chris Baker on Wired. Note that some of the direct quotations in this article are translated into English from the French.

Feel free to download Cryo Interactive’s Dune from right here, packaged so as to make it as easy as possible to get running using your platform’s version of DOSBox.)

Footnotes
1 Virgin’s concern here was likely related to the fact that they had technically purchased the rights to the Dune movie. The question of whether separate rights to the novel existed and could be licensed had never really been resolved. They wanted to head off the nightmare scenario of Cryo/Virgin Loisirs truly going rogue by acquiring the novel rights and releasing the game under that license through another publisher.
 

Posted by on November 30, 2018 in Digital Antiquaria, Interactive Fiction

 


The 68000 Wars, Part 5: The Age of Multimedia

A group of engineers from Commodore dropped in unannounced on the monthly meeting of the San Diego Amiga Users Group in April of 1988. They said they were on their way to West Germany with some important new technology to share with their European colleagues. With a few hours to spare before they had to catch their flight, they’d decided to share it with the user group’s members as well.

They had with them nothing less than the machine that would soon be released as the next-generation Amiga: the Amiga 3000. From the moment they powered it up to display the familiar Workbench startup icon re-imagined as a three-dimensional ray-traced rendering, the crowd was in awe. The new model sported a 68020 processor running at more than twice the clock speed of the old 68000, with a set of custom chips redesigned to match its throughput; graphics in 2 million colors instead of 4096, shown at non-interlaced — read, non-flickering — resolutions of 640 X 400 and beyond; an AmigaOS 2.0 Workbench that looked far more professional than the garish version 1.3 that was shipping with current Amigas. The crowd was just getting warmed up when the team said they had to run. They did, after all, have a plane to catch.

Word spread like crazy over the online services. Calls poured in to Commodore’s headquarters in West Chester, Pennsylvania, but they didn’t seem to know what any of the callers were talking about. Clearly this must be a very top-secret project; the engineering team must have committed a major breach of protocol by jumping the gun as they had. Who would have dreamed that Commodore was already in the final stages of a project which the Amiga community had been begging them just to get started on?

Who indeed? The whole thing was a lie. The tip-off was right there in the April date of the San Diego Users Group Meeting. The president of the group, along with a few co-conspirators, had taken a Macintosh II motherboard and shoehorned it into an Amiga 2000 case. They’d had “Amiga 3000” labels typeset and stuck them on the case, and created some reasonable-looking renderings of Amiga applications, just enough to get them through the brief amount of time their team of “Commodore engineers” — actually people from the nearby Los Angeles Amiga Users Group — would spend presenting the package. When the truth came out, some in the Amiga community congratulated the culprits for a prank well-played, while others were predictably outraged. What hurt more than the fact that they had been fooled was the reality that a Macintosh that was available right now had been able to impersonate an Amiga that existed only in their dreams. If that wasn’t an ominous sign for their favored platform’s future, it was hard to say what would be.

Of course, this combination of counterfeit hardware and sketchy demos, no matter how masterfully acted before the audience, couldn’t have been all that convincing to a neutral observer with a modicum of skepticism. Like all great hoaxes, this one succeeded because it built upon what its audience already desperately wanted to believe. In doing so, it inadvertently provided a preview of what it would mean to be an Amiga user in the future: an ongoing triumph of hope over hard-won experience. It’s been said before that the worst thing you can do is to enter into a relationship in the hope that you will be able to change the other party. Amiga users would have reason to learn that lesson over and over again: Commodore would never change. Yet many would never take the lesson to heart. To be an Amiga user would be always to be fixated upon the next shiny object out there on the horizon, always to be sure this would be the thing that would finally turn everything around, only to be disappointed again and again.

Hoaxes aside, rumors about the Amiga 3000 had been swirling around since the introduction of the 500 and 2000 models in 1987. But for a long time a rumor was all the new machine was, even as the MS-DOS and Macintosh platforms continued to evolve apace. Commodore’s engineering team was dedicated and occasionally brilliant, but their numbers were tiny in comparison to those of comparable companies, much less bigger ones like Apple and IBM, the latter of whose annual research budget was greater than Commodore’s total sales. And Commodore’s engineers were perpetually underpaid and underappreciated by their managers to boot. The only real reason for a top-flight engineer to work at Commodore was love of the Amiga itself. In light of the conditions under which they were forced to work, what the engineering staff did manage to accomplish is remarkable.

After the crushing disappointment that had been the 1989 Christmas season, when Commodore’s last and most concerted attempt to break the Amiga 500 into the American mainstream had failed, it didn’t take hope long to flower again in the new year. “The chance for an explosive Amiga market growth is still there,” wrote Amazing Computing at that time, in a line that could have summed up the sentiment of every issue they published between 1986 and 1994.

Still, reasons for optimism seemingly did still exist. For one thing, Commodore’s American operation had another new man in charge, an event which always brought with it the hope that the new boss might not prove the same as the old boss. Replacing the unfortunately named Max Toy was Harold Copperman, a real, honest-to-goodness computer-industry veteran, coming off a twenty-year stint with IBM, followed by two years with Apple; he had almost literally stepped offstage from the New York Mac Business Expo, where he had introduced John Sculley to the speaker’s podium, and into his new office at Commodore. With the attempt to pitch the Amiga 500 to low-end users as the successor to the Commodore 64 having failed to gain any traction, the biggest current grounds for optimism was that Copperman, whose experience was in business computers, could make inroads into that market for the higher-end Amiga models. Rumor had it that the dismissal of Toy and the hiring of Copperman had occurred following a civil war that had riven the company, with one faction — Toy apparently among them — saying Commodore should de-emphasize the Amiga in favor of jumping on the MS-DOS bandwagon, while the other faction saw little future — or, perhaps better said, little profit margin — in becoming just another maker of commodity clones. If you were an Amiga fan, you could at least breathe a sigh of relief that the right side had won out in that fight.

The Amiga 3000

It was in that hopeful spring of 1990 that the real Amiga 3000, a machine custom-made for the high-end market, made its bow. It wasn’t a revolutionary update to the Amiga 2000 by any means, but it did offer some welcome enhancements. In fact, it bore some marked similarities to the hoax Amiga 3000 of 1988. For instance, replacing the old 68000 was a 32-bit 68030 processor, and replacing AmigaOS 1.3 was the new and much-improved — both practically and aesthetically — AmigaOS 2.0. The flicker of the interlaced graphics modes could finally be a thing of the past, at least if the user sprang for the right type of monitor, and a new “super-high resolution” mode of 1280 X 400 was available, albeit with only four onscreen colors. The maximum amount of “chip memory” — memory that could be addressed by the machine’s custom chips, and thus could be fully utilized for graphics and sound — had already increased from 512 K to 1 MB with the release of a “Fatter Agnus” chip, which could be retrofitted into older examples of the Amiga 500 and 2000, in 1989. Now it increased to 2 MB with the Amiga 3000.

The rather garish and toy-like AmigaOS 1.3 Workbench.

The much slicker Workbench 2.0.

So, yes, the Amiga 3000 was very welcome, as was any sign of technological progress. Yet it was also hard not to feel a little disappointed that, five years after the unveiling of the first Amiga, the platform had only advanced this far. The hard fact was that Commodore’s engineers, forced to work on a shoestring as they were, were still tinkering at the edges of the architecture that Jay Miner and his team had devised all those years before rather than truly digging into it to make the more fundamental changes that were urgently needed to keep up with the competition. The interlace flicker was eliminated, for instance, not by altering the custom chips themselves but by hanging an external “flicker fixer” onto the end of the bus to de-interlace the interlaced output they still produced before it reached the monitor. And the custom chips still ran no faster than they had in the original Amiga, meaning the hot new 68030 had to slow down to a crawl every time it needed to access the chip memory it shared with them. The color palette remained stuck at 4096 shades, and, with the exception of the new super-high resolution mode, whose weirdly stretched pixels and four colors limited its usability, the graphics modes as a whole remained unchanged. Amiga owners had spent years mocking the Apple Macintosh and the Atari ST for their allegedly unimaginative, compromised designs, contrasting them continually with Jay Miner’s elegant dream machine. Now, that argument was getting harder to make; the Amiga too was starting to look a little compromised and inelegant.

Harold Copperman personally introduced the Amiga 3000 in a lavish event — lavish at least by Commodore’s standards — held at New York City’s trendy Palladium nightclub. With CD-ROM in the offing and audiovisual standards improving rapidly across the computer industry, “multimedia” stood with the likes of “hypertext” as one of the great buzzwords of the age. Commodore was all over it, even going so far as to name the event “Multimedia Live!” From Copperman’s address:

It’s our turn. It’s our time. We had the technology four and a half years ago. In fact, we had the product ready for multimedia before multimedia was ready for a product. Today we’re improving the technology, and we’re in the catbird seat. It is our time. It is Commodore’s time.

I’m at Commodore just as multimedia becomes the most important item in the marketplace. Once again I’m with the leader. Of course, in this industry a leader doesn’t have any followers; he just has a lot of other companies trying to pass him by. But take a close look: the other companies are talking multimedia, but they’re not doing it. They’re a long way behind Commodore — not even close.

Multimedia is a first-class way for conveying a message because it takes the strength of the intellectual content and adds the verve — the emotion-grabbing, head-turning, pulse-raising impact that comes from great visuals plus a dynamic soundtrack. For everyone with a message to deliver, it unleashes extraordinary ability. For the businessman, educator, or government manager, it turns any ordinary meeting into an experience.

In a way, this speech was cut from the same cloth as the Amiga 3000 itself. It was certainly a sign of progress, but was it progress enough? Even as he sounded more engaged and more engaging than had plenty of other tepid Commodore executives, Copperman inadvertently pointed out much of what was still wrong with the organization he helmed. He was right that Commodore had had the technology to do multimedia for a long time; as I’ve argued at length elsewhere, the Amiga was in fact the world’s first multimedia personal computer, all the way back in 1985. Still, the obvious question one is left with after reading the first paragraph of the extract above is why, if Commodore had the technology to do multimedia four and a half years ago, they’ve waited until now to tell anyone about it. In short, why is the world of 1990 “ready” for multimedia when the world of 1985 wasn’t? Contrary to Copperman’s claim about being a leader, Commodore’s own management had begun to evince an understanding of what the Amiga was and what made it special only after other companies had started building computers similar to it. Real business leaders don’t wait around for the world to decide it’s ready for their products; they make products the world doesn’t yet know it needs, then tell it why it needs them. Five years after being gifted with the Amiga, which stands alongside the Macintosh as one of the two most visionary computers of the 1980s precisely because of its embrace of multimedia, Commodore managed at this event to give every impression that they were the multimedia bandwagon jumpers.

The Amiga 3000 didn’t turn into the game changer the faithful were always dreaming of. It sold moderately, mostly to the established Amiga hardcore, but had little obvious effect on the platform’s overall marketplace position. Harold Copperman was blamed for the disappointment, and was duly fired by Irving Gould, the principal shareholder and ultimate authority at Commodore, at the beginning of 1991. The new company line became an exact inversion of that which had held sway at the time of the Amiga 3000’s introduction: Copperman’s expertise was business computing, but Commodore’s future lay in consumer computing. Jim Dionne, head of Commodore’s Canadian division and supposedly an expert consumer marketer, was brought in to replace him.

An old joke began to make the rounds of the company once again. A new executive arrives at his desk at Commodore and finds three envelopes in the drawer, each labelled “open in case of emergency” and numbered one, two, and three. When the company gets into trouble for the first time on his watch, he opens the first envelope. Inside is a note: “Blame your predecessor.” So he does, and that saves his bacon for a while, but then things go south again. He opens the second envelope: “Blame your vice-presidents.” So he does, and gets another lease on life, but of course it only lasts a little while. He opens the third envelope. “Prepare three envelopes…” he begins to read.

Yet anyone who happened to be looking closely might have observed that the firing of Copperman represented something more than the usual shuffling of the deck chairs on the S.S. Commodore. Upon his promotion, it was made clear to Jim Dionne that he was to be held on a much shorter leash than his predecessors, his authority carefully circumscribed. Filling the power vacuum was one Mehdi Ali, a lawyer and finance guy who had come to Commodore a couple of years before as a consultant and had since ingratiated himself more and more with Irving Gould. Now he advanced to the title of president of Commodore International, Gould’s right-hand man in running the global organization; indeed, he seemed to be calling far more shots these days than his globe-trotting boss, who never seemed to be around when you needed him anyway. Ali’s rise would not prove a happy event for anyone who cared about the long-term health of the company.

For now, though, the full import of the changes in Commodore’s management structure was far from clear. Amiga users were on to the next Great White Hope, one that in fact had already been hinted at in the Palladium as the Amiga 3000 was being introduced. Once more “multimedia” would be the buzzword, but this time the focus would go back to the American consumer market Commodore had repeatedly failed to capture with the Amiga 500. The clue had been there in a seemingly innocuous, almost throwaway line from the speech delivered to the Palladium crowd by C. Lloyd Mahaffrey, Commodore’s director of marketing: “While professional users comprise the majority of the multimedia-related markets today, future plans call for penetration into the consumer market as home users begin to discover the benefits of multimedia.”

Commodore’s management, (proud?) owners of the world’s first multimedia personal computer, had for most of the latter 1980s been conspicuous by their complete disinterest in their industry’s initial forays into CD-ROM, the storage medium that, along with the graphics and sound hardware the Amiga already possessed, could have been the crowning piece of the platform’s multimedia edifice. The disinterest persisted in spite of the subtle and eventually blatant hints that were being dropped by people like Cinemaware’s Bob Jacob, whose pioneering “interactive movies” were screaming to be liberated from the constraints of 880 K floppy disks.

In 1989, a tiny piece of Commodore’s small engineering staff — described as “mavericks” by at least one source — resolved to take matters into their own hands, mating an Amiga with a CD-ROM drive and preparing a few demos designed to convince their managers of the potential that was being missed. Management was indeed convinced by the demo — but convinced to go in a radically different direction from that of simply making a CD-ROM drive that could be plugged into existing Amigas.

The Dutch electronics giant Philips had been struggling for what seemed like forever to finish something they envisioned as a whole new category of consumer electronics: a set-top box for the consumption of interactive multimedia content on CD. They called it CD-I, and it was already very, very late. Originally projected for release in time for the Christmas of 1987, its constant delays had left half the entertainment-software industry, who had invested heavily in the platform, in limbo on the whole subject of CD-ROM. What if Commodore could steal Philips’s thunder by combining a CD-ROM drive with the audiovisually capable Amiga architecture not in a desktop computer but in a set-top box of their own? This could be the magic bullet they’d been looking for, the long-awaited replacement for the Commodore 64 in American living rooms.

The industry’s fixation on these CD-ROM set-top boxes — a fixation which was hardly confined to Philips and Commodore alone — perhaps requires a bit of explanation. One thing these gadgets were not, at least if you listened to the voices promoting them, was game consoles. The set-top boxes could be used for many purposes, from displaying multimedia encyclopedias to playing music CDs. And even when they were used for pure interactive entertainment, it would be, at least potentially, adult entertainment (a term that was generally not meant in the pornographic sense, although some were already muttering about the possibilities that lurked therein as well). This was part and parcel of a vision that came to dominate much of digital entertainment between about 1989 and 1994: that of a sort of grand bargain between Northern and Southern California, a melding of the new interactive technologies coming out of Silicon Valley with the movie-making machine of Hollywood. Much of television viewing, so went the argument, would become interactive, the VCR replaced with the multimedia set-top box.

In light of all this conventional wisdom, Commodore’s determination to enter the fray — effectively to finish the job that Philips couldn’t seem to — can all too easily be seen as just another example of the me-too-ism that had clung to their earlier multimedia pronouncements. At the time, though, the project was exciting enough that Commodore was able to lure quite a number of prominent names to work with them on it. Carl Sassenrath, who had designed the core of the original AmigaOS — including its revolutionary multitasking capability — signed on again to adapt his work to the needs of a set-top box. (“In many ways, it was what we had originally dreamed for the Amiga,” he would later say of the project, a telling quote indeed.) Jim Sachs, still the most famous of Amiga artists thanks to his work on Cinemaware’s Defender of the Crown, agreed to design the look of the user interface. Reichart von Wolfsheild and Leo Schwab, both well-known Amiga developers, also joined. And for the role of marketing evangelist Commodore hired none other than Nolan Bushnell, the founder almost two decades before of Atari, the very first company to place interactive entertainment in American living rooms. The project as a whole was placed in the capable hands of Gail Wellington, known throughout the Amiga community as the only Commodore manager with a dollop of sense. The gadget itself came to be called CDTV — an acronym, Commodore would later claim in a part of the sales pitch that fooled no one, for “Commodore Dynamic Total Vision.”

Nolan Bushnell, Mr. Atari himself, plugs CDTV at a trade show.

Commodore announced CDTV at the Summer Consumer Electronics Show in June of 1990, inviting selected attendees to visit a back room and witness a small black box, looking for all the world like a VCR or a stereo component, running some simple demos. From the beginning, they worked hard to disassociate the product from the Amiga and, indeed, from computers in general. The word “Amiga” appeared nowhere on the hardware or anywhere on the packaging, and if all went according to plan CDTV would be sold next to televisions and stereos in department stores, not in computer shops. Commodore pointed out that everything from refrigerators to automobiles contained microprocessors these days, but no one called those things computers. Why should CDTV be any different? It required no monitor, instead hooking up to the family television set. It neither included nor required a keyboard — much industry research had supposedly proved that non-computer users feared keyboards more than anything else — nor even a mouse, being controlled entirely through a remote control that looked pretty much like any other specimen of same one might find between the cushions of a modern sofa. “If you know how to change TV channels,” said a spokesman, “you can take full advantage of CDTV.” It would be available, Commodore claimed, before the Christmas of 1990, which should be well before CD-I despite the latter’s monumental head start.

That timeline sounded overoptimistic even when it was first announced, and few were surprised to see the launch date slip into 1991. But the extra time did allow a surprising number of developers to jump aboard the CDTV train. Commodore had never been good at developer relations, and weren’t terribly good at it now; developers complained that the tools Commodore provided were always late and inadequate and that help with technical problems wasn’t easy to come by, while financial help was predictably nonexistent. Still, lots of CD-I projects had been left in limbo by Philips’s dithering and were attractive targets for adaptation to CDTV, while the new platform’s Amiga underpinnings made it fairly simple to port over extant Amiga games like SimCity and Battle Chess. By early 1991, Commodore could point to about fifty officially announced CDTV titles, among them products from such heavy hitters as Grolier, Disney, Guinness (the publisher, not the beer company), Lucasfilm, and Sierra. This relatively long list of CDTV developers certainly seemed a good sign, even if not all of the products they proposed to create looked likely to be all that exciting, or perhaps even all that good. Plenty of platforms, including the original Amiga, had launched with much less.

While the world — or at least the Amiga world — held its collective breath waiting for CDTV’s debut, the charismatic Nolan Bushnell did what he had been hired to do: evangelize like crazy. “What we are really trying to do is make multimedia a reality, and I think we’ve done that,” he said. The hyperbole was flying thick and fast from all quarters. “This will change forever the way we communicate, learn, and entertain,” said Irving Gould. Not to be outdone, Bushnell noted that “books were great in their day, but books right now don’t cut it. They’re obsolete.” (Really, why was everyone so determined to declare the death of the book during this period?)

CDTV being introduced at the 1991 World of Amiga show. Doing the introducing is Gail Wellington, head of the CDTV project and one of the unsung heroes of Commodore.

The first finished CDTV units showed up at the World of Amiga show in New York City in April of 1991; Commodore sold their first 350 to the Amiga faithful there. A staggered roll-out followed: to five major American cities, Canada, and the Commodore stronghold of Britain in May; to France, Germany, and Italy in the summer; to the rest of the United States in time for Christmas. With CD-I now four years late, CDTV thus became the first CD-ROM-based set-top box you could actually go out and buy. Doing so would set you back just under $1000.

The Amiga community, despite being less than thrilled by the excision of all mention of their platform’s name from the product, greeted the launch with the same enthusiasm they had lavished on the Amiga 3000, their Great White Hope of the previous year, or for that matter the big Christmas marketing campaign of 1989. Amazing Computing spoke with bated breath of CDTV becoming the “standard for interactive multimedia consumer hardware.”

“Yes, but what is it for?” These prospective customers’ confusion is almost palpable.

Alas, there followed a movie we’ve already seen many times. Commodore’s marketing was ham-handed as usual, declaring CDTV “nothing short of revolutionary” but failing to describe in clear, comprehensible terms why anyone who was more interested in relaxing on the sofa than fomenting revolutions might actually want one. The determination to disassociate CDTV from the scary world of computers was so complete that the computer magazines weren’t even allowed advance models; Amiga Format, the biggest Amiga magazine in Britain at the time with a circulation of more than 160,000, could only manage to secure their preview unit by making a side deal with a CDTV developer. CDTV units were instead sent to stereo magazines, who shrugged their shoulders at this weird thing this weird computer company had sent them and returned to reviewing the latest conventional CD players. Nolan Bushnell, the alleged marketing genius who was supposed to be CDTV’s ace in the hole, talked a hyperbolic game at the trade shows but seemed otherwise disengaged, happy just to show up and give his speeches and pocket his fat paychecks. One could almost suspect — perish the thought! — that he had only taken this gig for the money.

In the face of all this, CDTV struggled mightily to make any headway at all. When CD-I hit the market just before Christmas, boasting more impressive hardware than CDTV for roughly the same price, it only made the hill that much steeper. Commodore now had a rival in a market category whose very existence consumers still obstinately refused to recognize. As an established maker of consumer electronics in good standing with the major retailers — something Commodore hadn’t been since the heyday of the Commodore 64 — Philips had lots of advantages in trying to flog their particular white elephant, not to mention an advertising budget their rival could only dream of. CD-I was soon everywhere, on store shelves and in the pages of the glossy lifestyle magazines, while CDTV was almost nowhere. Commodore did what they could, cutting the list price of CDTV to less than $800 and bundling with it The New Grolier Encyclopedia and the smash Amiga game Lemmings. It didn’t help. After an ugly Christmas season, Nolan Bushnell and the other big names all deserted the sinking ship.

Even leaving aside the difficulties inherent in trying to introduce people to an entirely new category of consumer electronics — difficulties that were only magnified by Commodore’s longstanding marketing ineptitude — CDTV had always been problematic in ways that had been all too easy for the true believers to overlook. It was clunky in comparison to CD-I, with a remote control that felt awkward to use, especially for games, and a drive which required that the discs first be placed into an external holder before being loaded into the unit proper. More fundamentally, the very re-purposing of old Amiga technology that had allowed it to beat CD-I to market made it an even more limited platform than its rival for running the sophisticated adult entertainments it was supposed to have enabled. Much of the delay in getting CD-I to market had been the product of a long struggle to find a way of doing video playback with some sort of reasonable fidelity. Even the released CD-I performed far from ideally in this area, but it did better than CDTV, which at best — at best, mind you — might be able to fill about a third of the television screen with low-resolution video running at a choppy twelve frames per second. It was going to be hard to facilitate a union of Silicon Valley and Hollywood with technology like that.

None of CDTV’s problems were the fault of the people who had created it, who had, like so many Commodore engineers before and after them, been asked to pull off a miracle on a shoestring. They had managed to create, if not quite a miracle, something that worked far better than it had a right to. It just wasn’t quite good enough to overcome the marketing issues, the competition from CD-I, and the marketplace confusion engendered by an interactive set-top box that said it wasn’t a game console but definitely wasn’t a home computer either.

CDTV could be outfitted with a number of accessories that turned it into more of a “real” computer. Still, those making software for the system couldn’t count on any of these accessories being present, which served to greatly restrict their products’ scope of possibility.

Which isn’t to say that some groundbreaking work wasn’t done by the developers who took a leap of faith on Commodore — almost always a bad bet in financial terms — and produced software for the platform. CDTV’s early software catalog was actually much more impressive than that of CD-I, whose long gestation had caused so many initially enthusiastic developers to walk away in disgust. The New Grolier Encyclopedia was a true multimedia encyclopedia; the entry for John F. Kennedy, for example, included not only a textual biography and photos to go along with it but audio excerpts from his most famous speeches. The American Heritage Dictionary also offered images where relevant, along with an audio pronunciation of every single word. American Vista: The Multimedia U.S. Atlas boasted lots of imagery of its own to add flavor to its maps, and could plan a route between any two points in the country at the click of a button. All of these things may sound ordinary today, but in a way that very modern ordinariness is a testament to what pioneering products these really were. They did in fact present an argument that, while others merely talked about the multimedia future, Commodore through CDTV was doing it — imperfectly and clunkily, yes, but one has to start somewhere.

One of the most impressive CDTV titles of all marked the return of one of the Amiga’s most beloved icons. After designing the CDTV’s menu system, the indefatigable Jim Sachs returned to the scene of his most famous creation. Really a remake rather than a sequel, Defender of the Crown II reintroduced many of the graphics and tactical complexities that had been excised from the original in the name of saving time, pairing them with a full orchestral soundtrack, digitized sound effects, and a narrator to detail the proceedings in the appropriate dulcet English accent. It was, Sachs said, “the game the original Defender of the Crown was meant to be, both in gameplay and graphics.” He did almost all of the work on this elaborate multimedia production by himself, farming out little more than the aforementioned narration, and Commodore themselves released the game, having acquired the right to do so from the now-defunct Cinemaware at auction. While, as with the original, its long-term play value is perhaps questionable, Defender of the Crown II even today still looks and sounds mouth-wateringly gorgeous.

If any one title on CDTV was impressive enough to sell the machine by itself, this ought to have been it. Unfortunately, it didn’t appear until well into 1992, by which time CDTV already had the odor of death clinging to it. The very fact that Commodore allowed the game to be billed as the sequel to one so intimately connected to the Amiga’s early days speaks to a marketing change they had instituted to try to breathe some life back into the platform.

The change was born out of an insurrection staged by Commodore’s United Kingdom branch, who always seemed to be about five steps ahead of the home office in any area you cared to name. Kelly Sumner, managing director of Commodore UK:

We weren’t involved in any of the development of CDTV technology; that was all done in America. We were taking the lead from the corporate company. And there was a concrete stance of “this is how you promote it, this is the way forward, don’t do this, don’t do that.” So, that’s what we did.

But after six or eight months we basically turned around and said, “You don’t know what you’re talking about. It ain’t going to go anywhere, and if it does go anywhere you’re going to have to spend so much money that it isn’t worth doing. So, we’re going to call it the Amiga CDTV, we’re going to produce a package with disk drives and such like, and we’re going to promote it like that. People can understand that, and you don’t have to spend so much money.”

True to their word, Commodore UK put together what they called “The Multimedia Home Computer Pack,” combining a CDTV unit with a keyboard, a mouse, an external disk drive, and the software necessary to use it as a conventional Amiga as well as a multimedia appliance — all for just £100 more than a CDTV unit alone. Commodore’s American operation grudgingly followed their lead, allowing the word “Amiga” to creep back into their presentations and advertising copy.

Very late in the day, Commodore finally began acknowledging and even celebrating CDTV’s Amigahood.

But it was too late — and not only for CDTV but in another sense for the Amiga platform itself. The great hidden cost of the CDTV disappointment was the damage it did to the prospects for CD-ROM on the Amiga proper. Commodore had been so determined to position CDTV as its own thing that they had rejected the possibility of equipping Amiga computers as well with CD-ROM drives, despite the pleas of software developers and everyday customers alike. A CD-ROM drive wasn’t officially mated to the world’s first multimedia personal computer until the fall of 1992, when, with CDTV now all but left for dead, Commodore finally started shipping an external drive that made it possible to run most CDTV software, as well as CD-based software designed specifically for Amiga computers, on an Amiga 500. Even then, Commodore provided no official CD-ROM solution for Amiga 2000 and 3000 owners, forcing them to cobble together third-party adapters that could interface with drives designed for the Macintosh. The people who owned the high-end Amiga models, of course, were the ones working in the very cutting-edge fields that cried out for CD-ROM.

It’s difficult to overstate the amount of damage the Amiga’s absence from the CD-ROM party, the hottest ticket in computing at the time, did to the platform’s prospects. It single-handedly gave the lie to every word in Harold Copperman’s 1990 speech about Commodore being “the leaders in multimedia.” Many of the most vibrant Amiga developers were forced to shift to the Macintosh or another platform by the lack of CD-ROM support. Of all Commodore’s failures, this one must loom among the largest. They allowed the Macintosh to become the platform most associated with the new era of CD-ROM-enabled multimedia computing without even bothering to contest the territory. The war was over before Commodore even realized a war was on.

Commodore’s feeble last gasp in terms of marketing CDTV positioned it as essentially an accessory to desktop Amigas, a “low-cost delivery system for multimedia” targeted at business and government rather than living rooms. The idea was that you could create presentations on Amiga computers, send them off to be mastered onto CD, then drag the CDTV along to board meetings or planning councils to show them off. In that spirit, a CDTV unit was reduced to a free toss-in if you bought an Amiga 3000 — two slow-selling products that deserved one another.

The final verdict on CDTV is about as ugly as they come: fewer than 30,000 sold worldwide in some eighteen months of trying; fewer than 10,000 sold in the American market Commodore so desperately wanted to break back into, many or most of them at fire-sale discounts after the platform’s fate was clear. In other words, the 350 CDTV units that had been sold to the faithful at that first ebullient World of Amiga show made up an alarmingly high percentage of all the CDTV units that would ever sell. (Philips, by contrast, would eventually manage to move about 1 million CD-I units over the course of about seven years of trying.)

The picture I’ve painted of the state of Commodore thus far is a fairly bleak one. Yet that bleakness wasn’t really reflected in the company’s bottom line during the first couple of years of the 1990s. For all the trouble Commodore had breaking new products in North America and elsewhere, their legacy products were still a force to be reckoned with outside the United States. Here the end of the Cold War and subsequent lifting of the Iron Curtain proved a boon. The newly liberated peoples of Eastern Europe were eager to get their hands on Western computers and computer games, but had little money to spend on them. The venerable old Commodore 64, pulling along behind it that rich catalog of thousands upon thousands of games of all stripes, was the perfect machine for these emerging markets. Effectively dead in North America and trending that way in Western Europe, it now enjoyed a new lease on life in the former Soviet sphere, its sales numbers suddenly climbing sharply again instead of falling. The Commodore 64 was, it seemed, the cockroach of computers; you just couldn’t kill it. Not that Commodore wanted to: they would happily bank every dollar their most famous creation could still earn them. Meanwhile the Amiga 500 was selling better than ever in Western Europe, where it was now the most popular single gaming platform of all, and Commodore happily banked those profits as well.

Commodore’s stock even enjoyed a short-lived bubble of sorts. In the spring and early summer of 1991, with sales strong all over Europe and CDTV poised to hit the scene, the stock price soared past $20, stratospheric heights by Commodore’s recent standards. This being Commodore, the stock collapsed below $10 again just as quickly — but, hey, it was nice while it lasted. In the fiscal year ending on June 30, 1991, worldwide sales topped the magical $1 billion mark, another height that had last been seen in the heyday of the Commodore 64. Commodore was now the second most popular maker of personal computers in Europe, with a market share of 12.4 percent, just slightly behind IBM’s 12.7 percent. The Amiga was now selling at a clip of 1 million machines per year, which would bring the total installed base to 4.5 million by the end of 1992. Of that total, 3.5 million were in Europe: 1.3 million in Germany, 1.2 million in Britain, 600,000 in Italy, 250,000 in France, 80,000 in Scandinavia. (Ironically in light of the machine’s Spanish name, one of the few places in Western Europe where it never did well at all was Spain.) To celebrate their European success, Irving Gould and Mehdi Ali took home salaries in 1991 of $1.75 million and $2.4 million respectively, the latter figure $400,000 more than the chairman of IBM, a company fifty times Commodore’s size, was earning.

But it wasn’t hard to see that Commodore, in relying on all of these legacy products sold in foreign markets, was living on borrowed time. Even in Europe, MS-DOS was beginning to slowly creep up on the Amiga as a gaming platform by 1992, while Nintendo and Sega, the two big Japanese console makers, were finally starting to take notice of this virgin territory after having ignored it for so long. While Amiga sales in Europe in 1992 remained blessedly steady, sales of the Amiga in North America were down as usual, sales of the Commodore 64 in Eastern Europe fell off thanks to economic chaos in the region, and sales of Commodore’s line of commodity PC clones cratered so badly that they pulled out of that market entirely. It all added up to a bottom line of about $900 million in total earnings for the fiscal year ending on June 30, 1992. The company was still profitable, but considerably less so than it had been the year before. Everyone was now looking forward to 1993 with more than a little trepidation.

Even as Commodore faced an uncertain future, they could at least take comfort that their arch-enemy Atari was having a much worse time of it. In the very early 1990s, Atari enjoyed some success, if not as much as they had hoped, with their Lynx handheld game console, a more upscale rival to the Nintendo Game Boy. The Atari Portfolio, a genuinely groundbreaking palmtop computer, also did fairly well for them, if perhaps not quite as well as it deserved. But the story of their flagship computing platform, the Atari ST, was less happy. The ST, already all but dead in the United States, saw its market share in Europe shrink in proportion to the Amiga’s increasing sales, such that it fell from second to third most popular gaming computer in 1991, trailing MS-DOS now as well as the Amiga.

Atari tried to remedy the slowing sales with new machines they called the STe line, which increased the color palette to 4096 shades and added a blitter chip to aid onscreen animation. (The delighted Amiga zealots at Amazing Computing wrote of these Amiga-inspired developments that they reminded them of “an Amiga 500 created by a primitive tribe that had never actually seen an Amiga, but had heard reports from missionaries of what the Amiga could do.”) But the new hardware broke compatibility with much existing software, and it only got harder to justify buying an STe instead of an Amiga 500 as the latter’s price slowly fell. Atari’s total sales in 1991 were just $285 million, down by some 30 percent from the previous year and barely a quarter of the numbers Commodore was doing. Jack Tramiel and his sons kept their heads above water only by selling off pieces of the company, such as the Taiwanese manufacturing facility that went for $40.9 million that year. You didn’t have to be an expert in the computer business to understand how unsustainable that path was. In the second quarter of 1992, Atari posted a loss of $39.8 million on sales of just $23.3 million, a rather remarkable feat in itself. Whatever else lay in store for Commodore and the Amiga, they had apparently buried old Mr. “Business is War.”

Still, this was no time to bask in the glow of sweet revenge. The question of where Commodore and the Amiga went from here was being asked with increasing urgency in 1992, and for very good reason. The answer would arrive in the latter half of the year, in the form at long last of the real, fundamental technical improvements the Amiga community had been begging for for so long. But had Commodore done enough, and had they done it in time to make a difference? Those questions loomed large as the 68000 Wars were about to enter their final phase.

(Sources: the book On the Edge: The Spectacular Rise and Fall of Commodore by Brian Bagnall; Amazing Computing of August 1987, June 1988, June 1989, July 1989, May 1990, June 1990, July 1990, August 1990, September 1990, December 1990, January 1991, February 1991, March 1991, April 1991, May 1991, June 1991, August 1991, September 1991, November 1991, January 1992, February 1992, March 1992, April 1992, June 1992, July 1992, August 1992, September 1992, November 1992, and December 1992; Info of July/August 1988 and January/February 1989; Amiga Format of July 1991, July 1995, and the 1992 annual; The One of September 1990, May 1991, and December 1991; CU Amiga of June 1992, October 1992, and November 1992; Amiga Computing of April 1992; AmigaWorld of June 1991. Online sources include Matt Barton’s YouTube interview with Jim Sachs, Sébastien Jeudy’s interview with Carl Sassenrath, Greg Donner’s Workbench Nostalgia, and Atari’s annual reports from 1989, available on archive.org. My huge thanks to reader “himitsu” for pointing me to the last and providing some other useful information on Commodore and Atari’s financials during this period in the comments to a previous article in this series. And thank you to Reichart von Wolfsheild, who took time from his busy schedule to spend a Saturday morning with me looking back on the CDTV project.)


Games on the Mersey, Part 4: The All-Importance of Graphics

The die for the first successful incarnation of Psygnosis was cast in the summer of 1987 with the release of a game called Barbarian. It was actually the company’s fourth game, following Brataccas, that underwhelming fruition of Imagine Software’s old megagame dream, and two other titles which had tried to evoke some of the magic of games from other publishers and rather resoundingly failed: Arena, an unfun alternative to the Epyx Games sports series, and Deep Space, a vaguely Elite-like game of interstellar trading and space combat saddled with a control scheme so terrible that many buyers initially thought it was a bug. None of this trio, needless to say, had done much for Psygnosis’s reputation. But with Barbarian the company’s fortunes finally began to change. It provided them at last with just the formula for commercial success they had been so desperately seeking.

Barbarian

Programmed by the redoubtable Dave Lawson, Barbarian might be labeled an action-adventure if we’re feeling generous, although it offers nothing like the sort of open-ended living world other British developers of the era were creating under that label. It rather takes the form of a linear progression through a series of discrete screens, fighting monsters and dodging traps as the titular barbarian Hegor. The control scheme — for some reason a consistent sore spot in almost every game Lawson programmed — once again annoys more than it ought to, and the game as a whole is certainly no timeless classic. What it did have going for it back in the day, however, were its superb graphics and sound. Released initially only on the Atari ST and the Commodore Amiga, just as the latter especially was about to make major inroads in Britain and Europe thanks to the new Amiga 500 model, it was one of the first games to really show what these 16-bit powerhouses could do in the context of a teenage-boy-friendly action game. Reviewers were so busy gushing about the lengthy opening animation, the “strange-looking animals,” and the “digitised groans and grunts” that accompanied each swing of Hegor’s sword as he butchered them that they barely noticed the game’s more fundamental failings.

Barbarian became the first unadulterated, undeniable hit to be created by the Imagine/Psygnosis nexus since Dave Lawson’s Arcadia had kicked everything off on the Sinclair Spectrum almost five years before. Thus was a precedent set. Out were the old dreams of revolutionizing the substance of gaming via the megagame project; in were simple, often slightly wonky action games that looked absolutely great to the teenage boys who devoured them. If Lawson and Ian Hetherington were disappointed to have abandoned more high-concept fare for simple games with spectacular visuals, they could feel gratified that, after all the years of failure and fiasco as Imagine, Finchspeed, Fireiron, and finally Psygnosis, they were at last consistently making games that made them actual money.

Psygnosis’s first games had been created entirely in-house, with much of the design and coding done by Lawson and Hetherington themselves. In the wake of Barbarian‘s success, however, that approach was changed to prioritize what was really important in them. After the games already in the pipeline at the time of Barbarian‘s release were completed, future programming and design — such as the latter was in the world of Psygnosis — would mostly be outsourced to the hotshot young bedroom coders with which Britain was so amply endowed.

Psygnosis hired far more artists than programmers as in-house employees. They built an art team that became the envy of the industry around one Garvan Corbett, a talented illustrator and animator who had come to Psygnosis out of a workfare program in the very early days, before even Brataccas had been released, and who had been responsible for the precedent-setting graphics in Barbarian. Notably, none of Psygnosis’s artists had much prior experience with computers; the company preferred to hire exceptional artists in traditional mediums and teach them what they needed to know to apply their skills to computer games. It gave Psygnosis’s games a look that was, if not quite what one might describe as more mature than the rest of the industry, certainly more striking, more polished. Working with Amigas running the games-industry stalwart Deluxe Paint, Corbett and his colleagues would build on the submissions of the outside teams to bring them in line with Psygnosis’s house style, giving the in-game graphics that final sheen for which the company was so famous whilst also adding the elaborate title screens and opening animations for which they were if anything even more famous. Such a hybrid of in-house and out-of-house development was totally unique in late-1980s game-making, but it suited Psygnosis’s style-over-substance identity perfectly. “At Psygnosis, graphics are all-important,” wrote one journalist as his final takeaway after a visit to the company. Truer words were never written.

With the assistance of an ever-growing number of outside developers, the games poured out of Psygnosis in the years after Barbarian, sporting short, punchy titles that sounded like heavy-metal bands or Arnold Schwarzenegger movies, both of which things served as a profound influence on their young developers: Terrorpods, Obliterator, Menace, Baal, Stryx, Blood Money, Ballistix, Infestation, Anarchy, Nitro, Awesome, Agony. Occasionally Psygnosis would tinker with the formula, as when they released the odd French adventure game Chrono Quest, but mostly it was nothing but relentless action played over a relentlessly thumping soundtrack. An inordinate number of Psygnosis games seemed to feature tentacled aliens that needed to be blown up with various forms of lasers and high explosives. In light of this, even the most loyal Psygnosis fan could be forgiven for finding it a little hard to keep them all straight. Many of the plots and settings of the games would arrive on the scene only after the core gameplay had been completed, when Psygnosis’s stable of artists were set loose to wrap the skeletons submitted by the outside developers in all the surrealistic gore and glory they could muster. Such a development methodology couldn’t help but lend the catalog as a whole a certain generic quality. Yet it did very, very well for the company, as the sheer number of games they were soon churning out — nearly one new game every other month by 1989 — will attest.

Psygnosis’s favored machine during this era was the Amiga, where their aesthetic maximalism could be deployed to best effect. They became known among owners of Amigas and those who wished they were as the platform’s signature European publisher, the place to go for the most impressive Amiga audiovisuals of all. This was the same space occupied by Cinemaware among the North American publishers. It thus makes for an interesting exercise to compare and contrast the two companies’ approaches.

In the context of the broader culture, few would have accused Cinemaware’s Bob Jacob, a passionate fan of vintage B-movies, of having overly sophisticated tastes. Yet, often problematic though they admittedly were in gameplay terms, Cinemaware’s games stand out next to those of Psygnosis for the way they use the audiovisual capabilities of the Amiga in the service of a considered aesthetic, whether they happen to be harking back to the Robin Hood of Errol Flynn in Defender of the Crown or the vintage Three Stooges shorts in the game of the same name. There was a coherent and unique-to-it sense of aesthetic unity behind each one of Cinemaware’s games, as indicated by the oft-mocked title Jacob created for the person tasked with bringing it all together: the “computographer,” who apparently replaced the cinematographer of a movie.

Psygnosis games, in contrast, had an aesthetic that could be summed up in the single word “more”: more explosions, more aliens, more sprites flying around, more colors. This was aesthetic maximalism at its most maximalist, where the impressiveness of the effect itself was its own justification. Psygnosis’s product-development manager John White said that “half the battle is won if the visuals are interesting.” In fact, he was being overly conservative in making that statement; for Psygnosis, making the graphics good was actually far more than half the battle that went into making a game. Ian Hetherington:

We always start with a technical quest — achieving something new with graphics. We have to satisfy ourselves that what we are trying to achieve is possible before we go ahead with a game. My worst moments are when I show innovative techniques to the Japanese, and all they want to know is, what is the plot. They don’t understand our way of going about things.

The Japanese approach, as practiced by designers like the legendary Shigeru Miyamoto, would lead to heaps of Nintendo Entertainment System games that remain as playable today as they were in their heyday. The Psygnosis approach… not so much. In fact, Psygnosis’s games have aged almost uniquely poorly among their peers. While we can still detect and appreciate the “computography” of a Cinemaware interactive movie, Psygnosis games hit us only with heaps of impressive audiovisual tricks that no longer impress. Their enormous pixels and limited color palettes — yes, even on an audiovisual powerhouse of the era like the Amiga — now make them look quaint rather than awe-inspiring. Their only hope to move us thus becomes their core gameplay — and gameplay wasn’t one of Psygnosis’s strengths. Tellingly, most Psygnosis games don’t sport credited designers at all, merely programmers and artists who cobbled together the gameplay in between implementing the special effects. An understanding of what people saw in all these interchangeable games with the generic teenage-cool titles therefore requires a real effort of imagination from anyone who wasn’t there during the games’ prime.

The success of Psygnosis’s games was inextricably bound up with the platform patriotism that was so huge a part of the computing scene of the 1980s. What with the adolescent tendency to elevate consumer lifestyle choices to the status of religion, the type of computer a kid had in his bedroom was as important to his identity as the bands he liked, the types of sports cars he favored, or the high-school social set he hung out with — or possibly all three combined. Where adults saw just another branded piece of consumer electronics, he saw a big chunk of his self-image. It was deeply, personally important to him to validate his choice by showing off his computer to best effect, preferably by making it do things of which no other computer on the market was capable. For Amiga owners in particular, the games of Psygnosis fulfilled this function better than those of any other publisher. You didn’t so much buy a Psygnosis game to play it as you did to look at it, and to throw it in the face of any of your mates who might dare to question the superiority of your Amiga. A Psygnosis game was graphics porn of the highest order.

But Psygnosis’s graphics-über-alles approach carried with it more dangers than just that of making games whose appeal would be a little incomprehensible to future generations. Nowhere was the platform patriotism that they traded on more endemic than in the so-called “scene” of software piracy, whose members had truly made the computers they chose to use the very center of their existence. And few games were more naturally tempting targets for these pirates than those of Psygnosis. After all, what you really wanted out of a Psygnosis game was just a good look at the graphics. Why pay for that quick look-see when you could copy the disk for free? Indeed, cracked versions of the games were actually more appealing than the originals in a way, for the cracking groups who stripped off the copy protection also got into the habit of adding options for unlimited lives and other “trainers.” By utilizing them, you could see everything a Psygnosis game had to offer in short order, without having to wrestle with the wonky gameplay at all.

It was partially to combat piracy that Psygnosis endeavored to make the external parts of their games’ presentations as spectacular — and as appealing to teenage sensibilities — as the graphics in the games themselves. Most of their games were sold in bloated oblong boxes easily twice the size of the typical British game box — rather ironically so, given that few games had less need than those of Psygnosis for all that space; it wasn’t as if there was a burning need for lengthy manuals or detailed background information to accompany such simple, generic shoot-em-ups. Virtually all of Psygnosis’s covers during this era were painted by Roger Dean, the well-known pop artist with whom Ian Hetherington and Dave Lawson had first established a relationship back in the Imagine days. “If you’re asking so much for a piece of software,” said Hetherington, “you can’t package it like a Brillo pad.”

Dean’s packaging artwork was certainly striking, if, like most things associated with Psygnosis during this period, a little one-note. Not a gamer himself, Dean had little real concept of the games he was assigned, for which he generally drew the cover art well before they had been completed anyway. He usually had little more to go on than the name of the game and whatever said name might happen to suggest to his own subconscious. The results were predictable; the way that Psygnosis’s covers never seemed to have anything to do with the games inside the boxes became a running joke. No matter: they looked great in their own right, just begging to be hung up in a teenager’s bedroom. In that spirit, Psygnosis took to including the cover art in poster or tee-shirt form inside many of those cavernous boxes of theirs. Whatever else one could say about them, they knew their customers.

It’s difficult to exaggerate the role that image played in every aspect of Psygnosis’s business. Some of the people who wound up making games for them, who were almost universally in the same demographic group as those who played their games, were first drawn to this particular publisher by the box art. “I was always a fan of the art style and the packaging,” remembers outside developer Martin Edmondson. “Against a sea of brightly-colored and cheap-looking game boxes the Psygnosis products stood out a mile and had an air of mystery — and quality — about them.” Richard Browne, who became a project manager at Psygnosis in early 1991, noted that “while many of the games they produced were renowned somewhat for style over substance, they just oozed quality and the presentation was just sheer class.”

The quintessential Psygnosis game of the period between 1987 and 1990 is undoubtedly the 1989 release Shadow of the Beast. The company’s most massively hyped title since the days of Brataccas and the Imagine megagames, it differed from them by rewarding its hypers with sales to match. It was the creation of a three-man development studio who called themselves Reflections, who had already created an earlier hit for Psygnosis in the form of Ballistix. Unusually among Psygnosis’s outside developers, Reflections created all of their own graphics, outsourcing only the music and of course the Roger Dean cover art. Martin Edmondson, the leader of the Reflections trio, makes no bones about their priorities when creating Shadow of the Beast:

For me, it was about the technical challenge of getting something working that was far beyond the theoretical limits of the machine. It wasn’t a story originally. It was a technical demonstration of what the machine could do — I suppose sort of a “look how clever we are” type of thing. So, yeah… less interested in the subtleties of game design and game process, more “let’s see if we can do what these big expensive arcade machines are doing on our small home computer.”

Edmondson describes the book that most inspired Shadow of the Beast not as any work of fiction but rather as the Amiga Technical Reference Manual. Every element of the game was crafted with the intention of pushing the hardware described therein to heights that had only been hinted at even in Psygnosis’s own earlier works. Barely two years after Shadow of the Beast’s release, Edmondson’s assessment of its design — or perhaps lack thereof — was unflinching: “We couldn’t get away with it now. It was definitely a case of being in the right place at the right time. Apart from how many colors and layers of parallax and monsters we could squeeze on screen, no thought went into it whatsoever.” Its general approach is similar to that of the earlier Barbarian, but it’s a much more constrained experience than even that game had been. As the landscape scrolls behind your running avatar, you have to execute a series of rote moves with pinpoint timing to avoid seeing him killed. It’s brutally difficult, and difficult in ways that really aren’t much fun at all.

But, as Edmondson himself stated, to complain overmuch about the gameplay of Shadow of the Beast is to rather miss the point of the game. While plenty of praise would be given to the atmospheric soundtrack Psygnosis commissioned from veteran game composer David Whittaker and to the unprecedentedly huge sprites programmed by Reflections, the most attention of all would be paid to the thirteen layers of parallax scrolling that accompany much of the action.

Parallax scrolling is a fancy phrase used to describe the simulation of a real-world property so ingrained in us that we rarely even realize it exists — until, that is, we see a videogame that doesn’t implement it. Imagine you’re standing at the edge of a busy highway on a flat plain, with a second highway also in view beyond this one, perhaps half a kilometer in the distance. Cars on the highway immediately before you whiz by very quickly, perhaps almost too quickly to track with the eyes. Those on the distant highway, however, appear to move through your field of view relatively slowly, even though they’re traveling at roughly the same absolute speed as those closer to you. This difference is known as the parallax effect.

Because the real world we live in is an analog one, the parallax effect here has infinite degrees of subtle shading. But videogames which implemented parallax in the 1980s usually did so on only one or two rigid levels, resulting in scrolling landscapes that, while they may have looked better than those showing no parallax effect at all, nevertheless had their own artificial quality. Shadow of the Beast, however, uses its thirteen separate parallax layers to approach the realm of the analog, producing an effect that feels startlingly real in contrast to any of its peers. As you watch YouTube creator Phoenix Risen’s partial playthrough of the trained version of Shadow of the Beast below — playing a non-trained version of this game is an exercise for masochists only — be sure to take note of the scrolling effects; stunning though they were in their day, they’re like much else that used to impress in vintage Psygnosis games in that they’re all too easy to overlook entirely today.
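The arithmetic behind the effect is simple to sketch in code. Below is a minimal, hypothetical illustration in Python — nothing like the hand-tuned 68000 assembly and custom-chip tricks Reflections actually used on the Amiga — in which each background layer is assigned a depth factor between 0 and 1, and its horizontal draw offset is the camera position scaled by that factor, so that distant layers drift past more slowly than near ones:

```python
# Minimal sketch of multi-layer parallax scrolling (illustrative Python only;
# the layer count and depth factors here are assumptions for demonstration).

def parallax_offsets(camera_x, depth_factors, layer_width):
    """Compute the horizontal pixel offset at which each background layer
    should be drawn. A smaller depth factor means a 'farther away' layer
    that scrolls more slowly; a factor of 1.0 moves at full camera speed
    (the foreground). Offsets wrap around the layer's tile width."""
    return [int(camera_x * f) % layer_width for f in depth_factors]

# Thirteen layers, from distant background (slow) to foreground (full speed),
# echoing Shadow of the Beast's famous thirteen layers of parallax.
factors = [round(0.05 + i * (0.95 / 12), 3) for i in range(13)]

offsets = parallax_offsets(camera_x=320, depth_factors=factors, layer_width=640)
```

The hard part on 1989-era hardware was not this arithmetic, which is trivial, but redrawing (or hardware-scrolling) thirteen independent layers within the strict per-frame budget of the machine — which is why most contemporaries settled for one or two layers.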


Whether such inside baseball as the number of layers of parallax scrolling really ought to be the bedrock of a game’s reputation is perhaps debatable, but such was the nature of the contemporary gaming beast upon which Shadow of the Beast so masterfully capitalized. Edmondson has admitted that implementing the thirteen-layer scheme consumed so much of the Amiga’s considerable power that there was very little left over to implement an interesting game even had he and his fellow developers been more motivated to do so.

Psygnosis sold this glorified tech demo for fully £35, moving into the territory Imagine had proposed to occupy with the megagames back in the day. This time, though, they had a formula for success at that extreme price point. “Shadow of the Beast just went to show that you don’t need quality gameplay to sell a piece of software,” wrote a snarky reviewer from the magazine The One. Zzap! described it only slightly more generously, as “very nice to look at, very tough to play, and very expensive.” Whatever its gameplay shortcomings, Shadow of the Beast became the most iconic Amiga game since Cinemaware’s Defender of the Crown, the ultimate argument to lay before your Atari ST-owning schoolmate. “Within a week or so of launch they could barely press enough disks to keep up with demand,” remembers Edmondson. For the Imagine veterans who had stayed the course at Psygnosis, it had to feel like the sweetest of vindications.

One can’t help but admire Psygnosis’s heretofore unimagined (ha!) ability to change. They had managed to execute a complete about-face, shedding the old Imagine legacy of incompetence and corruption. In addition to being pretty poor at making games people actually wanted to play, Imagine had been staggeringly, comprehensively bad at all of the most fundamental aspects of running a business, whilst also being, to put it as gently as possible, rather ethically challenged to boot. They had had little beyond audacity going for them. Few would have bet that Psygnosis, with two of their three leaders the very same individuals who had been responsible for the Imagine debacle, would have turned out any different. And yet, here they were.

It’s important to note that the transition from Imagine to Psygnosis encompassed much more than just hitting on a winning commercial formula. Ian Hetherington and Dave Lawson, with the aid of newcomer Jonathan Ellis, rather changed their whole approach to doing business, creating a sustainable company this time around that conducted itself with a measure of propriety and honesty. Whereas Imagine had enraged and betrayed everyone with whom they ever signed a contract, Psygnosis developed a reputation — whatever you thought of their actual games — as a solid partner, employer, and publisher. There would never be anything like the external scandals and internal backstabbing that had marked Imagine’s short, controversial life. Indeed, none of the many outside developers with whom Psygnosis did deals ever seemed to have a bad word to say about them. Hetherington’s new philosophy was to “back the talent, not the product.” Said talent was consistently supported, consistently treated fairly and even generously. Outside developers felt truly valued, and they reciprocated with fierce loyalty.

As people interested in the processes of history, we naturally want to understand to what we can attribute this transformation. Yet that question is ironically made more difficult to answer by another aspect of said transformation: rather than making a constant spectacle of themselves as had Imagine, Psygnosis became a low-key, tight-lipped organization when it came to their personalities and their internal politics, preferring to let their games, their advertising, and all those gorgeous Roger Dean cover paintings speak for them. Neither Hetherington, Lawson, nor Ellis spoke publicly on a frequent basis, and full-on interviews with them were virtually nonexistent in the popular or the trade press.

That said, we can hazard a few speculations about how this unlikely transformation came to be. The obvious new variable in the equation is Jonathan Ellis, a quietly competent businessman of exactly the type Imagine had always so conspicuously lacked; his steady hand at the wheel must have made a huge difference to the new company. If we’re feeling kind, we might also offer some praise to Ian Hetherington, who showed every sign of having learned from his earlier mistakes — a far less common human trait than one might wish it was — and having revamped his approach to match his hard-won wisdom.

If we’re feeling a little less kind, we might note that Dave Lawson’s influence at Psygnosis steadily waned after Brataccas was completed, with Hetherington and Ellis coming to the fore as the real leaders of the company. He left entirely in 1989, giving as his public reason his dissatisfaction with Psygnosis’s new focus on outside rather than in-house development. He went on to form a short-lived development studio of his own called Kinetica, who released just one unsuccessful game before disbanding. And after that, Dave Lawson left the industry for good, to join his old Imagine colleague Mark Butler in modern obscurity.

A rare shot of Ian Hetherington, left, with his business partner Jonathan Ellis, right, in 1991. (In the middle is Gary Bracey of Imagine’s old Commercial Breaks companion Ocean Software.) In contrast to the look-at-me! antics of the Imagine days, Hetherington and Ellis were always among the most elusive of games-industry executives, preferring to let the products — and their superb marketing department — speak for themselves.

None of this is to say that there were no traces of the Imagine that had been to be found in the Psygnosis that was by 1989. On the contrary: for all the changes Hetherington and Ellis wrought, the new Psygnosis still evinced much of the Imagine DNA. To see it, one needed look no further than the Roger Dean cover art, a direct legacy of Imagine’s megagame dream. Indeed, the general focus on hype and style was as pronounced as ever at Psygnosis, albeit executed in a far more sophisticated, ethical, and sustainable fashion. One might even say that Hetherington and Ellis built Psygnosis by discarding all the scandalous aspects of Imagine while retaining and building upon the visionary aspects. Sometimes the legacy could be subtle rather than obvious. For example, the foreign distribution network that had been set up by Bruce Everiss to fuel Imagine’s first rush of success was also a key part of the far more long-lived success enjoyed by Psygnosis. Hetherington and Ellis never forgot Everiss’s lesson that having better distribution than the competition — having more potential customers to sell to — could make all the difference. As home computers spread like wildfire across Western Europe during the latter half of the 1980s, Psygnosis games were there early and in quantity to reap the benefits. In 1989, Jonathan Ellis estimated that France and West Germany alone made up 60 percent of Psygnosis’s sales.

Psygnosis knew that many of their potential customers, particularly in less well-off countries, weren’t lucky enough to own Amigas. Thus, while Psygnosis games were almost always born on Amigas, they were ported widely thereafter. Like those of Cinemaware, Psygnosis games gained a cachet merely from being associated with the Amiga, the computer everyone recognized as the premier game machine of the time — even if some were, for reasons of that afore-described platform patriotism, reluctant to acknowledge the fact out loud. Even the most audiovisually spectacular Psygnosis experiences, like Shadow of the Beast, were duly ported to the humble likes of the Commodore 64 and Sinclair Spectrum, where they paled in comparison to their Amiga antecedents but sold well anyway on the strength of the association. This determination to meet the mass market wherever it lived also smacked of Imagine — albeit, yet again, being far more competently executed.

The South Harrington Building, Liverpool, home of Psygnosis for several years from 1989.

In addition to cutting ties with Dave Lawson in 1989, Hetherington and Ellis also shed original investor and Liverpool big wheel Roger Talbot Smith that year, convincing him to give up his share of the company in return for royalty payments on all Psygnosis sales over the next several years. Yet, and even as their network of outside developers and contractors spread across Britain and beyond, Psygnosis’s roots remained firmly planted in Liverpudlian soil. They moved out of their dingy offices behind Smith’s steel foundry and into a stylish gentrified shipping warehouse known as the South Harrington Building. It lay just one dock over from the city’s new Beatles museum, a fact that must have delighted any old Imagine stalwarts still hanging about the place. While an American journalist visiting Psygnosis in early 1991 could still pronounce Liverpool as a whole to be “very grim,” they were certainly doing their bit to change the economic picture.

Meanwhile Ian Hetherington in particular was developing a vision for Psygnosis’s future — indeed, for the direction that games in general must go. Like many of his counterparts in the United States, he saw the CD-ROM train coming down the track early. “The technological jump is exponential,” he said, “which means you have to make the jump now — otherwise when CD happens you’re going to be ten years behind, not two, and you’re never going to catch up.” With the move to the larger South Harrington Building offices, he set up an in-house team to research CD-ROM applications and develop techniques that could be utilized when the time was right by Psygnosis’s collection of loyal outside developers.

In this interest in CD-ROM — still a rare preoccupation among British publishers, where consumer-computing technology lagged two or three years behind the United States — Psygnosis once again invited comparison with that most obvious point of comparison, the American publisher Cinemaware. Yet there were important differences between the two companies on this front as well. While Cinemaware invested millions into incorporating real-world video footage into their games, Hetherington rejected the fad of “interactive video” which was all the rage in the United States at the time. His point of view reads as particularly surprising given that so many interactive-video zealots of the immediate future would be accused of making pretty but empty games — exactly what Psygnosis was so often accused of in the here and now. It serves perhaps as further evidence of Hetherington’s ability to learn and to evolve. Hetherington:

Interactive video is a farce. It is ill-conceived and it doesn’t work. It is seductive, though. Trying to interact with £400,000 worth of video on disc is a complete fiasco. We are looking for alternative uses of CD. You have to throw your existing thinking in the bin, then go sit in the middle of a field and start again from scratch. Most developers will evolve into CD. They will go from 5 MB products to 10 MB products with studio-quality soundtracks, and that will be what characterizes CD products for the next few years.

Hetherington also differed from Cinemaware head Bob Jacob in making sure his push into CD-ROM was, in keeping with the new operational philosophy behind Psygnosis in general, a sustainable one. Rather than betting the farm like his American counterpart and losing his company when the technology didn’t mature as quickly as anticipated, Hetherington used the ongoing sales from Psygnosis’s existing premium, high-profit-margin games — for all their audiovisual slickness, these simple action games really didn’t cost all that much to make — to fund steady, ongoing work in figuring out what CD-ROM would be good for, via an in-house team known as the Advanced Technology Group. When the advanced technology was finally ready, Psygnosis would be ready as well. At risk of belaboring the point, I will just note one last time how far this steady, methodical, reasoned approach to gaming’s future was from the pie-in-the-sky dreaming of the Imagine megagames.

The Psygnosis Advanced Technology Group in 1991. Standing second from left in the back row is John Gibson — the media’s favorite “Granddad” himself — who after spending some years with Denton Designs wound up rejoining some other old Imagine mates at Psygnosis.

As the 1990s began, then, Psygnosis had a lot going for them, both in the Amiga-dominated present and in the anticipated CD-ROM-dominated future. One thing they still lacked, though — one thing that Imagine also had resoundingly failed to produce — was a single game to their name that could be unequivocally called a classic. It was perhaps inevitable that one of the company’s stable of outside developers, nurtured and supported as they were, must eventually break away from the established Psygnosis gameplay formulas and deliver said classic. Still, such things are always a surprise when they happen. And certainly when Psygnosis did finally get their classic, the magnitude of its success would come as a shock even to them.

(Sources: the book Grand Thieves and Tomb Raiders: How British Videogames Conquered the World by Rebecca Levene and Magnus Anderson; the extra interviews that accompanied the documentary film From Bedrooms to Billions; Retro Gamer 50; Computer and Video Games of October 1987; STart of June 1990; The One of July 1989, March 1990, September 1990 and February 1992; The Game Machine of October 1989 and August 1990; Transactor of August 1989; Next Generation of November 1995; the online articles “From Lemmings to Wipeout: How Ian Hetherington Incubated Gaming Success” from Polygon, “The Psygnosis Story: John White, Director of Software” from Edge Online, and “An Ode to the Owl: The Inside Story of Psygnosis” from Push Square.)


Posted by on September 29, 2017 in Digital Antiquaria, Interactive Fiction


Ultima VI

After Richard Garriott and his colleagues at Origin Systems finished each Ultima game — after the manic final crunch of polishing and testing, after the release party, after the triumphant show appearances and interviews in full Lord British regalia — there must always arise the daunting question of what to do next. Garriott had set a higher standard for the series than that of any of its competitors almost from the very beginning, when he’d publicly declared that no Ultima would ever reuse the engine of its predecessor, that each new entry in the series would represent a significant technological leap over what had come before. And just to add to that pressure, starting with Ultima IV he’d begun challenging himself to make each new Ultima a major thematic statement that also built on what had come before. Both of these bars became harder and harder to meet as the series advanced.

As if that didn’t present enough of a burden, each individual entry in the series came with its own unique psychological hurdles for Garriott to overcome. For example, by the time he started thinking about what Ultima V should be he’d reached the limits of what a single talented young man like himself could design, program, write, and draw all by himself on his trusty Apple II. It had taken him almost a year — a rather uncomfortable year for his brother Robert and the rest of Origin’s management — to accept that reality and to begin to work in earnest on Ultima V with a team of others.

The challenge Garriott faced after finishing and releasing that game in March of 1988 was in its way even more emotionally fraught: the challenge of accepting that, just as he’d reached the limits of what he could do alone on the Apple II a couple of years ago, he’d now reached the limits of what any number of people could do on Steve Wozniak’s humble little 8-bit creation. Ultima V still stands today as one of the most ambitious things anyone has ever done on an Apple II; it was hard at the time and remains hard today to imagine how Origin could possibly push the machine much further. Yet that wasn’t even the biggest problem associated with sticking with the platform; the biggest problem could be seen on each monthly sales report, which showed the Apple II’s numbers falling off even faster than those of the Commodore 64, the only other viable 8-bit computer remaining in the American market.

After serving as the main programmer on Ultima V, John Miles’s only major contribution to Ultima VI was the opening sequence. The creepy poster of a pole-dancing centaur hanging on the Avatar’s wall back on Earth has provoked much comment over the years…

Garriott was hardly alone at Origin in feeling hugely loyal to the Apple II, the only microcomputer he’d ever programmed. While most game developers in those days ported their titles to many platforms, almost all had one which they favored. Just as Epyx knew the Commodore 64 better than anyone else, Sierra had placed their bets on MS-DOS, and Cinemaware was all about the Commodore Amiga, Origin was an Apple II shop through and through. Of the eleven games they’d released from their founding in 1983 through to the end of 1988, all but one had been born and raised on an Apple II.

Reports vary on how long and hard Origin tried to make Ultima VI work on the Apple II. Richard Garriott, who does enjoy a dramatic story even more than most of us, has claimed that Origin wound up scrapping nine or even twelve full months of work; John Miles, who had done the bulk of the programming for Ultima V and was originally slated to fill the same role for the sequel, estimated to me that “we probably spent a few months on editors and other utilities before we came to our senses.” At any rate, by March of 1989, the one-year anniversary of Ultima V’s release, the painful decision had been made to switch not only Ultima VI but all of Origin’s ongoing and future projects to MS-DOS, the platform that was shaping up as the irresistible force in American computer gaming. A slightly petulant but nevertheless resigned Richard Garriott slapped an Apple sticker over the logo of the anonymous PC clone now sitting on his desk and got with the program.

Richard Garriott with an orrery, one of the many toys he kept at the recently purchased Austin house he called Britannia Manor.

Origin was in a very awkward spot. Having frittered away a full year recovering from the strain of making the previous Ultima, trying to decide what the next Ultima should be, and traveling down the technological cul de sac that was now the Apple II, they simply had to have Ultima VI finished — meaning designed and coded from nothing on an entirely new platform — within one more year if the company was to survive. Origin had never had more than a modestly successful game that wasn’t an Ultima; the only way their business model worked was if Richard Garriott every couple of years delivered a groundbreaking new entry in their one and only popular franchise and it sold 200,000 copies or more.

John Miles, lacking a strong background in MS-DOS programming and the C language in which all future Ultimas would be coded, was transferred off the team to get himself up to speed and, soon enough, to work on middleware libraries and tools for the company’s other programmers. Replacing him on the project in Origin’s new offices in Austin, Texas, were Herman Miller and Cheryl Chen, a pair of refugees from the old offices in New Hampshire, which had finally been shuttered completely in January of 1989. It was a big step for both of them to go from coding what until quite recently had been afterthought MS-DOS versions of Origin’s games to taking a place at the center of the most critical project in the company. Fortunately, both would prove more than up to the task.

Just as Garriott had quickly learned to like the efficiency of not being personally responsible for implementing every single aspect of Ultima V, he soon found plenty to like about the switch to MS-DOS. The new platform had four times the memory of the Apple II machines Origin had been targeting before, along with (comparatively) blazing-fast processors, hard drives, 256-color VGA graphics, sound cards, and mice. A series that had been threatening to burst the seams of the Apple II now had room to roam again. For the first time with Ultima VI, time rather than technology was the primary constraint on Garriott’s ambitions.

But arguably the real savior of Ultima VI was not a new computing platform but a new Origin employee: one Warren Spector, who would go on to join Garriott and Chris Roberts — much more on him in a future article — as one of the three world-famous game designers to come out of the little collective known as Origin Systems. Born in 1955 in New York City, Spector had originally imagined for himself a life in academia as a film scholar. After earning his Master’s from the University of Texas in 1980, he’d spent the next few years working toward his PhD and teaching undergraduate classes. But he had also discovered tabletop gaming at university, from Avalon Hill war games to Dungeons & Dragons. When a job as a research archivist which he’d thought would be his ticket to the academic big leagues unexpectedly ended after just a few months, he wound up as an editor and eventually a full-fledged game designer at Steve Jackson Games, maker of card games, board games, and RPGs, and a mainstay of Austin gaming circles. It was through Steve Jackson, like Richard Garriott a dedicated member of Austin’s local branch of the Society for Creative Anachronism, that Spector first became friendly with the gang at Origin; he also discovered Ultima IV, a game that had a profound effect on him. He left Austin in March of 1987 for a sojourn in Wisconsin with TSR, the makers of Dungeons & Dragons, but, jonesing for the warm weather and good barbecue of the city that had become his adopted hometown, he applied for a job with Origin two years later. Whatever role his acquaintance with Richard Garriott and some of the other folks there played in getting him an interview, it certainly didn’t get him a job all by itself; Spector claims that Dallas Snell, Robert Garriott’s right-hand man running the business side of the operation, grilled him for an incredible nine hours before judging him worthy of employment. (“May you never have to live through something like this just to get a job,” he wishes for all and sundry.) Starting work at Origin on April 12, 1989, he was given the role of producer on Ultima VI, the high man on the project totem pole excepting only Richard Garriott himself.

Age 33 and married, Spector was one of the oldest people employed by this very young company; he realized to his shock shortly after his arrival that he had magazine subscriptions older than Origin’s up-and-coming star Chris Roberts. A certain wisdom born of his age, along with a certain cultural literacy born of all those years spent in university circles, would serve Origin well in the seven years he would remain there. Coming into a company full of young men who had grand dreams of, as their company’s tagline would have it, “creating worlds,” but whose cultural reference points didn’t usually reach much beyond Lord of the Rings and Star Wars, Spector was able to articulate Origin’s ambitions for interactive storytelling in a way that most of the others could not, and in time would use his growing influence to convince management of the need for a real, professional writing team to realize those ambitions. In the shorter term — i.e., in the term of the Ultima VI project — he served as some badly needed adult supervision, systematizing the process of development by providing everyone on his team with clear responsibilities and by providing the project as a whole with the when and what of clear milestone goals. The project was so far behind that everyone involved could look forward to almost a year of solid crunch time as it was; Spector figured there was no point in making things even harder by letting chaos reign.

On the Ultima V project, it had been Dallas Snell who had filled the role of producer, but Snell, while an adept organizer and administrator, wasn’t a game designer or a creative force by disposition. Spector, though, proved himself capable of tackling the Ultima VI project from both sides, hammering out concrete design documents from the sometimes abstracted musings of Richard Garriott, then coming up with clear plans to bring them to fruition. In the end, the role he would play in the creation of Ultima VI was as important as that of Garriott himself. Having learned to share the technical burden with Ultima V — or by now to pass it off entirely; he never learned C and would never write a single line of code for any commercial game ever again — Garriott was now learning to share the creative burden as well, another necessary trade-off if his ever greater ambitions for his games were to be realized.

If you choose not to import an Ultima V character into Ultima VI, you go through the old Ultima IV personality test, complete with gypsy soothsayer, to come up with your personal version of the Avatar. By this time, however, with the series getting increasingly plot-heavy and the Avatar’s personality ever more fleshed-out within the games, the personality test was starting to feel a little pointless. Blogger Chet Bolingbroke, the “CRPG Addict,” cogently captured the problems inherent in insisting that all of these disparate Ultima games had the same hero:
 
Then there’s the Avatar. Not only is it unnecessary to make him the hero of the first three games, as if the Sosarians and Britannians are so inept they always need outside help to solve their problems, but I honestly think the series should have abandoned the concept after Ultima IV. In that game, it worked perfectly. The creators were making a meta-commentary on the very nature of playing role-playing games. The Avatar was clearly meant to be the player himself or herself, warped into the land through the “moongate” of his or her computer screen, represented as a literal avatar in the game window. Ultima IV was a game that invited the player to act in a way that was more courageous, more virtuous, more adventurous than in the real world. At the end of the game, when you’re manifestly returned to your real life, you’re invited to “live as an example to thine own people” — to apply the lesson of the seven virtues to the real world. It was brilliant. They should have left it alone.
 
Already in Ultima V, though, they were weakening the concept. In that game, the Avatar is clearly not you, but some guy who lives alone in his single-family house of a precise layout. But fine, you rationalize, all that is just a metaphor for where you actually do live. By Ultima VI, you have some weird picture of a pole-dancing centaur girl on your wall; you’re inescapably a white male with long brown hair.

Following what had always been Richard Garriott’s standard approach to making an Ultima, the Ultima VI team concentrated on building their technology and then building a world around it before adding a plot or otherwise trying to turn it all into a real game with a distinct goal. Garriott and others at Origin would always name Times of Lore, a Commodore 64 action/CRPG hybrid written by Chris Roberts and published by Origin in 1988, as the main influence on the new Ultima VI interface, the most radically overhauled version of same ever to appear in an Ultima title. That said, it should be noted that Times of Lore itself lifted many or most of its own innovations from The Faery Tale Adventure, David Joiner’s deeply flawed but beautiful and oddly compelling Commodore Amiga action/CRPG of 1987. By way of completing the chain, much of Times of Lore‘s interface was imported wholesale into Ultima VI; even many of the onscreen icons looked exactly the same. The entire game could now be controlled, if the player liked, with a mouse, with all of the keyed commands duplicated as onscreen buttons; this forced Origin to reduce the “alphabet soup” of previous Ultima interfaces, which by Ultima V had used every letter of the alphabet plus some additional key combinations, to just ten buttons, with the generic “use” as the workhorse taking the place of a multitude of specifics.

Another influence, one which Origin was for obvious reasons less eager to publicly acknowledge than that of Times of Lore, was FTL’s landmark 1987 CRPG Dungeon Master, a game whose influence on its industry can hardly be overstated. John Miles remembers lots of people at Origin scrambling for time on the company’s single Atari ST in order to play it soon after its release. Garriott himself has acknowledged being “ecstatic” for his first few hours playing it at all the “neat new things I could do.” Origin co-opted Dungeon Master‘s graphical approach to inventory management, including the soon-to-be ubiquitous “paper doll” method of showing what characters were wearing and carrying.

Taking a cue from theories about good interface design dating back to Xerox PARC and Apple’s Macintosh design team, The Faery Tale Adventure, Times of Lore, and Dungeon Master had all abandoned “modes”: different interfaces — in a sense entirely different programs — which take over as the player navigates through the game. The Ultima series, like most 1980s CRPGs, had heretofore been full of these modes. There was one mode for wilderness travel; another for exploring cities, towns, and castles; another, switching from a third-person overhead view to a first-person view like Wizardry (or, for that matter, Dungeon Master), for dungeon delving. And when a fight began in any of these modes, the game switched to yet another mode for resolving the combat.

Ultima VI collapsed all of these modes down into a single unified experience. Wilderness, cities, and dungeons now all appeared on a single contiguous map on which combat also occurred, alongside everything else possible in the game; Ultima‘s traditionally first-person dungeons were now displayed using an overhead view like the rest of the game. From the standpoint of realism, this was a huge step back; speaking in strictly realistic terms, either the previously immense continent of Britannia must now be about the size of a small suburb or the Avatar and everyone else there must now be giants, building houses that sprawled over dozens of square miles. But, as we’ve had plenty of occasion to discuss in previous articles, the most realistic game design doesn’t always make the best game design. From the standpoint of creating an immersive, consistent experience for the player, the new interface was a huge step forward.

As the world of Britannia had grown more complex, the need to give the player a unified window into it had grown to match, in ways that were perhaps more obvious to the designers than they might have been to the players. The differences between the first-person view used for dungeon delving and the third-person view used for everything else had become a particular pain. Richard Garriott had this to say about the problems that were already dogging him when creating Ultima V, and the changes he thus chose to make in Ultima VI:

Everything that you can pick up and use [in Ultima V] has to be able to function in 3D [i.e., first person] and also in 2D [third person]. That meant I had to either restrict the set of things players can use to ones that I know I can make work in 3D or 2D, or make them sometimes work in 2D but not always work in 3D or vice versa, or they will do different things in one versus the other. None of those are consistent, and since I’m trying to create an holistic world, I got rid of the 3D dungeons.

Ultima V had introduced the concept of a “living world” full of interactive everyday objects, along with characters who went about their business during the course of the day, living lives of their own. Ultima VI would build on that template. The world was still constructed, jigsaw-like, from piles of tile graphics, an approach dating all the way back to Ultima I. Whereas that game had offered 16 tiles, however, Ultima VI offered 2048, all or almost all of them drawn by Origin’s most stalwart artist, Denis Loubet, whose association with Richard Garriott stretched all the way back to drawing the box art for the California Pacific release of Akalabeth. Included among these building blocks were animated tiles of several frames — so that, for instance, a water wheel could actually spin inside a mill and flames in a fireplace could flicker. Dynamic, directional lighting of the whole scene was made possible by the 256 colors of VGA. While Ultima V had already had a day-to-night cycle, in Ultima VI the sun actually rose in the east and set in the west, and torches and other light sources cast a realistic glow onto their surroundings.
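The animated-tile scheme described above is simple enough to sketch. The following is my own minimal illustration, not Origin’s actual code: a world map is just a grid of tile indices, and an “animated” tile is a short list of frames cycled on a global clock. All names here (`FLOOR`, `FIRE`, the frame labels) are hypothetical.

```python
class Tile:
    def __init__(self, name, frames):
        self.name = name
        self.frames = frames  # a single entry means a static tile

    def frame_at(self, tick):
        # Pick which frame to draw for the current global animation tick.
        return self.frames[tick % len(self.frames)]

# Hypothetical tile set: a static floor tile and a two-frame flickering fire.
FLOOR = Tile("floor", ["floor_0"])
FIRE = Tile("fireplace", ["fire_0", "fire_1"])

def render(world, tick):
    """Resolve each map cell to the frame it should show this tick."""
    return [[cell.frame_at(tick) for cell in row] for row in world]

world = [[FLOOR, FIRE],
         [FLOOR, FLOOR]]
print(render(world, 0))  # the fireplace shows "fire_0"
print(render(world, 1))  # the fireplace shows "fire_1"
```

The appeal of the approach is that the renderer never needs to know which tiles animate; flicker and spin fall out of the same lookup that draws everything else.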

256 of the 2048 tiles from which the world of Ultima VI was built.

In a clear signal of where the series’s priorities now lay, other traditional aspects of CRPGs were scaled back, moving the series further from its roots in tabletop Dungeons & Dragons. Combat, having gotten as complicated and tactical as it ever would with Ultima V, was simplified, with a new “auto-combat” mode included for those who didn’t want to muck with it at all; the last vestiges of distinct character races and classes were removed; ability scores were boiled down to just three numbers for Strength, Dexterity, and Intelligence. The need to mix reagents in order to cast spells, one of the most mind-numbingly boring aspects of a series that had always made you do far too many boring things, was finally dispensed with; I can’t help but imagine legions of veteran Ultima players breathing a sigh of relief when they read in the manual that “the preparation of a spell’s reagents is performed at the moment of spellcasting.” The dodgy parser-based conversation system of the last couple of games, which had required you to try typing in every noun mentioned by your interlocutor on the off chance that it would elicit vital further information, was made vastly less painful by the simple expedient of highlighting in the text those subjects into which you could inquire further.
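The keyword-highlighting idea is worth pausing on, since it quietly killed an entire genre of frustration. A rough sketch of the concept, assuming nothing about Origin’s actual implementation (the NPC data and marker style here are invented for illustration):

```python
# Hypothetical NPC data: topics the player can ask about, mapped to replies.
responses = {
    "gargoyles": "They pour from fissures in the earth!",
    "shrine": "The Shrine of Compassion lies east of here.",
}

def highlight(line, topics):
    """Mark any word the player can inquire about further.

    A real engine would render these words in a different color; here we
    just wrap them in asterisks.
    """
    words = []
    for word in line.split():
        key = word.strip(".,!?").lower()
        words.append(f"*{word}*" if key in topics else word)
    return " ".join(words)

print(highlight("Ask me of the gargoyles or the shrine.", responses))
```

Instead of guessing nouns blindly, the player sees at a glance exactly which threads of a conversation can be pulled.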

Inevitably, these changes didn’t always sit well with purists, then or now. Given the decreasing interest in statistics and combat evinced by the Ultima series as time went on, as well as the increasing emphasis on what we might call solving the puzzles of its ever more intricate worlds, some have accused later installments of the series of being gussied-up adventure games in CRPG clothing; “the last real Ultima was Ultima V” isn’t a hard sentiment to find from a vocal minority on the modern Internet. What gives the lie to that assertion is the depth of the world modeling, which makes these later Ultimas flexible in ways that adventure games aren’t. Everything found in the world has, at a minimum, a size, a weight, and a strength. Say, then, that you’re stymied by a locked door. There might be a set-piece solution for the problem in the form of a key you can find, steal, or trade for, but it’s probably also possible to beat the door down with a sufficiently big stick and a sufficiently strong character, or if all else fails to blast it open with a barrel of dynamite. Thus your problems can almost never become insurmountable, even if you screw up somewhere else. Very few other games from Ultima VI‘s day made any serious attempt to venture down this path. Infocom’s Beyond Zork tried, somewhat halfheartedly, and largely failed at it; Sierra’s Hero’s Quest was much more successful at it, but on nothing like the scale of an Ultima. Tellingly, almost all of the “alternate solutions” to Ultima VI‘s puzzles emerge organically from the simulation, with no designer input whatsoever. Richard Garriott:

I start by building a world which you can interact with as naturally as possible. As long as I have the world acting naturally, if I build a world that is prolific enough, that has as many different kinds of natural ways to act and react as possible, like the real world does, then I can design a scenario for which I know the end goal of the story. But exactly whether I have to use a key to unlock the door, or whether it’s an axe I pick up to chop down the door, is largely irrelevant.
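Garriott’s key-or-axe point can be made concrete with a toy model. This is my own sketch, under the assumption described in the article that every object carries generic properties like strength: once that is true, a locked door admits solutions nobody scripted.

```python
class Door:
    def __init__(self, strength, key_id):
        self.strength = strength   # how much punishment it can absorb
        self.key_id = key_id
        self.open = False

    def unlock(self, item_id):
        # The set-piece solution: use the matching key.
        if item_id == self.key_id:
            self.open = True
        return self.open

    def bash(self, force):
        # The emergent solution: enough accumulated force breaks it down.
        self.strength -= force
        if self.strength <= 0:
            self.open = True
        return self.open

door = Door(strength=30, key_id="brass_key")
door.unlock("rusty_key")   # wrong key: the door stays shut
door.bash(20)              # a big stick and a strong character...
door.bash(20)              # ...and the door gives way
print(door.open)  # → True
```

The designer only placed the door and the key; the bashed-open door emerges from the simulation itself, which is exactly the flexibility the article credits to the later Ultimas.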

The complexity of the world model was such that Ultima VI became the first installment that would let the player get a job to earn money in lieu of the standard CRPG approach of killing monsters and taking their loot. You can buy a sack of grain from a local farmer, take the grain to a mill and grind it into flour, then sell the flour to a baker — or sneak into his bakery at night to bake your own bread using his oven. Even by the standards of today, the living world inside Ultima VI is a remarkable achievement — not to mention a godsend to those of us bored with killing monsters; you can be very successful in Ultima VI whilst doing very little killing at all.

A rare glimpse of Origin’s in-house Ultima VI world editor, which looks surprisingly similar to the game itself.

Plot spoilers begin!

It wasn’t until October of 1989, just five months before the game absolutely, positively had to ship, that Richard Garriott turned his attention to the Avatar’s reason for being in Britannia this time around. The core idea behind the plot came to him during a night out on Austin’s Sixth Street: he decided he wanted to pitch the Avatar into a holy war against enemies who, in classically subversive Ultima fashion, turn out not to be evil at all. In two or three weeks spent locked together alone in a room, subsisting on takeout Chinese food, Richard Garriott and Warren Spector created the “game” part of Ultima VI from this seed, with Spector writing it all down in a soy-sauce-bespattered notebook. Here Spector proved himself more invaluable than ever. He could corral Garriott’s sometimes unruly thoughts into a coherent plan on the page, whilst offering plenty of contributions of his own. And he, almost uniquely among his peers at Origin, commanded enough of Garriott’s respect — was enough of a creative force in his own right — that he could rein in the bad and/or overambitious ideas that in previous Ultimas would have had to be attempted and proved impractical to their originator. Given the compressed development cycle, this contribution too was vital. Spector:

An insanely complicated process, plotting an Ultima. I’ve written a novel, I’ve written [tabletop] role-playing games, I’ve written board games, and I’ve never seen a process this complicated. The interactions among all the characters — there are hundreds of people in Britannia now, hundreds of them. Not only that, but there are hundreds of places and people that players expect to see because they appeared in five earlier Ultimas.

Everybody in the realm ended up being a crucial link in a chain that adds up to this immense, huge, wonderful, colossal world. It was a remarkably complicated process, and that notebook was the key to keeping it all under control.

The chain of information you follow in Ultima VI is, it must be said, far clearer than in any of the previous games. Solving this one must still be a matter of methodically talking to everyone and assembling a notebook full of clues — i.e., of essentially recreating Garriott and Spector’s design notebook — but there are no outrageous intuitive leaps required this time out, nor any vital clues hidden in outrageously out-of-the-way locations. For the first time since Ultima I, a reasonable person can reasonably be expected to solve this Ultima without turning it into a major life commitment. The difference is apparent literally from your first moments in the game: whereas Ultima V dumps you into a hut in the middle of the wilderness — you don’t even know where in the wilderness — with no direction whatsoever, Ultima VI starts you in Lord British’s castle, and your first conversation with him immediately provides you with your first leads to run down. From that point forward, you’ll never be at a total loss for what to do next as long as you do your due diligence in the form of careful note-taking. Again, I have to attribute much of this welcome new spirit of accessibility and solubility to the influence of Warren Spector.

Ultima VI pushes the “Gargoyles are evil!” angle hard early on, going so far as to have the seemingly demonic beasts nearly sacrifice you to whatever dark gods they worship. This of course only makes the big plot twist, when it arrives, all the more shocking.

At the beginning of Ultima VI, the Avatar — i.e., you — is called back to Britannia from his homeworld of Earth yet again by the remarkably inept monarch Lord British to deal with yet another crisis which threatens his land. Hordes of terrifyingly demonic-looking Gargoyles are pouring out of fissures which have opened up in the ground everywhere and making savage war upon the land. They’ve seized and desecrated the eight Shrines of Virtue, and are trying to get their hands on the Codex of Ultimate Wisdom, the greatest symbol of your achievements in Ultima IV.

But, in keeping with the shades of gray the series had begun to layer over the Virtues with Ultima V, nothing is quite as it seems. In the course of the game, you discover that the Gargoyles have good reason to hate and fear humans in general and you the Avatar in particular, even if those reasons are more reflective of carelessness and ignorance on the part of you and Lord British’s peoples than they are of malice. To make matters worse, the Gargoyles are acting upon a religious prophecy — conventional religion tends to take a beating in Ultima games — and have come to see the Avatar as nothing less than the Antichrist in their own version of the Book of Revelation. As your understanding of their plight grows, your goal shifts from that of ridding the land of the Gargoyle scourge by violent means to that of walking them back from attributing everything to a foreordained prophecy and coming to a peaceful accommodation with them.

Ultima VI‘s subtitle, chosen very late in the development process, is as subtly subversive as the rest of the plot. Not until very near the end of the game do you realize that The False Prophet is in fact you, the Avatar. As the old cliché says, there are two sides to every story. Sadly, the big plot twist was already spoiled by Richard Garriott in interviews before Ultima VI was even released, so vanishingly few players have ever gotten to experience its impact cold.

When discussing the story of Ultima VI, we shouldn’t ignore the real-world events that were showing up on the nightly news while Garriott and Spector were writing it. Mikhail Gorbachev had just made the impossibly brave decision to voluntarily dissolve the Soviet empire and let its vassal states go their own way, and just like that the Cold War had ended, not in the nuclear apocalypse so many had anticipated as its only possible end game but rather in the most blessed of all anticlimaxes in human history. For the first time in a generation, East was truly meeting West again, and each side was discovering that the other wasn’t nearly as demonic as they had been raised to believe. On the night of November 9, 1989, just as Garriott and Spector were finishing their design notebook, an irresistible tide of mostly young people burst through Berlin’s forbidding Checkpoint Charlie to greet their counterparts on the other side, as befuddled guards, the last remnants of the old order, looked on and wondered what to do. It was a time of extraordinary change and hope, and the message of Ultima VI resonated with the strains of history.

Plot spoilers end.

When Garriott and Spector emerged from their self-imposed quarantine, the first person to whom they gave their notebook was an eccentric character with strong furry tendencies who had been born as David Shapiro, but who was known to one and all at Origin as Dr. Cat. Dr. Cat had been friends with Richard Garriott for almost as long as Denis Loubet, having first worked at Origin for a while when it was still being run out of Richard’s parents’ garage in suburban Houston. A programmer by trade — he had done the Commodore 64 port of Ultima V — Dr. Cat was given the de facto role of head writer for Ultima VI, apparently because he wasn’t terribly busy with anything else at the time. Over the next several months, he wrote most of the dialog for most of the many characters the Avatar would need to speak with in order to finish the game, parceling the remainder of the work out among a grab bag of other programmers and artists, whoever had a few hours or days to spare.

Origin Systems was still populating its games with jokey cameos drawn from Richard Garriott’s friends, colleagues, and family as late as Ultima VI. Thankfully, this along with other aspects of the “programmer text” syndrome would finally end with the next installment in the series, for which a real professional writing team would come aboard. More positively, do note the keyword highlighting in the screenshot above, which spared players untold hours of aggravating noun-guessing.

Everyone at Origin felt the pressure by now, but no one carried a greater weight on his slim shoulders than Richard Garriott. If Ultima VI flopped, or even just wasn’t a major hit, that was that for Origin Systems. For all that he loved to play His Unflappable Majesty Lord British in public, Garriott was hardly immune to the pressure of having dozens of livelihoods dependent on what was at the end of the day, no matter how much help he got from Warren Spector or anyone else, his game. His stress tended to go straight to his stomach. He remembers being in “constant pain”; sometimes he’d just “curl up in the corner.” Having stopped shaving or bathing regularly, strung out on caffeine and junk food, he looked more like a homeless man than a star game designer — much less a regal monarch — by the time Ultima VI hit the homestretch. On the evening of February 9, 1990, with the project now in the final frenzy of testing, bug-swatting, and final-touch-adding, he left Origin’s offices to talk to some colleagues having a smoke just outside. When he opened the security door to return, a piece of the door’s apparatus — in fact, an eight-pound chunk of steel — fell off and smacked him in the head, opening up an ugly gash and knocking him out cold. His panicked colleagues, who at first thought he might be dead, rushed him to the emergency room. Once he had had his head stitched up, he set back to work. What else was there to do?

Ultima VI shipped on time in March of 1990, two years almost to the day after Ultima V, and Richard Garriott’s fears (and stomach cramps) were soon put to rest; it became yet another 200,000-plus-selling hit. Reviews were uniformly favorable if not always ecstatic; it would take Ultima fans, traditionalists that so many of them were, a while to come to terms with the radically overhauled interface that made this Ultima look so different from the Ultimas of yore. Not helping things were the welter of bugs, some of them of the potentially showstopping variety, that the game shipped with (in years to come Origin would become almost as famous for their bugs as for their ambitious virtual world-building). In time, most if not all old-school Ultima fans were comforted as they settled in and realized that at bottom you tackled this one pretty much like all the others, trekking around Britannia talking to people and writing down the clues they revealed until you put together all the pieces of the puzzle. Meanwhile Origin gradually fixed the worst of the bugs through a series of patch disks which they shipped to retailers to pass on to their customers, or to said customers directly if they asked for them. Still, both processes did take some time, and the reaction to this latest Ultima was undeniably a bit muted — a bit conflicted, one might even say — in comparison to the last few games. It perhaps wasn’t quite clear yet where or if the Ultima series fit on these newer computers in this new decade.

Both the muted critical reaction and that sense of uncertainty surrounding the game have to some extent persisted to this day. Firmly ensconced though it apparently is in the middle of the classic run of Ultimas, from Ultima IV through Ultima VII, that form the bedrock of the series’s legacy, Ultima VI is the least cherished of that cherished group today, the least likely to be named as the favorite of any random fan. It lacks the pithy justification for its existence that all of the others can boast. Ultima IV was the great leap forward, the game that dared to posit that a CRPG could be about more than leveling up and collecting loot. Ultima V was the necessary response to its predecessor’s unfettered idealism; the two games together can be seen to form a dialog on ethics in the public and private spheres. And, later, Ultima VII would be the pinnacle of the series in terms not only of technology but also, and even more importantly, in terms of narrative and thematic sophistication. But where does Ultima VI stand in this group? Its plea for understanding rather than extermination is as important and well-taken today as it’s ever been, yet its theme doesn’t follow as naturally from Ultima V as that game’s had from Ultima IV, nor is it executed with the same sophistication we would see in Ultima VII. Where Ultima VI stands, then, would seem to be on a somewhat uncertain no man’s land.

Indeed, it’s hard not to see Ultima VI first and foremost as a transitional work. On the surface, that’s a distinction without a difference; every Ultima, being part of a series that was perhaps more than any other in the history of gaming always in the process of becoming, is a bridge between what had come before and what would come next. Yet in the case of Ultima VI the tautology feels somehow uniquely true. The graphical interface, huge leap though it is over the old alphabet soup, isn’t quite there yet in terms of usability. It still lacks a drag-and-drop capability, for instance, to make inventory management and many other tasks truly intuitive, while the cluttered onscreen display combines vestiges of the old, such as a scrolling textual “command console,” with this still imperfect implementation of the new. The prettier, more detailed window on the world is welcome, but winds up giving such a zoomed-in view in the half of a screen allocated to it that it’s hard to orient yourself. The highlighted keywords in the conversation engine are also welcome, but are constantly scrolling off the screen, forcing you to either lawnmower through the same conversations again and again to be sure not to miss any of them or to jot them down on paper as they appear. There’s vastly more text in Ultima VI than in any of its predecessors, but perhaps the kindest thing to be said about Dr. Cat as a writer is that he’s a pretty good programmer. All of these things would be fixed in Ultima VII, a game — or rather games; there were actually two of them, for reasons we’ll get to when the time comes — that succeeded in becoming everything Ultima VI had wanted to be. To use the old playground insult, everything Ultima VI can do Ultima VII can do better. One thing I can say, however, is that the place the series was going would prove so extraordinary that it feels more than acceptable to me to have used Ultima VI as a way station en route.

But in the even more immediate future for Origin Systems was another rather extraordinary development. This company that the rest of the industry jokingly referred to as Ultima Systems would release the same year as Ultima VI a game that would blow up even bigger than this latest entry in the series that had always been their raison d’être. I’ll tell that improbable story soon, after a little detour into some nuts and bolts of computer technology that were becoming very important — and nowhere more so than at Origin — as the 1990s began.

(Sources: the books Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, The Official Book of Ultima, Second Edition by Shay Addams, and Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector; ACE of April 1990; Questbusters of November 1989, January 1990, March 1990, and April 1990; Dragon of July 1987; Computer Gaming World of March 1990 and June 1990; Origin’s in-house newsletter Point of Origin of August 7 1991. Online sources include Matt Barton’s interviews with Dr. Cat and Warren Spector’s farewell letter from the Wing Commander Combat Information Center‘s document archive. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Ultima VI is available for purchase from GOG.com in a package that also includes Ultima IV and Ultima V.)
