
Blade Runner

Blade Runner has set me thinking about the notion of a “critical consensus.” Why should we have such a thing at all, and why should it change over time?

Ridley Scott’s 1982 film Blade Runner is an adaptation of Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?, about a police officer cum bounty hunter — a “blade runner” in street slang — of a dystopian near-future whose job is to “retire” android “replicants” of humans whose existence on Earth is illegal. The movie had a famously troubled gestation, full of time and budget overruns, disputes between Scott and his investors, and an equally contentious relationship between the director and his leading man, Harrison Ford. When it was finally finished, the first test audiences were decidedly underwhelmed, such that Scott’s backers demanded that the film be recut, with the addition of a slightly hammy expository voice-over and a cheesy happy-ending epilogue which was cobbled together quickly using leftover footage from, of all movies, Stanley Kubrick’s The Shining.

It didn’t seem to help. The critical consensus on the released version ranged along a continuum from ambivalence to outright hostility. Roger Ebert’s faint praise was typically damning: “I was never really interested in the characters in Blade Runner. I didn’t find them convincing. What impressed me in the film was the special effects, the wonderful use of optical trickery to show me a gigantic imaginary Los Angeles, which in the vision of this movie has been turned into sort of a futuristic Tokyo. It’s a great movie to look at, but a hard one to care about. I didn’t appreciate the predictable story, the standard characters, the cliffhanging clichés… but I do think the special effects make Blade Runner worth going to see.” Pauline Kael was less forgiving of what she saw as a cold, formless, ultimately pointless movie: “If anybody comes around with a test to detect humanoids, maybe Ridley Scott and his associates should hide. With all the smoke in this movie, you feel as if everyone connected with it needs to have his flue cleaned.” Audiences do not always follow the critics’ lead, but in this case they largely did. During its initial theatrical run, Blade Runner fell well short of earning back the $30 million it had cost to make.

Yet remarkably soon after it had disappeared from theaters, its rehabilitation got underway in fannish circles. In 1984, William Gibson published his novel Neuromancer, the urtext of a new “cyberpunk” movement in science fiction that began in printed prose but quickly spiraled out from there into comics, television, and games. Whereas Blade Runner‘s dystopic Los Angeles looked more like Tokyo than any contemporary American city, much of Gibson’s book actually took place in Japan. The two neon-soaked nighttime cityscapes were very much of a piece. The difference was that Gibson added to the equation a computer-enabled escape from reality known as cyberspace, creating a combination that would prove almost irresistibly alluring to science-fiction fans as the computer age around them continued to evolve apace.

Blade Runner‘s rehabilitation spread to the mainstream in 1992, when a “director’s cut” of the film was re-released in theaters, lacking the Captain Obvious voice-over or the tacked-on happy ending but sporting a handful of new scenes that added fresh layers of nuance to the story. Critics — many of them the very same critics who had dismissed the movie a decade earlier — now rushed to praise it as a singular cinematic vision and a science-fiction masterpiece. They found many reasons for its box-office failure on the first go-round, even beyond the infelicitous changes that Ridley Scott had been forced by his backers to make to it. For one thing, it had been unlucky enough to come out just one month after E.T. the Extra-Terrestrial, the biggest box-office smash of all time to that point, whose long shadow was as foreboding and unforgiving a place to dwell as any of Blade Runner‘s own urban landscapes. Then, too, the audience was conditioned back then to see Harrison Ford as Han Solo or Indiana Jones — a charming rogue with a heart of gold, not the brooding, morally tormented cop Rick Deckard, who has a penchant for rough sex and a habit of shooting women in the back. In light of all this, surely the critics too could be forgiven for failing to see the film’s genius the first time they were given the chance.

Whether we wish to forgive them or not, I find it fascinating that a single film could generate such polarized reactions only ten years apart from people who study the medium for a living. The obvious riposte to my sense of wonder is, of course, that the Blade Runner of 1992 really wasn’t the same film at all as the one that had been seen in 1982. Yet I must confess to considerable skepticism about this as a be-all, end-all explanation. It seems to me that, for all that the voice-over and forced happy ending did the movie as a whole no favors, they were still a long way from destroying the qualities that made Blade Runner distinct.

Some of my skepticism may arise from the fact that I’m just not onboard with the most vaunted aspect of the director’s cut, its subtle but undeniable insinuation that Deckard is himself a replicant with implanted memories, no different from the androids he hunts down and kills. This was not the case in Philip K. Dick’s novel, nor was it the original intention of the film’s scriptwriters. I rather suspect, although I certainly cannot prove it, that even Ridley Scott’s opinion on the subject was more equivocal during the making of the film than it has since become. David Peoples, one of the screenwriters, attributes the genesis of the idea in Scott’s mind to an overly literal reading on his part of a philosophical meditation on free will and the nature of human existence in an early draft of the script. Peoples:

I invented a kind of contemplative voice-over for Deckard. Here, let me read it to you:

“I wondered who designs the ones like me and what choices we really have, and which ones we just think we have. I wondered which of my memories were real and which belonged to someone else. The great Tyrell [the genius inventor and business magnate whose company made the replicants] hadn’t designed me, but whoever had hadn’t done so much better. In my own modest way, I was a combat model.”

Now, what I’d intended with this voice-over was mostly metaphysical. Deckard was supposed to be philosophically questioning himself about what it was that made him so different from Rachael [a replicant with whom he falls in love or lust, and eventually runs away with in the “happy-ending” version of the film] and the other replicants. He was supposed to be realizing that, on the human level, they weren’t so different. That Deckard wanted the same things the replicants did. The “maker” he was referring to wasn’t Tyrell. It was supposed to be God. So, basically, Deckard was just musing about what it meant to be human.

But then, Ridley… well, I think Ridley misinterpreted me. Because right about this period of time, he started announcing, “Ah-ha! Deckard’s a replicant! What brilliance!” I was sort of confused by this response, because Ridley kept giving me all this praise and credit for this terrific idea. It wasn’t until many years later, when I happened to be browsing through this draft, that I suddenly realized the metaphysical material I had written could just as easily have been read to imply that Deckard was a replicant, even though it wasn’t what I meant at all. What I had meant was, we all have a maker, and we all have an incept date [a replicant’s equivalent to a date of birth]. We just can’t address them. That’s one of the similarities we had to the replicants. We couldn’t go find Tyrell, but Tyrell was up there somewhere. For all of us.

So, what I had intended as kind of a metaphysical speculation, Ridley had read differently, but now I realize there was nothing wrong with this reading. That confusion was my own fault. I’d written this voice-over so ambiguously that it could indeed have meant exactly what Ridley took it to mean. And that, I think, is how the whole idea of Deckard being a replicant came about.

The problem I have with Deckard being a replicant is that it undercuts the thematic resonance of the story. In the book and the movie, the quality of empathy, or a lack thereof, is described as the one foolproof way to distinguish real from synthetic humans. To establish which is which, blade runners like Deckard use something called the Voight-Kampff test, in which suspects are hooked up to a polygraph-like machine which measures their emotional response to shockingly transgressive statements, starting with stuff like “my briefcase is made out of supple human-baby skin” and getting steadily worse from there. Real humans recoil, intuitively and immediately. Replicants can try to fake the appropriate emotional reaction — might even be programmed to fake it to themselves, such that even they don’t realize what they are — but there is always a split-second delay, which the trained operator can detect.

The central irony of the film is that cops like Deckard are indoctrinated to have absolutely no empathy for the replicants they track down and murder, even as many of the replicants we meet evince every sign of genuinely caring for one another, leading one to suspect that the Voight-Kampff test may not be measuring pure, unadulterated empathy in quite the way everyone seems to think it is. The important transformation that Deckard undergoes, which eventually brings his whole world down around his head, is that of allowing himself to feel the pain and fear of those he hunts. He is a human who rediscovers and re-embraces his own humanity, who finally begins to understand that meting out suffering and death to other feeling creatures is no way to live, no matter how many layers of justification and dogma his actions are couched within.

But in Ridley Scott’s preferred version of the film, the central theme falls apart, to be replaced with psychological horror’s equivalent of a jump scare: “Deckard himself is really a replicant, dude! What a mind fuck, huh?” For this reason, it’s hard for me to see the director’s cut as a holistically better movie than the 1982 cut, which at least leaves some more room for debate about the issue.

This may explain why I’m lukewarm about Blade Runner as a whole, why none of the cuts — and there have been a lot of them by now — quite works for me. As often happens in cases like this one, I find that my own verdict on Blade Runner comes down somewhere between the extremes of then and now. There’s a lot about Roger Ebert’s first hot-take that still rings true to me all these years later. It’s a stunning film in terms of atmosphere and audiovisual composition; I defy anyone to name a movie with a more breathtaking opening shot than the panorama of nighttime Tokyo… er, Los Angeles that opens this one. Yet it’s also a distant and distancing, emotionally displaced film that aspires to a profundity it doesn’t completely earn. I admire many aspects of its craft enormously and would definitely never discourage anyone from seeing it, but I just can’t bring myself to love it as much as so many others do.

The opening shot of Blade Runner the movie.

These opinions of mine will be worth keeping in mind as we move on now to the 1997 computer-game adaptation of Blade Runner. For, much more so than is the case even with most licensed games, your reaction to this game might be difficult to separate from your reaction to the movie.


Thanks to the complicated, discordant circumstances of its birth, Blade Runner had an inordinate number of vested interests even by Hollywood standards, such that a holding company known as The Blade Runner Partnership was formed just to administer them. When said company started to shop the property around to game publishers circa 1994, the first question on everyone’s lips was what had taken them so long. The film’s moody, neon-soaked aesthetic, if not its name, had been seen in games for years by that point, so much so that it had already become something of a cliché. Just among the games I’ve written about on this site, Rise of the Dragon, Syndicate, System Shock, Beneath a Steel Sky, and the Tex Murphy series all spring to mind as owing more than a small debt to the movie. And there are many, many more that I haven’t written about.

Final Fantasy VII is another on the long list of 1990s games that owes more than a little something to Blade Runner. It’s hard to imagine its perpetually dark, polluted, neon-soaked city of Midgar ever coming to exist without the example of Blade Runner’s Los Angeles. Count it as just one more way in which this Japanese game absorbed Western cultural influences and then reflected them back to their point of origin, much as the Beatles once put their own spin on American rock and roll and sold it back to the country of its birth.

Meanwhile the movie itself was still only a cult classic in the 1990s; far more gamers could recognize and enjoy the gritty-cool Blade Runner aesthetic than had actually seen its wellspring. Blade Runner was more of a state of mind than it was a coherent fictional universe in the way of other gaming perennials like Star Trek and Star Wars. Many a publisher therefore concluded that they could have all the Blade Runner they needed without bothering to pay for the name.

Thus the rights holders worked their way down through the hierarchy of publishers, beginning with the prestigious heavy hitters like Electronic Arts and Sierra and continuing into the ranks of the mid-tier imprints, all without landing a deal. Finally, they found an interested would-be partner in the financially troubled Virgin Interactive.

The one shining jewel in Virgin’s otherwise tarnished crown was Westwood Studios, the pioneer of the real-time-strategy genre that was on the verge of becoming one of the two hottest in all of gaming. And one of the founders of Westwood was a fellow named Louis Castle, who listed Blade Runner as his favorite movie of all time. His fandom was such that Westwood probably did more than they really needed to in order to get the deal. Over a single long weekend, the studio’s entire art department pitched in to meticulously recreate the movie’s bravura opening shots of dystopic Los Angeles. It did the trick; the Blade Runner contract was soon given to Virgin and Westwood. It also established, for better or for worse, the project’s modus operandi going forward: a slavish devotion not just to the film’s overall aesthetic but to the granular details of its shots and sets.

The opening shot of Blade Runner the game.

Thanks to the complicated tangle of legal rights surrounding the film, Westwood wasn’t given access to any of its tangible audiovisual assets. Undaunted, they endeavored to recreate almost all of them on the monitor screen for themselves by using pre-rendered 3D backgrounds combined with innovative real-time lighting effects; these were key to depicting the flashing neon and drifting rain and smoke that mark the film. The foreground actors were built from motion-captured human models, then depicted onscreen using voxels, collections of tiny cubes in a 3D space, essentially pixels with an added Z-dimension of depth.
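For readers who have never encountered the term, the voxel idea is simple enough to sketch. What follows is a minimal, purely illustrative bit of Python (nothing to do with Westwood's actual renderer, whose internals I can't speak to), showing a voxel as a colored cube at an integer (x, y, z) position, plus a naive perspective projection onto a 2D screen:

```python
from dataclasses import dataclass


@dataclass
class Voxel:
    # A voxel is essentially a pixel with a depth coordinate: a small
    # colored cube at integer position (x, y, z) in a 3D grid.
    x: int
    y: int
    z: int
    color: tuple  # (r, g, b)


def project(v: Voxel, screen_distance: float = 256.0):
    """Project a voxel onto a 2D screen with simple perspective:
    points farther away (larger z) shrink toward the origin."""
    scale = screen_distance / (screen_distance + v.z)
    return (round(v.x * scale), round(v.y * scale))


# A voxel at depth 0 projects to its own (x, y) position...
print(project(Voxel(100, 50, 0, (255, 255, 255))))    # (100, 50)
# ...while the same point pushed back in z moves toward the origin.
print(project(Voxel(100, 50, 256, (255, 255, 255))))  # (50, 25)
```

The appeal for a game like this one is that a cloud of such cubes can be rotated and lit as a true 3D object, which flat sprites cannot.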

At least half of what you see in the Blade Runner game is lifted straight from the movie, which Westwood pored over literally frame by frame in order to include even the tiniest details, the sorts of things that no ordinary moviegoer would ever notice. The Westwood crew took a trip from their Las Vegas offices to Los Angeles to measure and photograph the locations where the film had been shot, the better to get it all exactly correct. Even the icy, synth-driven soundtrack for the movie was deconstructed, analyzed, and then mimicked in the game, note by ominous note.

The two biggest names associated with the film, Ridley Scott and Harrison Ford, were way too big to bother with a project like this one, but a surprising number of the other actors agreed to voice their parts and to allow themselves to be digitized and motion-captured. Among them were Sean Young, who had played Deckard’s replicant love interest Rachael; Edward James Olmos, who had played his enigmatic pseudo-partner Gaff; and Joe Turkel, who had played Eldon Tyrell, the twisted genius who invented the replicants. Set designers and other behind-the-scenes personnel were consulted as well.

It wasn’t judged practical to clone the movie’s plot in the same way as its sights and sounds, if for no other reason than the absence of Harrison Ford; casting someone new in the role of Deckard would have been, one senses, more variance than Westwood’s dedication to re-creation would have allowed. Instead they came up with a new story that could play out in the seams of the old one, happening concurrently with the events of the film, in many of the same locations and involving many of the same characters. Needless to say, its thematic concerns too would be the same as those of the film — and, yes, its protagonist cop as well would eventually be given reason to doubt his own humanity. His name was McCoy, another jaded gumshoe transplanted from a Raymond Chandler novel into an equally noirish future. But was he a “real” McCoy?

Westwood promised great things in the press while Blade Runner was in development: a truly open-world game taking place in a living, breathing city, full of characters that went about their own lives and pursued their own agendas, whose response to you in the here and now would depend to a large degree on how you had treated them and their acquaintances and enemies in the past. There would be no fiddly puzzles for the sake of them; this game would expect you to think and act like a real detective, not as the typical adventure-game hero with an inventory full of bizarre objects waiting to be put to use in equally bizarre ways. To keep you on your toes and add replay value — the lack of which was always the adventure genre’s Achilles heel as a commercial proposition — the guilty parties in the case would be randomly determined, so that no two playthroughs would ever be the same. And there would be action elements too; you would have to be ready to draw your gun at almost any moment. “There’s actually very little action in the film,” said Castle years later, “but when it happens, it’s violent, explosive, and deadly. I wanted to make a game where the uncertainty of what’s going to happen makes you quiver with anticipation every time you click the mouse.”

As we’ll soon see, most of those promises would be fulfilled only partially, but that didn’t keep Blade Runner from becoming a time-consuming, expensive project by the standards of its era, taking two years to make and costing about $2 million. It was one of the last times that a major, mainstream American studio swung for the fences with an adventure game, a genre that was soon to be relegated to niche status, with budgets and sales expectations to match.

In fact, Blade Runner’s commercial performance was among the reasons that down-scaling took place. Despite a big advertising push on Virgin Interactive’s part, it got lost in the shuffle among The Curse of Monkey Island, Riven, and Zork: Grand Inquisitor, three other swansongs of the AAA adventure game that all competed for a dwindling market share during the same holiday season of 1997. Reviews were mixed, often expressing a feeling I can’t help but share: what was ultimately the point of so slavishly re-creating another work of art if you weren’t going to add much of anything of your own to it? “The perennial Blade Runner images are here, including the winking woman in the Coca-Cola billboard and vehicles flying over the flaming smokestacks of the industrial outskirts,” wrote GameSpot. “Unfortunately, most of what’s interesting about the game is exactly what was interesting about the film, and not much was done to extend the concepts or explore them any further.” Computer and Video Games magazine aptly called it “more of a companion to the movie than a game.” Most gamers shrugged and moved on to the next title on the shelf; Blade Runner sold just 15,000 copies in the month of its release.[1]Louis Castle has often claimed in later decades that Blade Runner did well commercially, stating at least once that it sold 1 million copies(!). I can’t see how this could possibly have been the case; I’ve learned pretty well over my years of researching these histories what a million-selling game looked like in the 1990s, and can say very confidently that it did not look like this one. Having said that, though, let me also say that I don’t blame him for inflating the figures. It’s not easy to pour your heart and soul into something and not have it do well. So, as the press of real data and events fades into the past, the numbers start to go up. This doesn’t make Castle dishonest so much as it just makes him human.

As the years went by, however, a funny thing happened. Blade Runner never faded completely from the collective gamer consciousness like so many other middling efforts did. It continued to be brought up in various corners of the Internet, became a fixture of an “abandonware” scene whose rise preceded that of back-catalog storefronts like GOG.com, became the subject of retrospectives and think pieces on major gaming sites. Finally, in spite of the complications of its licensing deal, it went up for sale on GOG.com in 2019. Then, in 2022, Night Dive Studios released an “enhanced” edition. It seems safe to say today that many more people have played Westwood’s Blade Runner since the millennium than did so before it. The critical consensus surrounding it has shifted as well. As of this writing, Blade Runner is rated by the users of MobyGames as the 51st best adventure game of all time — a ranking that doesn’t sound so impressive at first, until you realize that it’s slightly ahead of such beloved icons of the genre as LucasArts’s Monkey Island 2 and Indiana Jones and the Fate of Atlantis.[2]This chart in general is distorted greatly by the factor of novelty; many or most of the highest-ranking games are very recent ones, rated in the first blush of excitement following their release. I trust that I need not belabor the parallels with the reception history of Ridley Scott’s movie. In this respect as well as so many others, the film and the game seem joined at the hip. And the latter wouldn’t have it any other way.


In all my years of writing these histories, I’m not sure I’ve ever come across a game that combines extremes of derivation and innovation in quite the way of Westwood’s Blade Runner. While there is nary an original idea to be found in the fiction, the gameplay has if anything too many of them.

I’ve complained frequently in the past that most alleged mystery games aren’t what they claim to be at all, that they actually solve the mystery for you while you occupy your time with irrelevant lock-and-key puzzles and the like. Louis Castle and his colleagues at Westwood clearly had the same complaints; there are none of those irrelevancies here. Blade Runner really does let you piece together its clues for yourself. You feel like a real cop — or at least a television one — when you, say, pick out the license plate of a car on security-camera footage, then check the number in the database of the near-future’s equivalent to the Department of Motor Vehicles to get a lead. Even as it’s rewarding, the game is also surprisingly forgiving in its investigative aspects, not an adjective that’s frequently applied to adventures of this period. There are a lot of leads to follow, and you don’t need to notice and run down all of them to make progress in your investigation. At its best, then, this game makes you feel smart — one of the main reasons a lot of us play games, if we’re being honest.

Those problems that do exist here arise not from the developers failing to do enough, but rather from trying to do too much. There’s an impossibly baroque “clues database” that purports to aid you in tying everything together. This experiment in associative, cross-referenced information theory would leave even Ted Nelson scratching his head in befuddlement. Thankfully, it isn’t really necessary to engage with it at all. You can keep the relevant details in your head, or at worst in your trusty real-world notepad, easily enough.

If you can make any sense of this, you’re a better detective than I am.

Features like this one seem to be artifacts of that earlier, even more conceptually ambitious incarnation of Blade Runner that was promoted in the press while the game was still being made.[3]Louis Castle’s own testimony contradicts this notion as well. He has stated in various interviews that “Blade Runner is as close as I have ever come to realizing a design document verbatim.” I don’t wish to discount his words out of hand, but boy, does this game ever strike me, based on pretty long experience in studying these things, as being full of phantom limbs that never got fully wired into the greater whole. I decided in the end that I had to call it like I see it in this article. As I noted earlier, this was to have been a game that you could play again and again, with the innocent and guilty parties behind the crime you investigated being different each time. It appears that, under the pressure of time, money, and logistics, that concept got boiled down to randomizing which of the other characters are replicants and which are “real” humans, but not changing their roles in the story in response to their status in any but some fairly cosmetic ways. Then, too, the other characters were supposed to have had a great deal of autonomy, but, again, the finished product doesn’t live up to this billing. In practice, what’s left of this aspiration is more of an annoyance than anything else. While the other characters do indeed move around, they do so more like subway trains on a rigid schedule than independent human actors. When the person you need to speak to isn’t where you’ve gone to find him, all you can do is go away and return later. This leads to tedious rounds of visiting the same locations again and again, hoping someone new will turn up to jog the plot forward. While this may not be all that far removed from the nature of much real police work, it’s more realism than I for one need.
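The randomization that did survive is easy to sketch in the abstract: at the start of each playthrough, mark a random subset of the suspects as replicants while leaving their scripted roles untouched. The Python below is purely hypothetical, with invented names and none of the game's actual logic, just to show how little such a scheme changes on its own:

```python
import random


def assign_replicants(suspects, num_replicants, seed=None):
    """Randomly mark some suspects as replicants at the start of a
    playthrough. Their roles in the plot stay fixed either way; only
    their human-or-replicant status (and a few endings) differs."""
    rng = random.Random(seed)  # seedable, so a playthrough is reproducible
    replicants = set(rng.sample(suspects, num_replicants))
    return {name: name in replicants for name in suspects}


# Invented, generic names; each new seed yields a different mix.
cast = ["suspect-1", "suspect-2", "suspect-3", "suspect-4", "suspect-5"]
identities = assign_replicants(cast, num_replicants=2, seed=7)
print(sum(identities.values()))  # always exactly 2 replicants
```

Note what the sketch makes obvious: because the cast and their scenes never change, the randomness touches only a boolean per character, which is exactly why it feels so cosmetic in play.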

This was also to have been an adventure game that you could reasonably play without relying on saving and restoring, taking your lumps and rolling with the flow. Early on, the game just about lives up to this ideal. At one point, you chase a suspect into a dark alleyway where a homeless guy happens to be rooting through a dumpster. It’s damnably easy in the heat of the moment to shoot the wrong person. If you do so — thus committing a crime that counts as murder, unlike the “retiring” of a replicant — you have the chance to hide the body and continue on your way; life on the mean streets of Los Angeles is a dirty business, regardless of the time period. Even more impressively, you might stumble upon your victim’s body again much later in the game, popping up out of the murk like an apparition from your haunted conscience. If you didn’t kill the hobo, on the other hand, you might meet him again alive.

But sadly, a lot of this sort of thing as well falls away as the game goes on. The second half is rife with learning-by-death moments that would have done the Sierra of the 1980s proud, full of people and creatures jumping out of the shadows and killing you without warning. Hope you have a save file handy, says the game. The joke’s on you!

By halfway through, the game has just about exhausted the movie’s iconic set-pieces and is forced to lean more on its own invention, much though this runs against its core conviction that imitation trumps originality. Perhaps that conviction was justified after all: the results aren’t especially inspiring. What we see are mostly generic sewers, combined with characters who wouldn’t play well in the dodgiest sitcom. The pair of bickering conjoined twins — one smart and urbane, the other crude and rude — is particularly cringe-worthy.

Writers and other artists often talk about the need to “kill your darlings”: to cut out those scenes and phrases and bits and bobs that don’t serve the art, that only serve to gratify the vanity of the artist. This game is full of little darlings that should have died well before it saw release. Some of them are flat-out strange. For example, if you like, you can pre-pick a personality for McCoy: Polite, Normal, (don’t call me) Surly, or Erratic. Doing so removes the conversation menu from the interface; walk up to someone and click on her, and McCoy just goes off on his own tangent. I don’t know why anyone would ever choose to do this, unless it be to enjoy the coprolalia of Erratic McCoy, who jumps from Sheriff Andy Taylor to Dirty Harry and back again at a whipsaw pace, leaving everyone on the scene flummoxed.

Even when he’s ostensibly under your complete control, Detective McCoy isn’t the nimblest cowboy at the intellectual rodeo. Much of the back half of the game degenerates into trying to figure out how and when to intervene to keep him from doing something colossally stupid. When a mobster you’ve almost nailed hands him a drink, you’re reduced to begging him silently: Please, please, do not drink it, McCoy! And of course he does so, and of course it’s yet another Game Over. (After watching the poor trusting schmuck screw up this way several times, you might finally figure out that you have about a two-second window of control to make him draw his gun on the other guy — no other action will do — before he scarfs down the spiked cocktail.)

Bottoms up! (…sigh…)

All my other complaints aside, though, for me this game’s worst failing remains its complete disinterest in standing on its own as either a piece of fiction or as an aesthetic statement of any stripe. There’s an embarrassingly mawkish, subservient quality that dogs it even as it’s constantly trying to be all cool and foreboding, with its darkness and its smoke. Its brand of devotion is an aspect of fan culture that I just don’t get.

So, I’m left sitting here contemplating an argument that I don’t think I’ve ever had to make before in the context of game development: that you can actually love something too much to be able to make a good game out of it, that your fandom can blind you as surely as the trees of any forest. This game is doomed, seemingly by design, to play a distant second fiddle to its parent. You can almost hear the chants of “We’re not worthy!” in the background. When you visit Tyrell in his office, you know it can have no real consequences for your story because the resolution of that tycoon’s fate has been reserved for the cinematic story that stars Deckard; ditto your interactions with Rachael and Gaff and others. They exist here at all, one can’t help but sense, only because the developers were so excited at the prospect of having real live Blade Runner actors visit them in their studio that they just couldn’t help themselves. (“We’re not worthy!”) For the player who doesn’t live and breathe the lore of Blade Runner like the developers do, they’re living non sequiturs who have nothing to do with anything else that’s going on.

Even the endings here — there are about half a dozen major branches, not counting the ones where McCoy gets shot or stabbed or roofied midway through the proceedings — are sometimes in-jokes for the fans. One of them is a callback to the much-loathed original ending of the film — a callback that finds a way to be in much worse taste than its inspiration: McCoy can run away with one of his suspects, who happens to be a fourteen-year-old girl who’s already been the victim of adult molestation. Eww!

What part of “fourteen years old and already sexually traumatized” do you not understand, McCoy?

Heck, even the options menu of this game has an in-joke that only fans will get. If you like, you can activate a “designer cut” here that eliminates all of McCoy’s explanatory voice-overs, a callback to the way that Ridley Scott’s director’s cut did away with the ones in the film. The only problem is that in this medium those voice-overs are essential for you to have any clue whatsoever what’s going on. Oh, well… the Blade Runner fans have been served, which is apparently the important thing.

I want to state clearly here that my objections to this game aren’t abstract objections to writing for licensed worlds or otherwise building upon the creativity of others. It’s possible to do great work in such conditions; the article I published just before this one praised The Curse of Monkey Island to the skies for its wit and whimsy, despite that game making absolutely no effort to bust out of the framework set up by The Secret of Monkey Island. In fact, The Curse of Monkey Island too is bursting at the seams with in-jokes and fan service. But it shows how to do those things right: by weaving them into a broader whole such that they’re a bonus for the people who get them but never distract from the experience of the people who don’t. That game illustrates wonderfully how one can simultaneously delight hardcore fans of a property and welcome newcomers into the fold, how a game can be both a sequel and fully-realized in an Aristotelian sense. I’m afraid that this game is an equally definitive illustration of how to do fan service badly, such that it comes across as simultaneously elitist and creatively bankrupt.

Westwood always prided themselves on their technical excellence, and this is indeed a technically impressive game in many respects. But impressive technology is worth little on its own. If you’re a rabid fan of the movie in the way that I am not, I suppose you might be excited to live inside it here and see all those iconic sets from slightly different angles. If you aren’t, though, it’s hard to know what this game is good for. In its case, I think that the first critical consensus had it just about right.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: The book Future Noir: The Making of Blade Runner by Paul M. Sammon; Computer and Video Games of January 1998; PC Zone of May 1999; Next Generation of July 1997; Computer Gaming World of March 1998; Wall Street Journal of January 21, 1998; New Yorker of July 1982; Retro Gamer 142.

Online sources include Ars Technica’s interview with Louis Castle, Game Developer’s interview with Castle, Edge’s feature on the making of the game, the original Siskel and Ebert review of the movie, an unsourced but apparently authentic interview with Philip K. Dick, and GameSpot’s vintage Blade Runner review.

Blade Runner is available for digital purchase at GOG.com, in both its original edition that I played for this article and the poorly received enhanced edition. Note that the latter actually includes the original game as well as of this writing, and is often cheaper than buying the original alone…

Footnotes
1 Louis Castle has often claimed in later decades that Blade Runner did well commercially, stating at least once that it sold 1 million copies(!). I can’t see how this could possibly have been the case; I’ve learned pretty well over my years of researching these histories what a million-selling game looked like in the 1990s, and can say very confidently that it did not look like this one. Having said that, though, let me also say that I don’t blame him for inflating the figures. It’s not easy to pour your heart and soul into something and not have it do well. So, as the press of real data and events fades into the past, the numbers start to go up. This doesn’t make Castle dishonest so much as it just makes him human.
2 This chart in general is distorted greatly by the factor of novelty; many or most of the highest-ranking games are very recent ones, rated in the first blush of excitement following their release.
3 Louis Castle’s own testimony contradicts this notion as well. He has stated in various interviews that “Blade Runner is as close as I have ever come to realizing a design document verbatim.” I don’t wish to discount his words out of hand, but boy, does this game ever strike me, based on pretty long experience in studying these things, as being full of phantom limbs that never got fully wired into the greater whole. I decided in the end that I had to call it like I see it in this article.
 


The Curse of Monkey Island

Fair Warning: this article contains plot spoilers for Monkey Island 2: LeChuck’s Revenge and The Curse of Monkey Island. No puzzle spoilers, however…

The ending of 1991’s Monkey Island 2: LeChuck’s Revenge seems as shockingly definitive in its finality as that of the infamous last episode of the classic television series St. Elsewhere. Just as the lovable wannabe pirate Guybrush Threepwood is about to finally dispatch his arch-nemesis, the zombie pirate LeChuck, the latter tears off his mask to reveal that he is in reality Guybrush’s older brother, looking a trifle peeved but hardly evil or undead. Guybrush, it seems, is just an ordinary suburban kid who has wandered away from his family to play make-believe inside a storage room at Big Whoop Amusement Park, LeChuck the family member who has been dispatched to find him. An irate janitor appears on the scene: “Hey, kids! You’re not supposed to be in here!” And so the brothers make their way out to rejoin their worried parents, and another set of Middle American lives goes on.

Or do they? If you sit through the entirety of the end credits, you will eventually see a short scene featuring the fetching and spirited Elaine, Guybrush’s stalwart ally and more equivocal love interest, looking rather confused back in the good old piratey Caribbean. “I wonder what’s keeping Guybrush?” she muses. “I hope LeChuck hasn’t cast some horrible SPELL over him or anything.” Clearly, someone at LucasArts anticipated that a day might just come when they would want to make a third game.

Nevertheless, for a long time, LucasArts really did seem disposed to let the shocking ending stand. Ron Gilbert, the series’s creator, soon left the company to found Humongous Entertainment, where he would use the SCUMM graphic-adventure engine he had helped to invent to make educational games for youngsters, even as LucasArts would continue to evolve the same technology to make more adventure games of their own. None of them, however, was called Monkey Island for the next four years, not even after the first two games to bear that name became icons of their genre.

Still, it is a law of the games industry that sequels to hit games will out, sooner or later and one way or another. In late 1995, LucasArts’s management decided to make a third Monkey Island at last. Why they chose to do so at this particular juncture isn’t entirely clear. Perhaps they could already sense an incipient softening of the adventure market — a downturn that would become all too obvious over the next eighteen months or so — and wanted the security of such an established name as this one if they were to invest big bucks in another adventure project. Or perhaps they just thought they had waited long enough.

Larry Ahern and Jonathan Ackley.

Whatever their reasoning in beginning the project, they chose for the gnarly task of succeeding Ron Gilbert an in-house artist and a programmer, a pair of good friends who had been employed at LucasArts for years and were itching to move into a design role. Larry Ahern had been hired to help draw Monkey Island 2 and had gone on to work on most of LucasArts’s adventure games since, while Jonathan Ackley had programmed large parts of Day of the Tentacle and The Dig. Knowing of their design aspirations, management came to them one day to ask if they’d like to become co-leads on a prospective Monkey Island 3. It was an extraordinary amount of faith to place in such unproven hands, but it would turn out not to have been misplaced.

“We were too green to suggest anything else [than Monkey Island 3], especially an original concept,” admits Ahern, “and were too dumb to worry about all the responsibility of updating a classic game series.” He and Ackley brainstormed together in a room for two months, hashing out the shape of a game. After they emerged early in 1996 with their design bible for The Curse of Monkey Island in hand, production got underway in earnest.

At the end of Monkey Island 2, Ahern and Ackley announced, Guybrush had indeed been “hexed” by LeChuck into believing he was just a little boy in an amusement park. By the beginning of the third game, he would have snapped back to his senses, abandoning mundane hallucination again for a fantastical piratey reality.

A team that peaked at 50 people labored over The Curse of Monkey Island for eighteen months. That period was one of dramatic change in the industry, when phrases like “multimedia” and “interactive movie” were consigned to the kitschy past and first-person shooters and real-time strategies came to dominate the sales charts. Having committed to the project, LucasArts felt they had no choice but to stick with the old-school pixel art that had always marked their adventure games, even though it too was fast becoming passé in this newly 3D world. By way of compensation, this latest LucasArts pixel art was to be more luscious than anything that had come out of the studio before, taking advantage of a revamped SCUMM engine that ran at a resolution of 640 × 480 instead of 320 × 200.

The end result is, in the opinion of this critic at least, the loveliest single game in terms of pure visuals that LucasArts ever produced. Computer graphics and animation, at LucasArts and elsewhere, had advanced enormously between Monkey Island 2 and The Curse of Monkey Island. With 1993’s Day of the Tentacle and Sam and Max Hit the Road, LucasArts’s animators had begun producing work that could withstand comparison to that of role models like Chuck Jones and Don Bluth without being laughed out of the room. (Indeed, Jones reportedly tried to hire Larry Ahern and some of his colleagues away from LucasArts after seeing Day of the Tentacle.) The Curse of Monkey Island marked the fruition of that process, showing LucasArts to have become a world-class animation studio in its own right, one that could not just withstand but welcome comparison with any and all peers who worked with more traditional, linear forms of media. “We were looking at Disney feature animation as our quality bar,” says Ahern.

That said, the challenge of producing a game that still looked like Monkey Island despite all the new technical affordances should not be underestimated. The danger of the increased resolution was always that the finished results could veer into a sort of photo-realism, losing the ramshackle charm that had always been such a big part of Monkey Island’s appeal. This LucasArts managed to avoid; in the words of The Animation World Network, a trade publication that was impressed enough by the project to come out and do a feature on it, Guybrush was drawn as “a pencil-necked beanpole with a flounce of eighteenth-century hair and a nose as vertical as the face of Half Dome.” The gangling frames and exaggerated movements of Guybrush and many of the other characters were inspired by Tim Burton’s The Nightmare Before Christmas. Yet the characters aren’t grotesques; The Curse of Monkey Island aims to be lovable, and it hits the mark. For this game is written as well as drawn in the spirit of the original Secret of Monkey Island, abandoning the jarring mean-spiritedness that dogs the second game in the series and that has always left me a lot less positively disposed toward that game than most people seem to be.

This was the first Monkey Island game to feature voice acting from the outset, as telling a testament as any to the technological gulf that lies between the second and third entries in the series. The performances are superb — especially Guybrush, who sounds exactly like I want him to, all gawky innocence and dogged determination. (His voice actor Dominic Armato would return for every single Monkey Island game that followed, as well as circling back to give Guybrush a voice in the remastered versions of the first two games. I, for one, wouldn’t have it any other way.)

The opening sees Guybrush adrift on the open ocean in, of all forms of conveyance, a floating bumper car, for reasons that aren’t initially clear beyond the thematic connection to that amusement park at the end of Monkey Island 2. He floats smack-dab into the middle of a sea battle between LeChuck and Elaine; the former is trying to abduct the latter to make her his bride, while the latter is doing her level best to maintain her single status. Stuff happens, LeChuck seems to get blown up, and Guybrush and Elaine wind up on Plunder Island, a retirement community for aging pirates that’s incidentally also inhabited by El Pollo Diablo, the giant demon chicken. (“He’s hatching a diabolical scheme”; “He’s establishing a new pecking order”; “He’s going to buck buck buck the system”; “He’s crossing the road to freedom”; etc.) Guybrush proposes to Elaine using a diamond ring he stole from the hold of LeChuck’s ship, only to find that there’s a voodoo curse laid on it. Elaine gets turned into a solid-gold statue (d’oh!), which Guybrush leaves standing on the beach while he tries to figure out what to do about the situation. Sure enough, some opportunistic pirates — is there any other kind? — sail away with it. (Double d’oh!) Guybrush is left to scour Plunder Island for a ship, a crew, and a map that will let him follow them to Blood Island, where there is conveniently supposed to be another diamond ring that can reverse the curse.

The vicious chickens of Plunder Island. “Larry and I thought we were so clever when we came up with the idea of having a tropical island covered with feral chickens,” says Jonathan Ackley. “Then I took a vacation to the Hawaiian island of Kauai. It seems that when Kauai was hit by Hurricane Iniki, it blew open all the chicken coops. Everywhere I went on the island I was surrounded by feral chickens.”

From the shopping list of quest items to the plinking steelband soundtrack that undergirds the proceedings, all of this is a dead ringer for The Secret of Monkey Island; this third game is certainly not interested in breaking any new ground in setting, story, or genre. But when it’s done this well, who cares? There is a vocal segment of Monkey Island fans who reject this game on principle, who say that any Monkey Island game without the name of Ron Gilbert first on its credits list is no Monkey Island game at all. For my own part, I tend to believe that, if we didn’t know that Gilbert didn’t work on this game, we’d have trouble detecting that fact from the finished product. It nails that mixture of whimsy, cleverness, and sweetness that has made The Secret of Monkey Island arguably the most beloved point-and-click adventure game of all time.

During the latter 1990s, when most computer games were still made by and for a fairly homogeneous cohort of young men, too much ludic humor tried to get by on transgression rather than wit; this was a time of in-groups punching — usually punching down — on out-groups. I’m happy to say that The Curse of Monkey Island’s humor is nothing like that. At the very beginning, when Guybrush is floating in that bumper car, he scribbles in his journal about all the things he wishes he had. “If only I could have a small drink of freshwater, I might have the strength to sail on.” A bottle of water drifts past while Guybrush’s eyes are riveted to the page. “If I could reach land, I might find water and some food. Fruit maybe, something to fight off the scurvy and help me get my strength back. Maybe some bananas.” And a crate of bananas drifts by in the foreground. “Oh, why do I torture myself like this? I might as well wish for some chicken and a big mug of grog, for all the good it will do me.” Cue the clucking chicken perched on top of a barrel. Now, you might say that this isn’t exactly sophisticated humor, and you’d be right. But it’s an inclusive sort of joke that absolutely everyone is guaranteed to understand, from children to the elderly, whilst also being a gag that I defy anyone not to at least smirk at. Monkey Island is funny without ever being the slightest bit cruel — a combination that’s rarer in games of its era than it ought to be.

Which isn’t to say that this game is without in-jokes. They’re everywhere, and the things they reference are far from unexpected. Star Trek gets a shout-out in literally the first line of the script as Guybrush writes in his “captain’s log,” while, appropriately enough given the studio that made this game, whole chunks of dialogue are re-contextualized extracts from the Star Wars movies. The middle of the game is an extended riff on/parody of that other, very different franchise that springs to mind when gamers think about pirates — the one started by Sid Meier, known simply as Pirates!. Here as there, you have to sail your ship around the Caribbean engaging in battles with other sea dogs. But instead of dueling the opposing captains with your trusty cutlass when you board their vessels, here you challenge them to a round of insult sword-fighting instead. (Pirate: “You’re the ugliest monster ever created!” Guybrush: “If you don’t count all the ones you’ve dated!”)


One of the game’s best gags is an interactive musical number you perform with your piratey crew, feeding them appropriate rhymes. “As far as I know, nobody had ever done interactive singing before,” says Jonathan Ackley. “I think it was an original idea and I still laugh when I see it.” Sadly, the song was cut from the game’s foreign localizations as a bridge too far from its native English, even for LucasArts’s superb translators.

It shouldn’t work, but somehow it does. In fact, this may just be my favorite section of the entire game. Partly it succeeds because it’s just so well done; the action-based minigame of ship-to-ship combat that precedes each round of insult sword-fighting is, in marked contrast to those in LucasArts’s previous adventure Full Throttle, very playable in its own right, being perfectly pitched in difficulty, fun without ever becoming frustrating. But another key to this section’s success is that you don’t have to know Pirates! for it to make you laugh; it’s just that, if you do, you’ll laugh that little bit more. All of the in-jokes operate the same way.

Pirates! veterans will feel right at home with the ship-combat minigame. It was originally more complicated. “When I first started the ship-combat section,” says programmer Chris Purvis, “I had a little readout that told how many cannons you had, when they were ready to fire, and a damage printout for when you or the computer ships got hit. We decided it was too un-adventure-gamey to leave it that way.” Not to be outdone, a member of the testing team proposed implementing multiplayer ship combat as “the greatest Easter egg of all time for any game.” Needless to say, it didn’t happen.

The puzzle design makes for an interesting study. After 1993, the year of Day of the Tentacle and Sam and Max Hit the Road, LucasArts hit a bumpy patch in this department in my opinion. Both Full Throttle and The Dig, their only adventures between those games and this one, are badly flawed efforts when it comes to their puzzles, adhering to the letter but not the spirit of Ron Gilbert’s old “Why Adventure Games Suck” manifesto. And Grim Fandango, the adventure that immediately followed this one, fares if anything even worse in that regard. I’m pleased if somewhat perplexed to be able to say, then, that The Curse of Monkey Island mostly gets its puzzles right.

There are two difficulty levels here, an innovation borrowed from Monkey Island 2. Although the puzzles at the “Mega-Monkey” level are pretty darn convoluted — one sequence involving a restaurant and a pirate’s tooth springs especially to mind as having one more layer of complexity than it really needs to — they are never completely beyond the pale. It might not be a totally crazy idea to play The Curse of Monkey Island twice, once at the easy level and once at the Mega-Monkey level, with a few weeks or months in between your playthroughs. There are very few adventure games for which I would make such a recommendation in our current era of entertainment saturation, but I think it’s a reasonable one in this case. This game is stuffed so full of jokes both overt and subtle that it can be hard to take the whole thing in in just one pass. Your first excursion will give you the lay of the land, so to speak, so you know roughly what you’re trying to accomplish when you tackle the more complicated version.

Regardless of how you approach it, The Curse of Monkey Island is a big, generous adventure game by any standard. I daresay that the part that takes place on Plunder Island alone is just about as long as the entirety of The Secret of Monkey Island. Next comes the Pirates! homage, to serve as a nice change of pace at the perfect time. And then there’s another whole island of almost equal size to the first to explore. After all that comes the bravura climax, where LeChuck makes his inevitable return; in a rather cheeky move, this ending too takes place in an amusement park, with Guybrush once again transformed into a child.

If I was determined to find something to complain about, I might say that the back half of The Curse of Monkey Island isn’t quite as strong as the front half. Blood Island is implemented a little more sparsely than Plunder Island, and the big climax in particular feels a little rushed and truncated, doubtless the result of a production budget and schedule that just couldn’t be stretched any further if the game was to ship in time for the 1997 Christmas season. Still, these are venial sins; commercial game development is always the art of the possible, usually at the expense of the ideal.

When all is said and done, The Curse of Monkey Island might just be my favorite LucasArts adventure, although it faces some stiff competition from The Secret of Monkey Island and Day of the Tentacle. Any points that it loses to Secret for its lack of originality in the broad strokes, it makes up for in size, in variety, and in sheer gorgeousness.

Although I have no firm sales figures to point to, all indications are that The Curse of Monkey Island was a commercial success in its day, the last LucasArts adventure about which that statement can be made. I would guess from anecdotal evidence that it sold several hundred thousand copies, enough to convince the company to go back to the Monkey Island well one more time in 2000. Alas, the fourth game would be far less successful, both artistically and commercially.

These things alone are enough to give Curse a valedictory quality today. But there’s more: it was also the very last LucasArts game to use the SCUMM engine, as well as the last to rely primarily on pixel art. The world-class cartoon-animation studio that the company’s adventure division had become was wound down after this game’s release, and Larry Ahern and Jonathan Ackley were never given a chance to lead a project such as this one again, despite having acquitted themselves so well here. That was regrettable, but not incomprehensible. Economics weren’t working in the adventure genre’s favor in the late 1990s. A game like The Curse of Monkey Island was more expensive to make per hour of play time it provided than any other kind of game you could imagine; all of this game’s content was bespoke content, every interaction a unique one that had to be written and story-boarded and drawn and painted and animated and voiced from scratch.

The only way that adventure games — at least adventures with AAA production values like this one — could have remained an appealing option for gaming executives would have been if they had sold in truly massive numbers. And this they emphatically were not doing. Yes, The Curse of Monkey Island did reasonably well for itself — but a game like Jedi Knight probably did close to an order of magnitude better, whilst probably costing considerably less to make. The business logic wasn’t overly complicated. The big animation studios which LucasArts liked to see as their peers could get away with it because their potential market was everyone with a television or everyone who could afford to buy a $5 movie ticket; LucasArts, on the other hand, was limited to those people who owned fairly capable, modern home computers, who liked to solve crazily convoluted puzzles, and who were willing and able to drop $40 or $50 for ten hours or so of entertainment. The numbers just didn’t add up.

In a sense, then, the surprise isn’t that LucasArts made no more games like this one, but rather that they allowed this game to be finished at all. Jonathan Ackley recalls his reaction when he saw Half-Life for the first time: “Well… that’s kind of it for adventure games as a mainstream, AAA genre.” More to their credit than otherwise, the executives at LucasArts didn’t summarily abandon the adventure genre, but rather tried their darnedest to find a way to make the economics work, by embracing 3D modelling to reduce production costs and deploying a new interface that would be a more natural fit with the tens of millions of game consoles that were out there, thus broadening their potential customer base enormously. We’ll get to the noble if flawed efforts that resulted from these initiatives in due course.

For today, though, we raise our mugs of grog to The Curse of Monkey Island, the last and perhaps the best go-round for SCUMM. If you haven’t played it yet, by all means, give it a shot. And even if you have, remember what I told you earlier: this is a game that can easily bear replaying. Its wit, sweetness, and beauty remain undiminished more than a quarter of a century after its conception.


The Curse of Monkey Island: The Graphic Novel

(I’ve cheerfully stolen this progression from the old Prima strategy guide to the game…)

Our story begins with our hero, Guybrush Threepwood, lost at sea and pining for his love Elaine.

He soon discovers her in the midst of a pitched battle…

…with his old enemy and rival for her fair hand, the zombie pirate LeChuck.

Guybrush is captured by LeChuck…

…but manages to escape, sending LeChuck’s ship to the bottom in the process. Thinking LeChuck finally disposed of, Guybrush proposes to Elaine, using a diamond ring he found in the zombie pirate’s treasure hold…

…only to discover it is cursed. Elaine is less than pleased…

…and is even more ticked off when she is turned into a gold statue.

Guybrush sets off to discover a way to break the curse — and to rescue Elaine, since her statue is promptly stolen. His old friend the voodoo lady tells him he will need a ship, a crew, and a map to Blood Island, where he can find a second diamond ring that will reverse the evil magic of the first.

He meets many interesting and irritating people, including some barbers…

…a restaurateur…

…and a cabana boy, before he is finally able to set sail for Blood Island.

After some harrowing sea battles and a fierce storm…

…his ship is washed ashore on Blood Island.

Meanwhile LeChuck has been revived…

…and has commanded his minions to scour the Caribbean in search of Guybrush.

Unaware of this, Guybrush explores Blood Island, where he meets a patrician bartender…

…the ghost of a Southern belle…

…a vegan cannibal…

…and a Welsh ferryman.

He finally outsmarts Andre, King of the Smugglers, to get the diamond that will restore Elaine.

Unfortunately, as soon as Elaine is uncursed, the two are captured by LeChuck and taken to the Carnival of the Damned on Monkey Island.

LeChuck turns Guybrush into a little boy and attempts to escape with Elaine on his hellish roller coaster.

But Guybrush’s quick thinking saves the day, and he sails off with his new bride into the sunset.



Sources: The book The Curse of Monkey Island: The Official Strategy Guide by Jo Ashburn; Retro Gamer 70; Computer Games Strategy Plus of August 1997; Computer Gaming World of October 1995, March 1996, September 1997, November 1997, December 1997, and March 1998.

Online sources include a Genesis Temple interview with Larry Ahern, an International House of Mojo interview with Jonathan Ackley and Larry Ahern, the same site’s archive of old Curse of Monkey Island interviews, and a contemporaneous Animation World Network profile of LucasArts.

Also, my heartfelt thanks to Guillermo Crespi and other commenters for pointing out some things about the ending of Monkey Island 2 that I totally overlooked in my research for the first version of this article.

The Curse of Monkey Island is available for digital purchase at GOG.com.



Jedi Knight (Plus, Notes on an Expanded Universe)

The years from 1991 to 1998 were special ones in which to be a Star Wars fan. For during these years, more so than during any other time in the franchise’s existence, Star Wars truly belonged to its fans.

The period just before this one is sometimes called the “Dark Period” or the “Dark Ages” by the fans of today. After 1983’s Return of the Jedi, that concluding installment in the original trilogy of films, George Lucas, Star Wars’s sometimes cantankerous creator, insisted that he was done with his most beloved creation. A few underwhelming television productions aside, he stayed true to his word in the years that followed, whilst also refusing anyone else the right to play in his playground; even Kenner Toys was denied its request to invent some new characters and vehicles with which to freshen up the action-figure line. So, Star Wars gradually faded from the mass-media consciousness, much like the first generation of videogames that so infamously crashed the same year Return of the Jedi dropped. But no Nintendo came along to revive Star Wars’s fortunes, for the simple reason that Lucas refused to allow it. The action figures that had revolutionized the toy industry gathered dust and then slowly disappeared from store shelves, to be replaced by cynical adjuncts to Saturday-morning cartoons: Transformers, He-Man, G.I. Joe. (Or, perhaps better said, the television shows were adjuncts to the action figures: the old scoffer’s claim that Star Wars had been created strictly to sell toys was actually true in their case.)

The biggest Star Wars project of this period wasn’t any traditional piece of media but rather a theme-park attraction. In a foreshadowing of the franchise’s still-distant future, Disneyland in January of 1987 opened its Star Wars ride, whose final price tag was almost exactly the same as that of the last film. Yet even at that price, something felt vaguely low-rent about it: the ride had been conceived under the banner of The Black Hole, one of the spate of cinematic Star Wars clones from the films’ first blush of popularity, then rebranded when Disney managed to acquire a license for The Black Hole’s inspiration. The ride fit in disarmingly well at a theme park whose guiding ethic was nostalgia for a vanished American past of Main Streets and picket fences. Rather than remaining a living property, Star Wars was being consigned to the same realm of kitschy nostalgia. In these dying days of the Cold War, the name was now heard most commonly as shorthand for President Ronald Reagan’s misconceived, logistically unsustainable idea for a defensive umbrella that would make the United States impervious to Soviet nuclear strikes.

George Lucas’s refusal to make more Star Wars feature films left Lucasfilm, the sprawling House That Star Wars Built, in an awkward situation. To be sure, there were still the Indiana Jones films, but those had at least as much to do with the far more prolific cinematic imagination of Steven Spielberg as they did with Lucas himself. When Lucas tried to strike out in new directions on his own, the results were not terribly impressive. Lucasfilm became as much a technology incubator as a film-production studio, spawning the likes of Pixar, that pioneer of computer-generated 3D animation, and Lucasfilm Games (later LucasArts), an in-house games studio which for many years wasn’t allowed to make Star Wars games. The long-running Star Wars comic book, which is credited with saving Marvel Comics from bankruptcy in the late 1970s, published its final issue in May of 1986; the official Star Wars fan club sent out its last newsletter in February of 1987. At this point, what was there left to write about? It seemed that Star Wars was dead and already more than half buried. But, as the cliché says, the night is often darkest just before the dawn.

The seeds of a revival were planted the very same year that the Star Wars fan club closed up shop, when West End Games published Star Wars: The Roleplaying Game, a tabletop RPG. Perhaps because it addressed such a niche segment of the overall entertainment marketplace, it was allowed more freedom to expand upon the extant universe of Star Wars than anything that had come before from anyone not named George Lucas. Although its overall commercial profile would indeed remain small in comparison to the blockbuster films and toys, it set a precedent for what was to come.

In the fall of 1988, Lou Aronica, head of Bantam Books’s science-fiction imprint Spectra, sent a proposal to Lucas for a series of new novels set in the Star Wars universe. This was by no means an entirely original idea in the broad strokes. The very first Star Wars novel, Alan Dean Foster’s Splinter of the Mind’s Eye, had appeared just nine months after the first film, having been born as a script treatment for a potential quickie low-budget sequel if the movie should prove modestly but not extremely successful. After it, a handful of additional paperbacks starring Han Solo and Lando Calrissian had been published. But Aronica envisioned something bigger than those early coattail-riders, a series of true “event” novels. “We can’t do these casually,” he wrote to Lucas. “They have to be as ambitious as the movies were. This body of work is too important to popular culture to end with these three movies.”

He knew it was a shot in the dark. Thus he was disappointed but not overly surprised when he heard nothing back for months; many an earlier proposal for doing something new with Star Wars had fallen on similarly deaf ears. Then, out of the blue, he received a grudging letter expressing interest. “No one is going to buy these,” Lucas groused — but if Bantam Books wanted to throw its money away, Lucasfilm would deign to accept a licensing royalty, predicated on a number of conditions. The most significant of these were that the books could take place between, during, or after the movies but not before; that they would be labeled as artifacts of an “Expanded Universe” which George Lucas could feel free to contradict at any time, if he should ever wish to return to Star Wars himself; and that Lucas and his lieutenants at Lucasfilm would be able to request whatever changes they liked in the manuscripts — or reject them completely — prior to their publication. All of that sounded fine to Lou Aronica.

So, Heir to the Empire, the first of a trilogy of novels telling what happened immediately after Return of the Jedi, was published on May 1, 1991. Its author was Timothy Zahn, an up-and-coming writer whose short stories had been nominated for Hugo awards four times, winning once. Zahn was symbolic of the new group of creators who would be allowed to take the reins of Star Wars for the next seven years. For unlike the workaday writers who had crafted those earlier Star Wars novels to specifications, Zahn was a true-blue fan of the movies, a member of the generation who had first seen them as children or adolescents — Zahn was fifteen when the first film arrived in theaters — and literally had the trajectory of their lives altered by the encounter. Despite the Bantam Spectra imprint on its spine, in other words, Heir to the Empire was a form of fan fiction.

Heir to the Empire helped the cause immensely by being better than anyone might have expected. Even the sniffy mainstream reviewers who took it on had to admit that it did what it set out to do pretty darn effectively. Drawing heavily on the published lore of Star Wars: The Roleplaying Game as well as his own imagination, Zahn found a way to make his novel feel like Star Wars without lapsing into rote regurgitation of George Lucas’s tropes and plot lines. Grand Admiral Thrawn, his replacement for Darth Vader in the role of chief villain, was at least as interesting a character as his predecessor, whilst being interesting in totally different ways. Through him, Zahn was able to articulate an ethical code for the Empire that went beyond being evil and oppressive for the sake of it: a philosophy of political economy by no means unknown to some of the authoritarian nations of our own world, hinging on the belief that too much personal freedom leads only to anarchy and chaos and an endemic civic selfishness, making life worse for everyone. It’s a philosophy with which you can disagree — I certainly do, stridently — but it isn’t a thoughtless or even an entirely heartless one.

This is not to say that Heir to the Empire was some dry political dissertation; Zahn kept the action scenes coming, kept it fun, kept it Star Wars, striking a balance that George Lucas himself would later fail badly to establish in his own return to his science-fiction universe. The hardcover novel topped the New York Times bestseller chart, defying Lucas’s predictions of its failure, proving there was a ready market out there for new Star Wars product.

That said, very few of the Star Wars novels that would follow would match Heir to the Empire and its two sequels in terms of quality. With so much money waiting to be made, Lou Aronica’s vision for a carefully curated and edited series of event novels — perhaps one per year — fell by the wayside all too rapidly. Soon new novels were appearing monthly rather than yearly, alongside a rebooted comic book. Then they were coming even faster than that; 1997 alone saw a staggering 22 new Star Wars novels. And so the Expanded Universe fell victim to that bane of fan fictions everywhere, a lack of quality control. By the time Han Solo and Princess Leia had gotten married and produced three young Jedi of their own, who were all running around having adventures in their own intertwining series of books, it was reasonable to ask whether it was all becoming much, much too much. A drought had become an indiscriminate tsunami; a trilogy of action movies had turned into All My Children.

Even when it was no better than it ought to have been, however, there was a freewheeling joy to the early Expanded Universe which is poignant to look back upon from the perspective of these latter days of Star Wars, when everything about the franchise is meticulously managed from the top down. The Expanded Universe, by contrast, was a case of by the fans, for the fans. With new movies the stuff of dreams only, they painted every corner of the universe in vivid colors of their own. The Expanded Universe could be cheesy, but it was never cynical. One could argue that it felt more like Star Wars — the original Star Wars of simple summertime fun, the one that didn’t take itself so gosh-darn seriously — than anything that has appeared under the name since 1998.

By a happy accident, a contract between Lucasfilm and Kenner Toys, giving the latter a monopoly on Star Wars “toys and games,” was allowed to lapse the same year that Heir to the Empire appeared in bookstores. Thus LucasArts, Lucasfilm’s own games division, could get in on the Expanded Universe fun. What had been a bizarre dearth of Star Wars games during the 1980s turned into a 1990s deluge almost comparable to the one taking place in novels. LucasArts released a dozen or so Star Wars games in a broad range of gameplay genres between 1993 and 1998, drawing indiscriminately both from the original movies and from the new tropes and characters of the literary Expanded Universe. Like the books, these games weren’t always or even usually masterpieces, but their unconstrained sense of possibility makes them feel charmingly anomalous in comparison to the corporate-managed, risk-averse, Disneyfied Star Wars of today.

And then, too, LucasArts did produce two games that deserve to be ranked alongside Timothy Zahn’s first trilogy of Star Wars novels as genuine classics in their field. We’ve met one of these already in an earlier article: the “space simulator” TIE Fighter, whose plot had you flying and fighting for Zahn’s more philosophically coherent version of the Empire, with both Darth Vader and Admiral Thrawn featuring in prominent roles. The other, the first-person shooter Jedi Knight, will be our subject for today.


Among other things, Jedi Knight heralded a dawning era of improbably tortured names in games. Its official full name is Star Wars: Jedi Knight — Dark Forces II, a word salad that you can arrange however you like and still have it make just about the same amount of sense. It’s trying to tell us in its roundabout way that Jedi Knight is a sequel to Dark Forces, the first Star Wars-themed shooter released by LucasArts. Just as TIE Fighter and its slightly less refined space-simulator predecessor X-Wing were responses to the Wing Commander phenomenon, Jedi Knight and before it Dark Forces put a Star Wars spin on the first-person-shooter (FPS) craze that was inaugurated by DOOM. So, it’s with Dark Forces that any Jedi Knight story has to begin.

Dark Forces was born in the immediate aftermath of DOOM, when half or more of the studios in the games industry seemed suddenly to be working on a “DOOM clone,” as the nascent FPS genre was known before that acronym was invented. It was in fact one of the first of the breed to be finished, shipping already in February of 1995, barely a year after its inspiration. And yet it was also one of the few to not just match but clearly improve upon id Software’s DOOM engine. Whereas DOOM couldn’t handle sloping surfaces, didn’t even allow you to look up or down, LucasArts’s “Jedi” engine could play host to vertiginous environments full of perches and ledges and passages that snaked over and under as well as around one another.

Dark Forces stood out as well for its interest in storytelling, despite inhabiting a genre in which, according to a famous claim once advanced by id’s John Carmack, story was no more important than it was in a porn movie. This game’s plot could easily have been that of an Expanded Universe novel.

Dark Forces begins concurrently with the events of the first Star Wars movie. Its star is Kyle Katarn, a charming rogue of the Han Solo stripe, a mercenary who once worked for the Empire but is now peddling his services to the Rebel Alliance alongside his friend Jan Ors, a space jockey with a knack for swooping in in the nick of time to save him from the various predicaments he gets himself into. The two are hired to steal the blueprints of the Death Star, the same ones that will allow the Rebels to identify the massive battle station’s one vulnerability and destroy it in the film’s bravura climax. Once their role in the run-up to that event has been duly fulfilled, Kyle and Jan then go on to foil an Imperial plot to create a new legion of super soldiers known as Dark Troopers. (This whole plot line can be read as an extended inside joke about how remarkably incompetent the Empire’s everyday Stormtroopers are, throughout this game just as in the movies. If ever there was a gang who couldn’t shoot straight…)

Told through sparsely animated between-mission cut scenes, it’s not a great story by any means, but it serves its purpose of justifying the many changes of scenery and providing some motivation to traverse each succeeding level. Staying true to the Han Solo archetype, Kyle Katarn is even showing signs of developing a conscience by the time it’s over. All of which is to say that, in plot as in its audiovisual aesthetics, Dark Forces feels very much like Star Wars. It provided for its contemporary players an immersive rush that no novel could match; this and the other games of LucasArts were the only places where you could see new Star Wars content on a screen during the mid-1990s.

Unfortunately, Dark Forces is more of its time than timeless.[1] I concur with Wes Fenlon of PC Gamer, who wrote in a retrospective review in 2016 that “I spent more of my Dark Forces playthrough appreciating what it pulled off in 1995 than I did really having fun.” Coming so early in the lifespan of the FPS as it did, its controls are nonstandard and, from the perspective of the modern player at least, rather awkward, lacking even such niceties as mouse-look. In lieu of a save-anywhere system or even save checkpoints, it gives you a limited number of lives with which to complete each level, like one of the arcade quarter-eaters of yore.

Its worst issues, however, are connected to level design, which was still a bit of a black art at the time. It’s absurdly easy to get completely lost in its enormous levels, which have no obvious geographical through-line to follow, but are rather built around a tangled collection of lock-and-key puzzles that require lots and lots of criss-crossing and backtracking. Although there is an auto-map, there’s no easy way to project a three-dimensional space like these levels onto its two-dimensional plane; all those ladders and rising and falling passageways quickly turn into an incomprehensible mess on the map. Dark Forces is an ironic case of a game being undone by the very technological affordances that made it stand out; playing it, one gets the sense that the developers have rather outsmarted themselves. When I think back on it now, my main memory is of running around like a rat in a maze, circling back into the same areas again and again, trying to figure out what the hell the game wants me to do next.

Good luck making sense of this bowl of spaghetti…

Nevertheless, Dark Forces was very well-received in its day as the first game to not just copy DOOM‘s technology but to push it forward — and with a Star Wars twist at that. Just two complaints cut through the din of praise, neither of them having anything to do with the level design that so frustrated me. One was the lack of a multiplayer mode, an equivalent to DOOM‘s famed deathmatches. And the other was the fact that Dark Forces never let you fight with a lightsaber, rather giving the lie to the name of the Jedi engine that powered it. The game barely even mentioned Jedi and The Force and all the rest; like Han Solo, Kyle Katarn was strictly a blaster sort of guy at this juncture. LucasArts resolved to remedy both of these complaints in the sequel.


Jedi Knight actually straddles two trends in 1990s gaming, one of which has remained an evergreen staple of the hobby to this day, the other of which has long since been consigned to the realm of retro kitsch. The former is of course the FPS genre; the latter is the craze for “full-motion video,” the insertion of video clips featuring real human actors into games. This “interactive movie” fad was already fast becoming passé when Jedi Knight was released in October of 1997. It was one of the last relatively big-budget, mainstream releases to embrace it.

Having written about so many of these vintage FMV productions in recent years, I’ve developed an odd fascination with the people who starred in them. These were generally either recognizable faces with careers past their prime or, more commonly, fresh-faced strivers looking for their big break, the sort of aspirants who have been waiting tables and dressing up in superhero costumes for the tourists strolling the Hollywood Walk of Fame for time immemorial, waiting for that call from their agent that means their ship has finally come in. Needless to say, for the vast majority of the strivers, a role in a CD-ROM game was as close as they ever came to stardom. Most of them gave up their acting dream at some point, went back home, and embarked on some more sensible career. I don’t see their histories as tragic at all; they rather speak to me of the infinite adaptability of our species, our adroitness at getting on with a Plan B when Plan A doesn’t work out, leaving us only with some amusing stories to share at dinner parties. Such stories certainly aren’t nothing. For what are any of our lives in the end but the sum total of the stories we can share, the experiences we’ve accumulated? All that stuff about “if you can dream it, you can do it” is nonsense; success in any field depends on circumstance and happenstance as much as effort or desire. Nonetheless, “it’s better to try and fail than never to try at all” is a cliché I can get behind.

But I digress. In Jedi Knight, Kyle Katarn is played by a fellow named Jason Court, whose résumé at the time consisted of a few minor television guest appearances, and who would “retire” from acting by the end of the decade to become a Napa Valley winemaker. Court isn’t terrible here — a little wooden perhaps, but who wouldn’t be in a situation like this, acting on an empty sound stage whose background will later be painted in on the computer, intoning a script like this one?

Kyle Katarn, right, with his sidekick Jan Ors. It was surely no accident that Jason Court bears a passing resemblance to Mark Hamill — who was ironically himself starring in the Wing Commander games at this time.

Ah, yes… the script. Do you remember me telling you how Timothy Zahn’s early Star Wars novels succeeded by not slavishly echoing the tropes and character beats from the films? Well, this script is the opposite of that. The first words out of any character’s mouth are those of a Light Jedi promising a Dark Jedi that “striking me down” will have unforeseen consequences, just as Obi-Wan Kenobi once said to Darth Vader. What follows is a series of reenactments of beats and entire scenes from the movies in slightly altered contexts, on a budget of about one percent the size. Kyle Katarn, now yanked out of Han Solo’s shoes and thrust into those of Luke Skywalker, turns out to have grown up on a planet bizarrely similar to Tatooine and to have some serious daddy issues to go along with an inherited lightsaber and undreamt-of potential in The Force. The word “derivative” hardly begins to convey the scale of this game’s debt to its cinematic betters.

For all that, though, it’s hard to really hate the cut scenes. Their saving grace is that of the Expanded Universe as a whole (into whose welcoming canon Kyle Katarn was duly written, appearing in the comics, the novels, even as an action figure of his own): the lack of cynicism, the sense that everything being done is being done out of love even when it’s being done badly. When the Jedi ignited their lightsabers during the opening cut scene, it was the first time that distinctive swoosh and buzz had been seen and heard since Return of the Jedi. Even in our jaded present age, we can still sense the makers’ excitement at being allowed to do this, can imagine the audience’s excitement at being witness to it. There are worse things in this world than a community-theater re-creation of Star Wars.

The cut scenes are weirdly divorced from everything else in Jedi Knight. Many FMV productions have this same disjointed quality to them, a sense that the movie clips we watch and the game we play have little to do with one another. Yet seldom is that sense of a right hand that doesn’t know what the left is doing more pronounced than here. The Kyle of the video clips doesn’t even look like the Kyle of the rest of the game; the former has a beard, the latter does not. The divide is made that much more jarring by the aesthetic masterfulness of the game whenever the actors aren’t onscreen. Beginning with that iconic three-dimensional text crawl and John Williams’s equally iconic score, this game looks, sounds, and plays like an interactive Star Wars movie — whenever, that is, it’s not literally trying to be a Star Wars movie.

Certainly the environments you explore here are pure Star Wars. The action starts in a bar that looks like the Mos Eisley cantina, then sends you scampering off through one of those sprawling indoor complexes that seem to be everywhere in the Star Wars universe, all huge halls with improbably high ceilings and miles of corridors and air shafts connecting them, full of yawning gaps and precarious lifts, gun-metal grays and glittering blacks. Later, you’ll visit the streets and rooftops of a desert town with a vaguely Middle Eastern feel, the halls and courts of a fascistic palace lifted straight out of Triumph of the Will, the crawl-ways and garbage bins of a rattletrap spaceship… all very, very Star Wars, all pulsing with that unmistakable Star Wars soundtrack.

Just as Dark Forces was a direct response to DOOM, in technological terms Jedi Knight was LucasArts’s reply to id’s Quake, which was released about fifteen months before it. DOOM and Dark Forces are what is sometimes called “2.5D games” — superficially 3D, but relying on a lot of cheats and shortcuts, such as pre-rendered sprites standing in for properly 3D-modelled characters and monsters in the world. The Quake engine and the “Sith” engine that powers Jedi Knight are, by contrast, 3D-rendered from top to bottom, taking advantage not only of the faster processors and more expansive memories of the computers of their era but also of the new hardware-accelerated 3D graphics cards. Not only do they look better for it, but they play better as well; the vertical dimension which LucasArts so consistently emphasized benefits especially. There’s a lot of death-defying leaping and controlled falling in Jedi Knight, just as in Dark Forces, but it feels more natural and satisfying here. Indeed, Jedi Knight in general feels so much more modern than Dark Forces that it’s hard to believe the two games were separated in time by only two and a half years. Gone, for example, are the arcade-like limited lives of Dark Forces, replaced by the ability to save wherever you want whenever you want, a godsend for working adults like yours truly whose bedtime won’t wait for them to finish a level.

If you ask me, though, the area where Jedi Knight improves most upon its predecessor has nothing to do with algorithms or resolutions or frame rates, nor even convenience features like the save system. More than anything, it’s the level design here that is just so, so much better. Jedi Knight’s levels are as enormous as ever, whilst being if anything even more vertiginous than the ones of Dark Forces. And yet they manage to be far less confusing, having the intuitive through-line that the levels of Dark Forces lacked. Very rarely was I more than momentarily stumped about where to go next in Jedi Knight; in Dark Forces, on the other hand, I was confused more or less constantly.

Maybe I should clarify something at this point: when I play an FPS or a Star Wars game, and especially when I play a Star Wars FPS, I’m not looking to labor too hard for my fun. I want a romp; “Easy” mode suits me just fine. You know how in the movies, when Luke and Leia and the gang are running around getting shot at by all those Stormtroopers who can’t seem to hit the broadside of a barn, things just kind of work out for them? A bridge conveniently collapses just after they run across, a rope is hanging conveniently to hand just when they need it, etc. Well, this game does that for you. You go charging through the maelstrom, laser blasts ricocheting every which way, and, lo and behold, there’s the elevator platform you need to climb onto to get away, the closing door you need to dive under, the maintenance tunnel you need to leap into. It’s frantic and nerve-wracking and then suddenly awesome, over and over and over again. It’s incredibly hard in any creative field, whether it happens to be writing or action-game level design, to make the final product feel effortless. In fact, I can promise you that, the more effortless something feels, the more hard work went into it to make it feel that way. My kudos, then, to project leader Justin Chin and the many other hands who contributed to Jedi Knight, for being willing to put in the long, hard hours to make it look easy.

Of those two pieces of fan service that were deemed essential in this sequel — a multiplayer mode and lightsabers — I can only speak of the second from direct experience. By their own admission, the developers struggled for some time to find a way of implementing lightsabers in a way that felt both authentic and playable. In the end, they opted to zoom back to a Tomb Raider-like third-person, behind-the-back perspective whenever you pull out your trusty laser sword. This approach generated some controversy, first within LucasArts and later among FPS purists in the general public, but it works pretty well in my opinion. Still, I must admit that when I played the game I stuck mostly with guns and other ranged weapons, which run the gamut from blasters to grenades, bazookas to Chewbacca’s crossbow.

The exceptions — the places where I had no choice but to swing a lightsaber — were the one-on-one duels with other Jedi. These serve as the game’s bosses, coming along every few levels until the climax arrives in the form of a meeting with the ultimate bad guy, the Dark Jedi Jerec whom you’ve been in a race with all along to locate the game’s McGuffin, a mysterious Valley of the Jedi. (Don’t ask; it’s really not worth worrying about.) Like everything else here, these duels feel very, very Star Wars, complete with lots of villainous speechifying beforehand and lots of testing of Kyle’s willpower: “Give in to the Dark Side, Kyle! Use your hatred!” You know the drill. I enjoyed their derivative enthusiasm just as much as I enjoyed the rest of the game.

A Jedi duel in the offing.

Almost more interesting than the lightsabers, however, is the decision to implement other types of Force powers, and with them a morality tracker that sees you veering toward either the Dark or the Light Side of the Force as you play. If you go Dark by endangering or indiscriminately killing civilians and showing no mercy to your enemies, you gradually gain access to Force powers that let you deal out impressive amounts of damage without having to lay your hand on a physical weapon. If you go Light by protecting the innocent and sparing your defeated foes, your talents veer more toward the protective and healing arts — which, given the staggering amounts of firepower at your disposal in conventional-weapon form, is probably more useful in the long run. Regardless of which path you go down, you’ll learn to pull guns right out of your enemies’ hands from a distance and to “Force Jump” across gaps you could never otherwise hope to clear. Doing so feels predictably amazing.

Kyle can embrace the Dark Side to some extent. But as usually happens with these sorts of nods toward free will in games with mostly linear plot lines, it just ends up meaning that he foils the plans of the other Dark Jedi for his own selfish purposes rather than for selfless reasons. Cue the existentialist debates…

I’m going to couch a confession inside of my praise at this point: Jedi Knight is the first FPS I’ve attempted whilst writing these histories that I’ve enjoyed enough to play right through to the end. It took me about a week and a half of evenings to finish, the perfect length for a game like this in my book. Obviously, the experience I was looking for may not be the one that other people who play this game have in mind; those people can try turning up the difficulty level, ferreting out every single secret area, killing every single enemy, or doing whatever else they need to in order to find the sort of challenge they’d prefer. They might also want to check out the game’s expansion pack, which caters more to the FPS hardcore by eliminating the community-theater cut scenes and making everything in general a little bit harder. I didn’t bother, having gotten everything I was looking for out of the base game.

That said, I do look forward to playing more games like Jedi Knight as we move on into a slightly more evolved era of the FPS genre as a whole. While I’m never likely to join the hardcore blood-and-guts contingent, action-packed fun like this game offers up is hard for even a reflex-challenged, violence-ambivalent old man like me to resist.


Epilogue: The Universe Shrinks

Students of history like to say that every golden age carries within it the seeds of its demise. That rings especially true when it comes to the heyday of the Expanded Universe: the very popularity of the many new Star Wars novels, comics, and games reportedly did much to convince George Lucas that it might be worth returning to Star Wars himself. And because Lucas was one of the entertainment world’s more noted control freaks, such a return could bode no good for this giddy era of fan ownership.

We can pin the beginning of the end down to a precise date: November 1, 1994, the day on which George Lucas sat down to start writing the scripts for what would become the Star Wars prequels, going so far as to bring in a film crew to commemorate the occasion. “I have beautiful pristine yellow tablets,” he told the camera proudly, waving a stack of empty notebooks in front of its lens. “A nice fresh box of pencils. All I need is an idea.” Four and a half years later, The Phantom Menace would reach theaters, inaugurating for better or for worse — mostly for the latter, many fans would come to believe — the next era of Star Wars as a media phenomenon.

Critics and fans have posited many theories as to why the prequel trilogy turned out to be so dreary, drearier even than clichés about lightning in a bottle and not being able to go home again would lead one to expect. One good reason was the absence from the editing room of Marcia Lucas, whose ability to trim the fat from her ex-husband’s bloated, overly verbose story lines was as sorely missed as her deft way with character moments, the ones dismissed by George as the “dying and crying” scenes. Another was the self-serious insecurity of the middle-aged George Lucas, who wanted the populist adulation that comes from making blockbusters simultaneously with the respect of the art-house cognoscenti, who therefore decided to make the prequels a political parable about “what happens to you if you’ve got a dysfunctional government that’s corrupt and doesn’t work” instead of allowing them to be the “straightforward, wholesome, fun adventure” he had described the first Star Wars movie to be back in 1977. Suffice to say that Lucas displayed none of Timothy Zahn’s ability to touch on more complicated ideas without getting bogged down in them.

But whatever the reasons, dreary the prequels were, and their dreariness seeped into the Expanded Universe, whose fannish masterminds saw the breadth of their creative discretion steadily constricted. A financially troubled West End Games lost the license for its Star Wars tabletop RPG, the Big Bang that had gotten the universe expanding in the first place, in 1999. In 2002, the year that the second of the cinematic prequels was released, Alan Dean Foster, the author of the very first Star Wars novel from 1978, agreed to return to write another one. “It was no fun,” he remembers. The guidance he got from Lucasfilm “was guidance in the sense that you’re in a Catholic school and nuns walk by with rulers.”

And then, eventually, came the sale to Disney, which in its quest to own all of our childhoods turned Star Wars into just another tightly controlled corporate property like any of its others. The Expanded Universe was finally put out of its misery once and for all in 2014, a decade and a half past its golden age. It continues to exist today only in the form of a handful of characters, Grand Admiral Thrawn among them, who have been co-opted by Disney and integrated into the official lore.

The corporate Star Wars of these latter days can leave one longing for the moment when the first film and its iconic characters fall out of copyright and go back to the people permanently. But even if Congress is willing and the creek don’t rise, that won’t occur until 2072, a year that I, and presumably many of you as well, may not live to see. In the meantime, we can still use the best artifacts of the early Expanded Universe as our time machines for traveling back to Star Wars’s last age of innocent, uncalculating fun.

Where did it all go wrong?



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: The books Rocket Jump: Quake and the Golden Age of First-Person Shooters by David L. Craddock, How Star Wars Conquered the Universe by Chris Taylor, and The Secret History of Star Wars by Michael Kaminski. Computer Gaming World of May 1995, October 1996, January 1997, December 1997, and March 1998; PC Zone of May 1997; Retro Gamer 138; Chicago Tribune of May 24 2017.

Online sources include Wes Fenlon’s Dark Forces and Jedi Knight retrospective for PC Gamer. The film George Lucas made to commemorate his first day of writing the Star Wars prequels is available on YouTube.

Jedi Knight is available for digital purchase at GOG.com. Those who want to dive deeper may also find the original and/or remastered version of Dark Forces to be of interest.

Footnotes
1 A reworked and remastered version of Dark Forces has recently been released as of this writing; it undoubtedly eases some of the issues I’m about to describe. These comments apply only to the original version of the game.
 


Age of Empires (or, How Microsoft Got in on Games)

We don’t have a strategy to do a $200 game console that is a direct competitor to what Nintendo, Sega, and Sony are doing…

— Bill Gates, June 1996

It’s hard to overstate the scale of the real-time-strategy deluge of the late 1990s. For a period of several years, it seemed that every studio and publisher in the industry was convinced that duplicating the gameplay of Blizzard’s Warcraft and Westwood’s Command & Conquer franchises, those two most striking success stories in the business of computer games since Myst and DOOM, must surely be the digital equivalent of printing money. In the fall of 1997, Computer Gaming World magazine counted no fewer than 40 RTS’s slated for release during the coming Christmas season alone, to go along with the “nearly 20” that had already appeared with names other than Warcraft or Command & Conquer on their boxes. With no other obvious way of sorting through the jumble, the magazine chose simply to alphabetize the combatants in this “biggest clone war to hit the PC,” resulting in a list that began with 7th Legion and ended with Waterworld.

If those names don’t ring any bells with you today, you aren’t alone. While many of these games were competently made by genuinely enthusiastic developers, few mass movements in gaming have ever felt quite so anonymous. Although the drill of collecting resources, building up an army, and attacking your computerized or human enemies in real time struck a lot of people as a whole lot of fun — there was, after all, a reason that Warcraft and Command & Conquer had become so popular in the first place — it was hard for the creators of the next RTS generation to figure out what to do to set their games apart, whilst also staying within a strict set of design constraints that were either self-imposed or imposed upon them by their conservative publishers. Adventure games, CRPGs, and first-person shooters had all been the beneficiaries or victims of similar gluts in the past, but they had managed to explore a larger variety of fictional contexts if not always gameplay innovations. When it came to RTS’s, though, they all seemed to follow in the footsteps of either the high-fantasy Warcraft or the techno-futuristic Command & Conquer in their fictions as well as their gameplay. This can make even those members of the RTS Class of 1997 that are most fondly remembered today, such as the fantasy Myth or the science-fictional Total Annihilation, feel just a little generic to the uninitiated.

One game from this group, however, did stand out starkly from the crowd for the editors of Computer Gaming World, as it still does in the memories of gamers to this day. Whilst sticking to the tried and true in many of its mechanics, Age of Empires dared to try something different in terms of theme, mining its fiction from the real cultures of our planet’s ancient past. It played relatively straight with history, with no magic spells or aliens in sight. This alone was enough to make Age of Empires a welcome gust of fresh air in a sub-genre that was already sorely in need of it.

Yet there was also something else that made it stand out from the pack. Although its developer was an unknown outfit called Ensemble Studios — one of many that were springing up like toadstools after a rain to feed the real or perceived hunger among gamers for more, more, more RTS’s — its publisher was, of all companies, Microsoft, that one name in software that even your grandparents knew. The arrival of Age of Empires signaled a new era of interest and engagement with games by the most daunting single corporate power in the broader field of computing in general. If anyone still needed convincing that computer games were becoming mainstream entertainments in every sense of the phrase, this ought to have been enough to do the trick. For, whatever else one could say about Microsoft, it was not in the habit of exploring the nooks and crannies of the software market — not when there was a sprawling middle ground where it could plant its flag.



The man behind Ensemble Studios was one Tony Goodman, whose life’s direction had been set in the sixth grade, when his father, a professor of management science at Southern Methodist University in Dallas, Texas, brought home a terminal that could be used to connect to the university’s mainframe. “He would give me the same problems that he had given his students,” says Goodman. “My father would say, ‘Tony, I have a puzzle for you.’ Immediately, I was sucked in for the rest of the day. I always looked at the problems as puzzles. I loved puzzles and games, so I just couldn’t get enough. It came to me naturally. I remember saying, ‘This is it. This is what I’m going to do with the rest of my life!'”

In an ironic sense, Goodman’s career path would be the opposite of that of the typical game developer, who joins the world of more plebeian software development only after getting burnt out by the long hours and comparatively low pay in games. Long before starting Ensemble Studios, Goodman made a career for himself in the information-technology departments of the banking industry, specializing, like his father before him, in data-visualization tools and the like that could aid executive-level decision-making. Along the way, he learned much that he would later be able to apply to games — for, he says, good games have much in common with good software of any other stripe: “One of the most valuable things that I learned about developing software was that, for users to be productive, the software had to be fun to use. The key is to keep people entertained long enough to be rewarded. This also happens to be the fundamental dynamic of games and, indeed, all human experiences.”

In 1989, Tony Goodman and three partners formed Ensemble Corporation — not to be confused with Ensemble Studios — in his garage. Two years later, they released Command Center, a user-friendly front-end for Borland’s Paradox database system that could “automate queries, reports, forms, and graphics.” The company exploded from there, becoming a darling of the Forbes and Inc. set.

Throughout his years in business software, Goodman never lost touch with that younger version of himself who had been drawn to computers simply because he found them so wonderfully entertaining. He and his older brother Rick, who joined Ensemble Corporation as a programmer shortly after the release of Command Center, were lifelong board and computer gamers, watching at first-hand the aesthetic and technical evolution of the latter, parallel software industry. They found a kindred soul in another Ensemble programmer named Angelo Laudon, who, like them, could appreciate the higher salaries and profit margins in productivity software but nonetheless felt a longing to engage with his biggest passion. “We would talk about games until the early hours of the morning,” says Tony Goodman. “I loved the business of developing software, but I wanted to create products that everyone would tell their friends about. I wanted to create a pop-culture phenomenon. If you want to create software that people really want, developing videogames places you at the center of the universe.”

He realized that computer games had hit a watershed moment when Microsoft announced Windows 95, and with it DirectX, a software subsystem that would allow people to install and run even cutting-edge games as effortlessly as any other type of software, without the travails of the bespoke IRQ and DMA settings and memory managers that had been such a barrier to entry in the past. If he ever wanted to try to make games of his own, he knew, the time to get started was now, between the market’s expansion and the inevitable market saturation that would follow. Rick Goodman remembers how one day his brother

walks into work, assembles the team of database programmers, and says, “Would any of you guys rather be making games than database applications?”

I think people were caught off-guard. We were looking around the room, like, “Is this a trick question?” But I raised my hand, and Angelo Laudon raised his. Tony was serious. He said, “I’m going to pull you guys aside and we’ll make a game.” I thought that was awesome. I said, “Okay! What kind of game?” None of us had any idea.

For months thereafter, they continued to do their usual jobs during the day, then gathered again in the evening to hash through ideas and plans. During one of these sessions, Rick suddenly brought up a name that Tony hadn’t heard in a long, long time: Bruce Shelley, an older fellow with whom the brothers had played a lot of board games during their pre-teen and teenage years. Shelley worked in computer games now, said Rick — had in fact assisted Sid Meier with the design of Railroad Tycoon and Civilization. “Maybe — maybe —  he’s not busy.”

And lo and behold, it turned out that he wasn’t. After finishing Civilization, Shelley had left Meier and his other colleagues at MicroProse Software in order to follow his new wife, a banking executive, to Chicago, where she’d secured a job that was far more lucrative than any that he’d ever held. He was writing gaming strategy guides out of his home office when Tony Goodman called him up one day out of the blue: “I hadn’t heard from him in fifteen years, and here he is with his own business in Dallas, doing software for banks, and he’s got guys who want to make computer games. We had these long conversations about what it takes to make a game. I told my wife, ‘I think this guy’s going to start a game company.’ And finally he did call me and say, ‘We are going to start a game company, and we want you to be involved.'” Shelley agreed to fly down to Dallas to talk it over.

But they still weren’t sure what kind of game they wanted to make. Then, as Shelley remembers, “One day one of the guys walked in with Warcraft. He said, ‘We’ve got to make this. We’ve got to make one of these. This is blowing the socks off the gaming world right now.'” It all came together quickly after that. Why not combine the hottest current trend in gaming with the last game Shelley had helped to make, which was already widely regarded as a hallowed classic? “The idea was, let’s take the ideas of Civilization — an historical game — and do a Warcraft/Command & Conquer-style RTS.”

This, then, was the guiding ethos of the project, the first line of any pitch document to a potential publisher: to combine the fast action of the typical RTS with at least some of the more expansive scope of Civilization. You would guide a tribe — in time, a full-fledged civilization — through the Paleolithic Age, the Neolithic Age, the Bronze Age, and the early stages of the Iron Age (where this particular voyage through history would end, leaving the table set for a sequel). Along the way, you would research a variety of technologies and build ever more impressive structures, some of which would not be strictly military in application, such as granaries and temples. There would even be a version of Wonders of the World, those grandest of all Civilization achievements, waiting to be built. But the whole experience would be compressed down into the typical RTS time frame of an hour or so, as opposed to the dozen or more hours it might take to get through a full game of MicroProse’s Civilization.

Initially titled Dawn of Man, the game evolved slowly but steadily betwixt and between the usual daily routine at Ensemble Corporation. The other Ensemble principals took Tony Goodman’s after-hours vanity project with a shrug. They didn’t really understand it, but he had worked hard for a long time and was entitled to it, they supposed, in the same way that other successful entrepreneurs were entitled to go out and buy themselves a Porsche.

When Tony Goodman started shopping the game to prospective publishers, it already looked and played decently well. He was growing more and more convinced that he had a winner on his hands. Yet even he was surprised at his good fortune when he made a cold call to Stuart Moulder, a middle manager at Microsoft’s relatively little-remarked games division, and captured the interest of the biggest fish in the software sea.

Historically speaking, Microsoft’s relationship to games had long been a tentative one. It was true that, in the very early days of the company, when it was known chiefly as a peddler of 8-bit BASIC implementations, Microsoft had published a fair number of games. (The most important of these was probably its ethically dodgy commercial version of Will Crowther and Don Woods’s classic Adventure, the game that lent its name to a whole genre.) Even after it signed the landmark deal to provide IBM’s first mass-market personal computer with an operating system — a deal that resulted in the ever-evolving PC standard that remains dominant to this day — Microsoft continued to dabble in games for a while. There was a good reason for this; it’s often forgotten today that IBM and Microsoft first envisioned that original IBM PC becoming a fixture in homes as well as offices. But when home users didn’t embrace the platform as rapturously as the partners had hoped, even as Corporate America took it to its bosom more quickly than they had ever dreamed, Microsoft abandoned games, thanks not only to the bigger profits that could be earned in operating systems and business software but also to a fear of the stigma that surrounded games and their makers in the more “serious” software circles of the 1980s. The one exception to Microsoft’s no-fun-allowed policy was — at least according to some people’s definition of “fun” — Flight Simulator, an early product for the IBM PC that turned into a minor cash cow for the company; like Microsoft’s operating systems and productivity packages, it was a program that people proved willing to buy all over again every few years, whenever it was updated to take advantage of the latest graphics cards and microprocessors. Its focus on the pedantic details of flying a real civilian airplane — the complications of VOR navigation systems and the insidious threat of carburetor ice were implemented, but absolutely no guns were to hand — presumably made it acceptable in Microsoft’s staid software lineup.

The release in 1990 of the comparatively approachable, user-friendly Windows 3.0 operating environment marked the moment when more conventional games began to become less anathema to Microsoft once again. An implementation of the hoary old card game Solitaire was among this latest Windows’s standard suite of software accessories. As easy to pick up as it was to put down, it became the perfect time killer or palate cleanser for hundreds of millions of office workers all over the world, enough to make it quite probably the most popular videogame ever in terms of sheer number of person-hours played. Microsoft went on to release four “Entertainment Packs” of similarly simple games for the Windows 3.x desktop, and to include a clever Battleship variant called Minesweeper in 1992’s Windows 3.1. Microsoft was slowly loosening up; even Bill Gates confessed to a Minesweeper addiction.

The company now began to dabble in more ambitious games, the kind that could stand on their own rather than needing to be packaged a half-dozen to a box. There came a golf game for the corporate set, and then there came Space Simulator, an attempt to do for armchair astronauts what Flight Simulator had for so long been doing for armchair aviators. But the big shift came with Windows 95, the first (and arguably only) Microsoft operating system whose arrival would become a full-fledged pop-culture event. That old dream of the PC as a standard for the home as well as the office was coming true in spades by now; amidst the hype over multimedia and the World Wide Web, ordinary people were buying computers to use in their homes in unprecedented numbers. Microsoft was determined to serve their wishes and needs just as they had for so long been serving those of the corporate world. One result of this determination was DirectX, which allowed Microsoft’s customers to install and play audiovisually rich, immersive games without having to learn the arcane mantras of MS-DOS or memorize every detail of a computer’s hardware configuration. Another, less initially prominent one was a more empowered games division, which was for the first time given permission to blow through the musty vibes of office life or educational value that had clung to Microsoft’s earlier entertainment efforts and give the hardcore gamers what they really wanted.

At the same time, though, it should be understood that even by this point game publishing had not become a major priority at Microsoft. Far from it. There remained plenty of people inside the company who didn’t think getting into that business was a good idea at all, who feared that it would be perceived as a conflict of interest by the very game publishers Microsoft was trying to convince to embrace DirectX, or who thought the potential rewards just weren’t worth the distraction; after all, even if Microsoft managed to publish the most popular computer game in the world, those revenues would still pale in comparison to the Windows and Office juggernauts. Among the skeptics who did no more than tolerate the notion of Microsoft peddling games was Bill Gates himself.

The games division was in the keeping of one Tony Garcia at this time. One day a manager a rung below him on the hierarchy, a “talent scout” named Stuart Moulder whom he had explicitly tasked with finding hot “gamer’s games” to sway the naysayers and reinvigorate the division, knocked on his door to say that he’d just seen an RTS work-in-progress by a brand-new studio that was being bootstrapped out of a business-software maker. Yes, Moulder rushed to add, he understood that no part of that sentence sounded overly promising at first blush. But the game itself looked surprisingly good, he said. Really, really good. This could be the Big One they’d been waiting for.

So, Garcia invited the Dawn of Man crew to come up to Microsoft’s headquarters in Redmond, Washington, and show him what they had. And he too liked what he saw enough to want to put the Microsoft logo on it.

Microsoft was an infamously tough negotiator, but Tony Goodman was no slouch in that department either. “Negotiation is often about compromise,” he says. “However, negotiating with Microsoft is more often about leverage. Microsoft negotiates hard. They don’t respect you unless you do the same.” Goodman gained some of his needed leverage by showing the game to other publishers as well — Electronic Arts, Hasbro, even Discovery Channel Multimedia (who were attracted by the game’s interest in real history) — and showing Microsoft the letters they had sent him to express their very real interest. Meanwhile Microsoft’s marketing department had already come up with the perfect name for a game whose historical time frame extended well beyond the Dawn of Man: Age of Empires. Having invented the name, Microsoft insisted on owning the trademark. Goodman wasn’t able to move the beast from Redmond on this point, but he did secure a royalty rate and other contract terms that he could live with.

In February of 1996, Goodman’s moonlighting venture was transformed from a skunk works inside a business-software maker to a proper games studio at long last, via official articles of incorporation. That said, it wouldn’t do to exaggerate the degree of separation even now: Ensemble Studios was still run out of the office of Ensemble Corporation. It had about ten employees in the beginning. Angelo Laudon was listed as lead programmer and Rick Goodman as lead designer, despite the latter’s complete lack of experience in that field. Fortunately, Bruce Shelley had agreed to join up as well, coming down to Dallas about one week of every month and working from home in Chicago the rest of the time.

Soon after Age of Empires became a real project from a real studio, Tony Garcia left Microsoft. He was replaced by Ed Fries, a veteran member of the Office team who had programmed games for 8-bit Atari computers before starting at Microsoft in 1986. When he agreed to take this new job in games, he was told by his colleagues that he was committing career suicide: “Why would you leave Office, one of the most important parts of this company, to go work on something nobody cares about?”

For all their apparent differences in size and clout, Microsoft and Ensemble Corporation were in an oddly similar boat; both were specialists in other kinds of software who were trying to break into games. Or rather, a handful of passionate individuals within each of the companies was, while everyone else looked on with bemused indifference. In an odd sort of way, though, said indifference was the passionate individuals’ superpower. If the new RTS failed utterly, it wouldn’t show up on the ledgers of Microsoft or Ensemble Corporation as anything more than a slight blip on an otherwise healthy bottom line. This lack of existential stakes — an extreme rarity in an industry whose instability is legendary — was greatly to the game’s benefit. With no pressure to have it finished by such-and-such a date or else, the developers could fuss over it until they got every detail just exactly perfect. Sticking close to the RTS playbook even in his choice of metaphors, Rick Goodman describes time in game development as “a resource, like collecting wood. The more of it you have, the better off you are. We took a lot of time. A lot of time. Most companies would not have survived that length of time.”

During that time, the game got played. Over and over and over and over again, it got played, not only by the Ensemble crew but by lots of folks at Microsoft, including the experts at that company’s “usability laboratory.” Microsoft brought in people from the street who had never played an RTS before, who didn’t even know what those initials stood for, and had them run through the early tutorial missions to see if they communicated what they were supposed to. Rinse and repeat, rinse and repeat. Age of Empires was tested and tweaked no differently than it would have been if it were a $1000 mission-critical software application destined to be the fodder of corporate purchasing departments all over the world.

For this was to be a broad-spectrum computer game, beamed straight at the center of the mass market but wide and diffuse enough to capture an unusual variety of playing styles and priorities. Bruce Shelley has spoken often since of the value of putting “multiple gaming experiences within one box.”

To reach a broad audience, include a variety of game types and adjustable game parameters that combine in different ways to create a range of quite different gaming experiences, all within the same game. Examples of different gaming experiences with the Age of Empires games are multiplayer death matches, single-player campaigns, random-map games, cooperative-play games, and Wonder races. Victory conditions, map types, and level-of-difficulty settings are examples of parameters that can be adjusted to create different gaming experiences.

We want the smartest kid in junior-high school (a hardcore gamer) telling his or her friends that our game is his or her favorite right now. When those friends buy our game, they probably won’t be able to compete with the star, but by adjusting those parameters they can still find a type of game that suits them and have fun. The average kids and the smart kids can both enjoy our game, although they play quite different parts of it.

When we provide a variety of gaming experiences within the single box, we increase the number of people who can buy our game and be happy with it. Each of these satisfied customers becomes in turn a potential evangelist.

Although I wouldn’t directly equate being “hardcore” when it comes to games with being “smarter” than those who are not in the way that Shelley (perhaps inadvertently) does here, the larger point is well-taken. This was something that the industry in general was finally coming to realize by the latter 1990s, probably more belatedly than it ought to have done. By making it possible to play the same game in a variety of different ways, you could dramatically expand the size of that game’s audience. You did so by including varying difficulty levels and speed settings, to make the game as easy or hard, as relaxing or frenetic, as any particular player wished. And you did so by including different modes of play: story-driven campaigns, a single-player skirmish mode, online multiplayer contests. It might take additional time and money to make all of these things, especially if you were determined, as you ought to be, to make them all well, but it remained vastly cheaper than making a whole new game. Most older games dictate to you how you must play them; newer ones ask you how you would like to play them. And this has been, it seems to me, an immensely positive development on the whole, broadening immeasurably the quantity and types of people who are able to enjoy games — both each individual game that appears and gaming in the aggregate.

Certainly Age of Empires understood all of this; in addition to selectable difficulty levels and speed settings, it includes campaigns, pre-crafted singleton maps for single- or multiplayer sessions, randomly generated maps, even a scenario and campaign editor for those who want to turn their hobby into a truly creative pursuit. Anyone who has been reading these histories of mine for a while will surely know that the RTS is far from my favorite sub-genre of games. Yet even I found Age of Empires surprisingly easy to get along with. I turned the difficulty and speed down and approached the campaigns as an interactive whirlwind tour of the ancient world; as readers of this site’s companion The Analog Antiquarian are well aware, that is a subject I can never get enough of. I have a friend, on the other hand, who tells me that he can’t remember ever even starting a campaign back in the day, that he jumped right into multiplayer on Day One to engage in ferocious zero-sum contests with his friends and never looked back. And that’s fine too. Different strokes for different folks.

But since I am the person I am, I just have to say a bit more about the campaigns. There are actually four of them in all, chronicling the evolution of ancient Egypt, Greece, Babylon, and Japan. (An expansion pack that appeared about a year after the base game includes three more campaigns that deal exclusively with the rise and fall of Rome.) The campaigns were a labor of love for the lifetime history buff Bruce Shelley, as were the 40-plus pages in the manual dedicated to the twelve different playable civilizations, whose ranks include not only the aforementioned but also such comparatively obscure cultures as the Minoans, the Phoenicians, and even the Shang Chinese, all with strengths and weaknesses that stem from what we know — in some cases, what little we know — of their real-world inspirations.

“We really only needed one grand theme for a civilization that was historical enough to make people believe,” says Rick Goodman. “Like, they know Rome was good at X and the Greeks were good at Y.” For all that Age of Empires is no one’s idea of a studious exploration of history, it does have a little bit more on its mind than the likes of Warcraft or Command & Conquer. At its best, it can make you ponder where and how human civilization came to be, starting as it does with the bedrock resources, the food and wood and, yes, stone out of which everything that followed was built. I’m sure it must have sent at least a few of its young players scurrying to the library to learn a little more about our shared heritage. Perhaps it managed to spark an enduring passion for history in some of them.

The graphics style was an additional key to Age of Empires’s appeal. Bruce Shelley:

The sun is always shining in Age of Empires. It was always a bright, inviting world that you wanted to know more about. I’ve always had problems with dark, forbidding games. You’re crushing your audience — you’re really narrowing who is going to consider buying a game when you make it ugly, dark, and forbidding. Maybe it appeals to a certain audience, but…

When you set out to develop a PC game, the potential market is everyone on Earth who owns a PC. Once you begin making decisions about your game (gory, sci-fi, RTS, shooter), you begin losing potential customers who are not interested in your topic, genre, or style. Commercially successful games hold onto [a] significant share of that market because they choose a topic, genre, and style that connect with a broad audience. The acceptance of the PC into more world communities, different age groups, and by women means that games do not need to be targeted, and perhaps should not be targeted, solely to the traditional gaming audience of young males.

Age of Empires inevitably comes down to war in the end, as do most computerized depictions of history. But the violence is kept low-key in comparison to many another RTS bloodbath, and there is at least a nod in the direction of a non-conquest victory, an equivalent to sending a spaceship off to Alpha Centauri as a capstone to a game of Civilization: if you can build yourself a Wonder of the World in Age of Empires, then defend it for a period of time against all comers, you are declared the victor then and there. A “religious” victory can also be achieved, by collecting all of the religious artifacts on the map or holding all of its sacred sites for a period of 2000 years — about ten minutes in game time. There are even some nods toward diplomacy, although in practice becoming allies usually just means you’ve agreed not to fight each other quite yet.

I don’t want to overstate the scale of the game’s innovations. At the end of the day, Age of Empires remains an RTS in the classic mold, with far more in common with Warcraft and Command & Conquer than it has with Civilization. It’s an extremely well-made derivative work with a handful of fresh ideas, not a revolution from whole cloth. Its nods in the direction of Civilization are no more than that; it’s not, that is to say, the full-blown fusion that may have been Bruce Shelley’s original vision for it. Compressing into just one hour the first 10,000 to 12,000 years of human civilization, from the dawn of sedentary farming to the splendors of high antiquity, means that lots of the detail and texture that make the game called Civilization so compelling must get lost. Even if you’re a story guy like me, you’ll no longer be marveling that you’ve brought writing, irrigation, or religion to your little group of meeples after you’ve played your first map or two; those things will have become mere rungs on the ladder to the victory screen, the real point of the endeavor. In a rare lukewarm review, GameSpot‘s T. Liam MacDonald put his finger on some of the places where Age of Empires’s aspirations toward Civilization don’t live up to the reality of its well-worn RTS template.

I wish that Age of Empires was what it claimed to be: Civilization with a Warcraft twist. Instead, it is Warcraft with a hint of Civilization. That’s all well and good, but it places it firmly in the action-oriented real-time combat camp, rather than in the high-minded empire-building [camp] of Civilization. The result is Warcraft in togas, with slightly more depth but a familiar feel.

I too must confess that I did eventually get bored with the standard RTS drill of collect, build, and attack that is the basis of almost every scenario. As the scenarios got harder, I gradually lost the will to put in the effort it would take to beat them; I wound up quitting without regrets about halfway through the second campaign, satisfied that I’d had my measure of fun and certain that life is too short to continue with entertainments of any type that you no longer find entertaining. Still, I won’t soon forget Age of Empires, and not just because its theme and atmosphere make it stand out so from the crowd. I would be the last person to deny that it’s an incredibly polished product from top to bottom, a game that was clearly fussed over and thought about to the nth degree. It exudes quality from its every virtual pore.


The Age of Empires intro movie displays some of the game’s contradictory impulses. The scenes of combat are neither better nor worse than those of any other game that attempts to make war seem glorious rather than terrible. Yet the weathered ancient stone raises other, more poignant thoughts about the cycles of life, time, and civilization. “For dust you are, and to dust you shall return.”

Each campaign follows the historical development of the civilization in question to whatever extent the demands of gameplay allow.



In commercial terms, Age of Empires was a perfect storm, a great game with wide appeal combined with a lot of marketing savvy and the international distributional muscle of the biggest software publisher in the world. The principals from Ensemble remember a pivotal demonstration to Bill Gates, whose reservations about Microsoft’s recent push into games were well-known to all of them. He emerged from his first first-hand encounter with Age of Empires calling it “amazing,” assuring it the full support of the Microsoft machine.

While Microsoft’s marketing department prepared an advertising campaign whose slick sophistication would make it the envy of the industry, Tony Goodman deployed a more personal touch, working the phones at the big gaming magazines. He wasn’t above using some psychological sleight-of-hand to inculcate a herd mentality.

I built relationships with the most recognized gaming magazines. I invested a lot of time with key editors, seeding the idea that Age of Empires was “revolutionary” and would become a “phenomenon.” They may not have believed me at first, but my goal wasn’t to convince them. My goal was to plant wondrous possibilities in their brains and create anticipation, like Christmas for kids.

When the early previews began appearing, they were using the terms that we seeded: “revolutionary” and “phenomenon.” These early opinions were then picked up and echoed by other publications, creating a snowball effect. Eventually, all the publications would get on board with this message, just so they didn’t look out of touch.

Sure enough, in the Computer Gaming World RTS roundup with which I opened this article, Age of Empires was given pride of place at the top of the otherwise alphabetized pile, alongside just one august companion: Starcraft, Blizzard’s long-awaited follow-up to Warcraft II, which was to try the science-fiction side of the usual RTS fantasy/science-fiction dichotomy on for size. As it happened, Starcraft would wind up slipping several months into 1998, leaving the coming yuletide season free to become the Christmas of Age of Empires.

So, while Age of Empires may not have quite lived up to its “revolutionary” billing in gameplay terms, it definitely did become a marketplace phenomenon after its release in October of 1997, demonstrating to everyone what good things can happen when a fun game with broad appeal is combined with equally broad and smart marketing. It doubled Microsoft’s own lifetime sales projections of about 400,000 units in its first three months; it would probably have sold considerably more than that, but Microsoft had under-produced based on those same sales predictions, leaving the game out of stock on many store shelves for weeks on end while the factories scrambled to take up the slack. Age of Empires recovered from those early travails well enough to sell 3 million units by 1999, grossing a cool $120 million. It left far behind even those other members of the RTS Class of 1997 that did very well for themselves by the conventional standards of the industry, such as Myth and Total Annihilation. In fact, Age of Empires and the franchise that it spawned came to overshadow even Command & Conquer, taking the latter’s place as the only RTS series capable of going toe-to-toe with Blizzard’s Warcraft and Starcraft.

And yet that is only a part of Age of Empires’s legacy — in a way, the smaller part. In the process of single-handedly accounting for half or more of the Microsoft games division’s revenue during the last couple of years of the 1990s, Age of Empires changed Microsoft’s attitude about games forever. The direct result of that shift in attitude would be a little product called the Xbox. “I believe there were two successes that had to happen at Microsoft in order for the Xbox console to happen,” says Stuart Moulder. “One was DirectX, which showed that we had the chops on the operating-system side to deliver technology that made it possible to build great games. Then, on the other side, we had to show that we had the ability as a first-party publisher to deliver a hit game aimed at core gamers — because that’s [the] people who buy and play console games.” Thanks to Age of Empires, gaming would be overlooked no more at Microsoft.





Sources: The book Gamers at Work by Morgan Ramsay; Computer Gaming World of October 1997, November 1997, and January 1998; Next Generation of June 1996; InfoWorld of April 22 1991.

Online sources include Soren Johnson’s interview with Bruce Shelley, Scott Stilphen’s interview with Ed Fries, David L. Craddock’s long ShackNews series on Microsoft’s gaming history (especially the chapter dealing directly with Age of Empires), Thomas Wilde’s profile of Ed Fries for GeekWire, Richard C. Moss’s history of Age of Empires for Ars Technica, a Microsoft press release from February of 1998, and T. Liam MacDonald’s vintage review of Age of Empires for GameSpot.

Finally, the box of documents that Bruce Shelley donated to the Strong Museum of Play was a valuable resource.

A “Definitive Edition” of the original Age of Empires is available as a digital purchase on Steam.

 
 


The Rise of POMG, Part 4: A World for the Taking

Just as the Ultima Online beta test was beginning, Electronic Arts was initiating the final phase of its slow-motion takeover of Origin Systems. In June of 1997, the mother ship in California sent down two Vice Presidents to take over completely in Texas, integrate Origin well and truly into the EA machine, and end once and for all any semblance of independence for the studio. Neil Young became Origin’s new General Manager on behalf of EA, while Chris Yates became Chief Technical Officer. Both men were industry veterans.

Appropriately enough given that he was about to become the last word on virtual Britannia, Neil Young was himself British. He attributes his career choice to the infamously awful English weather. “There are a lot of people in the games industry that come from the UK,” he says. “I think it’s because the weather is so bad that you don’t have a lot to do, so you either go into a band or teach yourself to program.” He chose the latter course at a time when computer games in Britain were still being sold on cassette tape for a couple of quid. After deciding to forgo university in favor of a programming job at a tiny studio called Imagitec Design in 1988, he “quickly realized there were more gifted engineers,” as he puts it, and “moved into producing.” Having made a name for himself in that role, he was lured to the United States by Virgin Interactive in 1992, then moved on to EA five years later, which organization had hand-picked him for the task of whipping its sometimes wayward and lackadaisical stepchild Origin into fighting shape.

Chris Yates had grown up amidst the opposite of English rain, hailing as he did from the desert gambler’s paradise Las Vegas. He was hired by the hometown studio Westwood Associates in 1988, where he worked as a programmer on games like Eye of the Beholder, Dune II, and Lands of Lore. In 1994, two years after Virgin acquired Westwood, he moved to Los Angeles to join the parent company. There he and Young became close friends as well as colleagues, such that they chose to go to EA together as a unit.

The two were so attractive to EA thanks not least to an unusual project which had occupied some of their time during their last year and a half or so at Virgin. Inspired by Air Warrior, the pioneering massively-multiplayer online flight simulator that had been running on the GEnie commercial online service since the late 1980s, a Virgin programmer named Rod Humble proposed in 1995 that his company invest in something similar, but also a bit simpler and more accessible: a massively-multiplayer version of Asteroids, the 1979 arcade classic whose roots stretched all the way back to Spacewar!, that urtext of videogaming. Neil Young and his friend Chris Yates went to bat for the project: Young making the business case for it as an important experiment that could lead to big windfalls later on, Yates pitching in to offer his exceptional technical expertise whenever necessary. Humble and a colleague named Jeff Paterson completed an alpha version of the game they called SubSpace in time to put it up on the Internet for an invitation-only testing round in December of 1995. Three months later, the server was opened to anyone who cared to download the client — still officially described as a beta version — and have at it.

SubSpace was obviously a very different proposition from the likes of Ultima Online, but it fits in perfectly with this series’s broader interest in persistent online multiplayer gaming (or POMG as I’ve perhaps not so helpfully shortened it). For, make no mistake, the quality of persistence was as key to its appeal as it was to that of such earlier featured players in this series as Kali or Battle.net. SubSpace spawned squads and leagues and zones; it became an entire subculture unto itself, one that lived in and around the actual battles in space. The distinction between it and the games of Kali and Battle.net was that SubSpace was massively — or at least bigly — multiplayer. Whereas an online Diablo session was limited to four participants, SubSpace supported battles involving up to 250 players, sometimes indulging in crazy free-for-alls, more often sorted into two or more teams, each of them flying and fighting in close coordination. It thus quickly transcended Asteroids in its tactical dimensions as well as its social aspects — transcended even other deceptively complex games with the same roots, such as Toys for Bob’s cult classic Star Control. That it was playable at all over dial-up modem connections was remarkable; that it was so much fun to play and then to hang out in afterward, talking shop and taking stock, struck many of the thousands of players who stumbled across it as miraculous; that it was completely free for a good long time was the icing on the cake.

It remained that way because Virgin didn’t really know what else to do with it. When the few months that had been allocated to the beta test were about to run out, the fans raised such a hue and cry that Virgin gave in and left it up. And so the alleged beta test continued for more than a year, the happy beneficiary of corporate indecision. In one of his last acts before leaving Virgin, Neil Young managed to broker a sponsorship deal with Pepsi Cola, which gave SubSpace some actual advertising and another lease on life as a free-to-play game. During that memorable summer of the Ultima Online beta test, SubSpace was enjoying what one fan history calls its “greatest days” of all: “The population tripled in three months, and now there were easily 1500-plus people playing during peak times.”

With the Pepsi deal about to run out, Virgin finally took SubSpace fully commercial in October of 1997, again just as Ultima Online was doing the same. Alas, it didn’t go so well for SubSpace. Virgin released it as a boxed retail game, with the promise that, once customers had plunked down the cash to buy it, access would be free in perpetuity. This didn’t prevent half or more of the existing user base from leaving the community, even as nowhere near enough new players joined to replace them. Virgin shut down the server in November of 1998; “in perpetuity” had turned out to be a much shorter span of time than anyone had anticipated.

As we’ve seen before in this series, however, the remaining hardcore SubSpace fans simply refused to let their community die. They put up their own servers — Virgin had made the mistake of putting all the code you needed to do so on the same disc as the client — and kept right on space-warring. You can still play SubSpace today, just as you can Meridian 59 and The Realm. A website dedicated to tracking the game’s “population statistics” estimated in 2015 that the community still had between 2000 and 3000 active members, of whom around 300 might be online at any given time; assuming these numbers are to be trusted, a bit of math reveals that those who like the game must really like it, spending 10 percent or more of their lives in it. That same year, fans put their latest version of the game, now known as Subspace Continuum, onto Steam for free. Meanwhile its original father Rod Humble has gone on to a long and fruitful career in POMG, working on Everquest, The Sims Online, and Second Life among other projects.



But we should return now to the summer of 1997 and to Origin Systems, to which Neil Young and Chris Yates came as some of the few people in existence who could boast not only of ideas about POMG but of genuine commercial experience in the field, thanks to SubSpace. EA hoped this experience would serve them well when it came to Ultima Online.

Which isn’t to say that the latter was the only thing they had on their plates: the sheer diversity of Young’s portfolio as an EA general manager reflects the confusion about what Origin’s identity as a studio should be going forward. There were of course the two perennials, Ultima — meaning for the moment at least Ultima Online — and Wing Commander, which was, as Young says today, “a little lost as a product.” Wing Commander, the franchise in computer gaming during the years immediately prior to DOOM, was becoming a monstrous anachronism by 1997. Shortly after the arrival of Young and Yates, Origin would release Wing Commander: Prophecy, whose lack of the Roman numeral “V” that one expected to see in its name reflected a desire for a fresh start on a more sustainable model in this post-Chris Roberts era, with a more modest budget to go along with more modest cinematic ambitions. But instead of heralding the dawn of a new era, it would prove the franchise’s swan song; it and its 1998 expansion pack would be the last new Wing Commander computer games ever. Their intended follow-up, a third game in the Wing Commander: Privateer spinoff series of more free-form outer-space adventures, would be cancelled.

In addition to Ultima and Wing Commander, EA had chosen to bring under the Origin umbrella two product lines that were nothing like the games for which the studio had always been known. One was a line of military simulations that bore the imprimatur of “Jane’s,” a print publisher which had been the source since the turn of the twentieth century of the definitive encyclopedias of military hardware of all types. The Jane’s simulations were overseen by one Andy Hollis, who had begun making games of this type for MicroProse back in the early 1980s. The other line involved another MicroProse alum — in fact, none other than Sid Meier, whose name had entered the lexicon of many a gaming household by serving as the prefix before such titles as Pirates!, Railroad Tycoon, Civilization, and Colonization. Meier and two other MicroProse veterans had just set up a studio of their own, known as Firaxis Games, with a substantial investment from EA, who planned to release their products under the Origin Systems label. Origin was becoming, in other words, EA’s home for all of its games that were made first and usually exclusively for computers rather than for the consoles that now provided the large majority of EA’s revenues; the studio had, it seemed, more value in the eyes of the EA executive suite as a brand than as a working collective.

Still, this final stage of the transition from independent subsidiary to branch office certainly could have been even more painful than it was. Neil Young and Chris Yates were fully aware of how their arrival would be seen down in Austin, and did everything they could to be good sports and fit into the office culture. Brit-in-Texas Young was the first to come with the fish-out-of-water jokes at his own expense — “I was expecting a flat terrain with lots of cowboys, cacti, and horses, so I was pleasantly surprised,” he said of Austin — and both men rolled up their sleeves alongside Richard Garriott to serve the rest of the company a turkey dinner at Thanksgiving, a longtime Origin tradition.

Neil Young and Chris Yates on the Thanksgiving chow line.

Young and Yates had received instructions from above that Ultima Online absolutely had to ship by the end of September. Rather than cracking the whip, they tried to cajole and josh their way to that milestone as much as possible. They agreed to attend the release party in drag if the deadline was met; then Young went one step farther, promising Starr Long a kiss on the lips. Yates didn’t go that far, but he did agree to grow a beard to commemorate the occasion, even as Richard Garriott, whose upper lip hadn’t seen the sun since he’d graduated from high school, agreed to shave his.

Young and Yates got it done, earning for themselves the status of, if not the unsung heroes of Ultima Online, then at least two among a larger group of same. The core group of ex-MUDders whose dream and love Ultima Online had always been could probably have kept running beta tests for years to come, had not these outsiders stepped in to set the technical agenda. “That meant trading off features with technology choices and decisions every minute of the day,” says Young. He brought in one Rich Vogel, who had set up and run the server infrastructure for Meridian 59 at The 3DO Company, to do the same for Ultima Online. In transforming Origin Systems into a maintainer of servers and a seller of subscriptions, he foreshadowed a transition that would eventually come to the games industry in general, from games as boxed products to gaming as a service. These tasks did not involve the sexy, philosophically stimulating ideas about virtual worlds and societies with which Raph Koster and his closest colleagues spent their time and which will always capture the lion’s share of the attention in articles like this one, but the work was no less essential for all that, and no less of a paradigm shift in its way.

So, the big day came and the deadline was met: Ultima Online shipped on September 24, 1997, three days before Meridian 59 would celebrate its first anniversary. The sleek black box was an end and a beginning at the same time. Young and Yates did their drag show, Starr Long got his kiss, and, most shockingly of all, Richard Garriott revealed his naked upper lip to all and sundry. (Opinions were divided as to whether the mangy stubble which Chris Yates deigned to grow before picking up his razor again really qualified as a beard or not.) And then everyone waited to see what would happen next.

A (semi-)bearded Chris Yates and a rare sight indeed: a clean-shaven Richard Garriott.

EA made 50,000 copies of Ultima Online and shipped them to stores all over the country, accompanying the release with a marketing campaign that was, as Wired magazine described it, of “Hollywood proportions.” The virtual world garnered attention everywhere, from CNN to The New York Times. These mainstream organs covered it breathlessly as the latest harbinger of humanity’s inevitable cyber-future, simultaneously bracing and unnerving. Flailing about for a way to convey some sense of the virtual world’s scope, The New York Times noted that it would take 38,000 computer monitors — enough to fill a football field — to display it in its entirety at one time. Needless to say, the William Gibson quotes, all “collective hallucinations” and the like, flew thick and fast, as they always did to mark events like this one.

Three weeks after the launch, 38,000 copies of Ultima Online had been sold and EA was spooling up the production line again to make another 65,000. Sales would hit the 100,000 mark within three months of the release. Such numbers were more than gratifying. EA knew that 100,000 copies sold of this game ought to be worth far more to its bottom line than 100,000 copies of any other game would have been, given that each retail sale hopefully represented only the down payment on a long-running subscription at $10 per month. For its publisher, Ultima Online would be the gift that kept on giving.

In another sense, however, the sales figures were a problem. When Ultima Online went officially live, it did so on just three shards: the Atlantic and Pacific shards from the beta test, plus a new Great Lakes one to handle the middle of the country. Origin was left scrambling to open more to meet the deluge of subscribers. Lake Superior came up on October 3, Baja on October 10, Chesapeake on October 16, Napa Valley on November 14, Sonoma on December 13, Catskills on December 22. And still it wasn’t enough.

Origin’s estimates of how many players a single server could reliably support proved predictably overoptimistic. But rather than dial back on the number of players they allowed inside, thereby ensuring that each of them who did get in could have a reasonably enjoyable experience, they kept trying to cover the gap between technical theory and reality by hacking their code on the fly. As a result, Ultima Online became simultaneously the most loved and most hated game in the country. When it all came together, it was magic for many of its players. But truth be told, that didn’t happen anywhere near as often as one might have wished in that first year or so. Extreme lag, inexplicable glitches, dropped connections, and even total server crashes were the more typical order of the day. Of course, with almost everyone who surfed the Web still relying on dial-up modems running over wires that had been designed to carry voices rather than computer data, slowdowns and dropped connections were a reality of daily online life even for those who weren’t attempting to log onto virtual worlds. This created a veneer of plausible deniability, which Origin’s tech-support people, for lack of any other suggestions or excuses to offer, leaned on perhaps a bit too heavily. After all, who could say for sure that the problem any individual player might be having wasn’t downstream from Origin’s poor overtaxed server?

Weaselly excuses like these led to the first great act of civil disobedience by the residents of Britannia, just a few weeks after the launch, when hundreds of players gathered outside Lord British’s castle, stripped themselves naked, broke into the throne room, drank gallons of wine, and proceeded to disgorge all of it onto Richard Garriott’s virtual furniture, whilst chanting in unison their demands for a better, stabler virtual world. The world’s makers were appalled, but also weirdly gratified. What better sign of a budding civic life could there be than a full-on political protest? “We were all watching and thinking it was a grand statement about the project,” says Richard Garriott. “As unhappy as they were about the game, they voiced their unhappiness in the context of the game.” Much of what happened inside Ultima Online during the first year especially had the same quality of being amazing for philosophers of virtual worlds to witness, but stressful for the practical administrators who were trying to turn this one into a sustainable money tree. The rub was that the two categories were combined in the very same people, who were left feeling conflicted to say the least.

The journals of hardcore gaming, hardly known for their stoicism in the face of hype on most days, were ironically more reserved and skeptical than the mainstream press on the subject of Ultima Online, perchance because they were viewing the virtual world less as a harbinger of some collective cyber-future and more as a game that their readers might wish to, you know, actually play. Computer Gaming World wittily titled its scathing review, buried on page 162 and completely unmentioned on the cover of the issue in question, simply “Uh-Oh.” Among the litany of complaints were “numerous and never-ending bugs, horrible lag time, design issues [that] lead to repetitive and time-consuming activities, and [an] unbalanced economy.” The magazine did admit that “Ultima Online could become a truly great game. But we can’t review potential, we can only review concrete product.” Editor-in-chief Johnny L. Wilson, for his part, held out little hope for improvement. “Ultima Online begins with hubris and ends in Greek tragedy,” he said. “The hubris is a result of being unwilling to learn from others’ mistakes. The tragedy is that it could have been so much more.” Randy Farmer, co-creator of the earlier would-be virtual world Habitat, expressed a similar sentiment, saying that “Origin seems to have ignored many of the lessons that our industry has learned in the last ten years of building online worlds. They’re making the same mistakes that first-time virtual-world builders always make.”

The constant crashes and long periods of unexplained down time associated with a service for which people were paying good money constituted a corporate lawyer’s worst nightmare — or a different sort of lawyer’s wet dream. One of these latter named George Schultz began collecting signatures from Origin’s most disgruntled customers within weeks, filing a class-action lawsuit in San Diego at the beginning of March of 1998. Exhibit A was the copy right there on the back of the box, promising “a living, growing world where thousands of real people discover real fantasy and adventure, 24 hours a day, every day of the year,” with all of it taking place “in real time.” This was, claimed Schultz, a blatant case of false advertising. “We’re not trying to tell anyone how to design a good or a bad game,” he said. “What it’s about is holding Origin and EA to the promises they made on the box, in their advertising, and [in] the manual. It’s about the misrepresentations they’ve made. A big problem with the gaming industry is that they think there are some special rules that only apply to them.”

Whatever the truth of that last claim, there was no denying that just about half of the learning curve of Ultima Online was learning to navigate around the countless bugs and technical quirks. For example, Origin took down each shard once per day for a backup and a “therapeutic” reboot that was itself a testament to just what a shaky edifice the software and hardware were. When the server came back up again, it restored the state of the world from the last backup. But said state was a snapshot in time from one hour before the server went down. There was, in other words, an hour every day during which everything you did in virtual Britannia was doomed to be lost; this was obviously not a time to go on any epic, treasure- and experience-point-rich adventures. Yet such things were documented nowhere; one learned them only through the proverbial school of hard knocks.

In their defense, Origin was sailing into completely uncharted waters with Ultima Online. Although there had been online virtual worlds before, dating all the way back to that first MUD of 1978 or 1979, none of them — no, not even Meridian 59 and The Realm — had been as expansive, sophisticated, and most of all popular as these shards of Britannia. Most of the hardware technologies that would give rise to the era of Web 2.0, from DSL in homes to VPS’s in data centers, existed only as blueprints; ditto most of the software. No one had ever made a computer game before that required this much care and feeding after the initial sale. And it wasn’t as if the group entrusted with maintaining the beast was a large one. Almost the entirety of the Ultima IX team which had been parachuted in six months before the launch to just get the world done already was pulled out just as abruptly as soon as it started accepting paying subscribers, leaving behind a crew of maintainers that was little bigger than the original team of ex-MUDders who had labored in obscurity for so long before catching the eye of EA’s management. The idea that maintaining a virtual world might require almost as much manpower and ongoing creative effort as making it in the first place was too high a mental hurdle for even otherwise clever folks like Neil Young and Chris Yates to clear at this point.

Overwhelmed as they were, the maintainers began to rely heavily on unpaid volunteers from the community of players to do much of the day-to-day work of administering the world, just as was the practice on MUDs. But Ultima Online ran on a vastly larger scale than even the most elaborate MUDs, making it hard to keep tabs on these volunteer overseers. While some were godsends, putting in hours of labor every week to make Britannia a better place for their fellow players, others were corrupted by their powers, manipulating the levers they had to hand to benefit their friends and punish their enemies. Then, too, the volunteer system was another legal quagmire, one that would doubtless have sent EA’s lawyers running screaming from the room if anyone had bothered to ask them about it before it was rolled out; sure enough, it would eventually lead to another lawsuit, this one more extended, serious, and damaging than the first.

In the meanwhile, though, most players did not rally behind the first lawsuit to anything like the degree that George Schultz might have been hoping. The fact was that even the ones who had vomited all over Lord British’s throne had done so because they loved their virtual Britannia and wanted to see it fixed rather than destroyed, as it would likely be if Schultz won the day. The suit concluded in a settlement at the end of 1998. The biggest concession on the part of the defendants was a rather weird one that gave no recompense to any individual inhabitant of virtual Britannia: EA agreed to donate $15,000 to the San Jose Tech Museum of Innovation. Perhaps Schultz thought that it would be able to innovate up a more reliable virtual world.

While many of the technical problems that beset Ultima Online were only to be expected in the context of the times, some of the other obstacles to enjoying the virtual world were more puzzling. First and foremost among these was the ever-present issue of players killing other players, which created so much frustration that George Schultz felt compelled to explicitly wall it off from the breach-of-trust claims that were the basis of his lawsuit: “We’re not getting into whether there should be player-killing.” Given that it had been such a constant theme of life (and death) in virtual Britannia going all the way back to the alpha-testing phase, one might have expected the MUDders to take more steps to address it before the launch. As it was, though, one senses that, having seen so many of their ideas about a virtual ecology and the like not survive contact with real players, having been forced to give up in so many ways on virtual Britannia as a truly self-sustaining, living world, they were determined to make this the scene of their last stand, the hill they would either hold or die defending.

Their great white hope was still the one that Richard Garriott had been voicing in interviews since well before the world’s commercial debut: that purely social pressures would act as a constraint on player-killing — that, in short, their world would learn to police itself. In fact, the presence of player-killing might act as a spur to civilization — for, as Raph Koster said, “cultures define and refine themselves through conflict.” They kept trying to implement systems that would nudge this particular culture in the right direction. They decided that, after committing murder five times, a player would be branded with literal scarlet letters: the color of his onscreen name would change from blue to red. Hopefully this would make him a pariah among his peers, while also making it very dangerous for him to enter a town, whose invulnerable computer-controlled guards would attack him on sight. The designers didn’t reckon with the fact that a virtual life is, no matter how much they might wish otherwise, simply not the same as a real life. Some percentage of players, presumably perfectly mild-mannered and law-abiding in the real world, reveled in the role of murderous outlaws online, taking the red letters of their name as a badge of honor rather than shame, the dangers of the cities as a challenge rather than a deterrent. To sneak past the city gates, creep up behind an unsuspecting newbie and stab her in the back, then get out of Dodge before the city watch appeared… now, that was good times. The most-wanted rolls posted outside the guard stations of Britannia became, says Raph Koster, “a high-score table for player killers.”
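The scarlet-letter mechanic described above can be illustrated with a toy sketch. Only the five-murder threshold and the blue-to-red name change come from the account here; the class, names, and structure below are my own illustrative assumptions, not Origin's actual code:

```python
# Toy sketch of a murder-count reputation flag, as described in the text.
# The five-kill threshold and blue/red name colors are from the account;
# everything else is invented for illustration.
class Player:
    MURDER_THRESHOLD = 5

    def __init__(self, name):
        self.name = name
        self.murder_count = 0

    @property
    def name_color(self):
        # Flagged murderers show red names; town guards attack them on sight.
        return "red" if self.murder_count >= self.MURDER_THRESHOLD else "blue"

    def commit_murder(self):
        self.murder_count += 1

pk = Player("Dread")
for _ in range(5):
    pk.commit_murder()
print(pk.name_color)  # a five-time murderer is branded red
```

As the article notes, the designers expected the red name to act as a deterrent; in practice it became a badge of honor.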

The MUDders’ stubborn inflexibility on this issue — an issue that was by all indications soon costing Ultima Online large numbers of customers — was made all the more inexplicable in the opinion of many players by the fact that it was, in marked contrast to so many of the other problems, almost trivial to address in programming terms. An “invulnerability” flag had long existed, to be applied not only to computer-controlled city guards but to special human-controlled personages such as Lord British to whom the normal laws of virtual time and space did not apply. All Origin had to do was add a few lines of code to automatically turn the flag on when a player walked into designated “safe” spaces. That way, you could have places where those who had signed up mostly in order to socialize could hang out without having to constantly look over their shoulders, along with other places where the hardcore pugilists could pummel one another to their heart’s content. Everyone would be catered to. Problem solved.
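Those hypothetical “few lines of code” might have looked something like the following. This is a sketch under stated assumptions — the rectangular zones, function names, and data shapes are all mine; only the pre-existing invulnerability flag and the idea of toggling it on entry to a safe space come from the text:

```python
# Hypothetical sketch of the fix described above: flip the same
# invulnerability flag already used for guards and for Lord British
# whenever a player enters or leaves a designated safe zone.
# All names and data shapes here are assumptions for illustration.
SAFE_ZONES = [
    # (x_min, y_min, x_max, y_max) rectangles, e.g. a town square
    (100, 100, 200, 200),
]

def in_safe_zone(x, y):
    return any(x0 <= x <= x1 and y0 <= y <= y1
               for (x0, y0, x1, y1) in SAFE_ZONES)

def on_player_moved(player, x, y):
    # Called by the movement code after each step.
    player["invulnerable"] = in_safe_zone(x, y)

player = {"invulnerable": False}
on_player_moved(player, 150, 150)
print(player["invulnerable"])  # True: safe inside the zone
on_player_moved(player, 50, 50)
print(player["invulnerable"])  # False: fair game outside it
```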

But Raph Koster and company refused to take this blindingly obvious step, having gotten it into their heads that to do so would be to betray their most cherished ideals. They kept tinkering around the edges of the problem, looking for a subtler solution that would preserve their world’s simulational autonomy. For example, they implemented a sort of karmic justice system, which dictated that players who had been evil during life would return from death only after losing a portion of their stats and skills. Inevitably, the player killers just took this as another challenge. Just don’t get killed, and you would never have to worry about it.
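A minimal sketch of such a karmic penalty might look like this. The text says only that evil characters lost “a portion” of their stats and skills on resurrection; the 10-percent figure and everything else below are my own assumptions:

```python
# Illustrative sketch of the karmic-justice penalty: evil characters are
# resurrected with a fraction of their stats and skills shaved off.
# The 10% penalty is an assumed figure; the source says only "a portion."
PENALTY = 0.10

def resurrect(stats, is_evil):
    if not is_evil:
        return dict(stats)  # the virtuous come back intact
    return {k: round(v * (1 - PENALTY)) for k, v in stats.items()}

print(resurrect({"strength": 100, "tailoring": 80}, is_evil=True))
# → {'strength': 90, 'tailoring': 72}
```

As the article observes, a penalty applied only on death is no penalty at all to a player killer who simply avoids dying.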

The end result was to leave the experience of tens of thousands of players in the unworthy hands of a relatively small minority of “griefers,” people who thrived on causing others pain and distress. Like all bullies, they preyed on the weak; their typical victims were the newbies, unschooled in the ways of defense, guiding characters with underwhelming statistics and no arms or armor to speak of. Such new players were, of course, the ones whose level of engagement with the game was most tentative, who were the most likely to just throw up their hands and go find something else to play after they’d been victimized once or twice, depriving Origin of potentially hundreds of dollars in future subscription revenue.

In light of this, it’s strange that no one from EA or Origin overrode the MUDders on this point. For his part, Richard Garriott was adamantly on their side, insisting that Ultima Online simply had to allow player-killing if it wasn’t to become a mockery of itself. It was up to the dissatisfied and victimized residents themselves to band together and turn Britannia into the type of world they wanted to live in; it wasn’t up to Origin to step in and fix their problems for them with a deus ex machina. “When we first launched Ultima Online, we set out to create a world that supported the evil player as a legitimate role,” said Garriott in his rather high-handed way. “Those who have truly learned the lessons of the [single-player] Ultima games should cease their complaining, rise to the challenge, and make Britannia into the place they want it to be.” He liked to tell a story on this subject. (Knowing Garriott’s penchant for embellishment, it probably didn’t happen, or at least didn’t happen quite like this. But that’s not relevant to its importance as allegory.)

One evening, he was wandering the streets of the capital in his Lord British persona, when he heard a woman screaming. Rushing over to her, he was told that a thief had stolen all of her possessions. His spirit of chivalry was awoken; he told her that he would get her things back for her. Together they tracked down the thief and cornered him in a back alley. Lord British demanded that the thief return the stolen goods, and the thief complied. They all went their separate ways. A moment later, the woman cried out again; the thief had done it again.

This time, Lord British froze the thief with a spell before he could leave the scene of the crime. “I told you not to do that,” he scolded. “What are you doing?”

“Sorry, I won’t do it again,” said the thief as he turned over the goods for a second time.

“If you do that again, I’m going to ban you from the game,” said Lord British.

You might be able to guess what happened next: the thief did it yet again. “I said I was going to ban you, and now I have to,” shouted Lord British, now well and truly incensed. “What’s wrong with you? I told you not to steal from this woman!”

The thief’s answer stopped Garriott in his tracks. “Listen. You created this world, and I’m a thief,” he said, breaking character for the first time. “I steal. That’s what I do. And now you’re going to ban me from the game for playing the role I’m supposed to play? I lied to you before because I’m a thief. The king caught me and told me not to steal. What am I going to do, tell you that as soon as you turn around I’m going to steal again? No! I’m going to lie.”

And Garriott realized that the thief was right. Garriott could do whatever he wished to him as Lord British, the reigning monarch of this world. But if he wished to stay true to all the things he had said in the past about what virtual Britannia was and ought to be, he couldn’t go outside the world to punish him as Richard Garriott, the god of the server looking down from on high.

Some of the questions with which Origin was wrestling resonate all too well today: questions involving the appropriate limits of online free speech — or rather free action, in this case. They are questions with which everyone who has ever opened an Internet discussion up to the public, myself included, has had to engage. When does strongly felt disagreement spill over into bad faith, counterpoint into disruption for the sake of it? And what should we do about it when it does? In Origin’s case, the pivotal philosophical question at hand was where the boundary lay between playing an evil character in good faith in a fantasy world and purposely, willfully trying to cause real pain to other real people sitting behind other real computers. Origin had chosen to embrace a position close to the ground staked out by our self-described “free-speech maximalists” of today. And like them, Origin was learning that the issue is more dangerously nuanced than they had wished to believe.

But there were other sorts of disconnect at play here as well. Garriott’s stern commandment that his world’s inhabitants should “cease their complaining, rise to the challenge, and make Britannia into the place they want it to be” becomes more than a bit rich when we remember that it was being directed toward Origin’s paying customers. Many of them might have replied that it was up to Origin rather than to them to make Britannia a place they wanted to be, lest they choose to spend their $10 per month on something else. The living-world dynamic held “as long as everyone is playing the same game,” wrote Amy Jo Kim in an article about Ultima Online and its increasingly vocalized discontents that appeared in Wired magazine in the spring of 1998. “But what happens when players who think they’re attending an online Renaissance Faire find themselves at the mercy of a violent, abusive gang of thugs? In today’s Britannia, it’s not uncommon to stumble across groups of evil players who talk like Snoop Doggy Dogg, dress like gangstas, and act like rampaging punks.” To be sure, some players were fully onboard with the “living-world” policy of (non-)administration. Others, however, had thought, reasonably enough given what they had read on the back of the game’s box, that they were just buying an entertainment product, a place to hang out in a few hours per day or week and have fun, chatting and exploring and killing monsters. They hadn’t signed up to organize police forces or lead political rallies. Nor had they signed up to be the guinea pigs in some highfalutin social experiment. No; they had signed up to play a game.

As it was, Ultima Online was all but impossible to play casually, thanks not only to the murderers skulking in its every nook and cranny but to core systems of the simulation itself. For example, if you saved up until you could afford to build yourself a nice little house, made it just like you wanted it, then failed to log on for a few days, when you did return you’d find that your home had disappeared, razed to make room for some other, more active player to build something. Systems like these pushed players to spend more time online as a prerequisite to having fun when they were there. Some left when the demands of the game conflicted with those of real life, which was certainly the wisest choice. But some others began to spend far more time in virtual Britannia than was really good for them, raising the specter of gaming addiction, a psychological and sociological problem that would only become more prevalent in the post-millennial age.

Origin estimated that the median hardcore player spent a stunning, if vaguely horrifying, total of six hours per day in the virtual world. And if the truth be told, many of the non-murderous things with which they were expected to fill those hours do seem kind of boring on the face of it. This is the flip side of making a virtual world that is more “realistic”: most people play games to escape from reality for a while, not to reenact it. With all due respect to our dedicated and talented real-world tailors and bakers, most people don’t dream of spending their free time doing such jobs online. Small wonder so many became player killers instead; at least doing that was exciting and, for some people at any rate, fun. From Amy Jo Kim’s article:

There’s no shortage of realism in this game — the trouble is, many of the nonviolent activities in Ultima Online are realistic to the point of numbingly lifelike boredom. If you choose to be a tailor, you can make a passable living at it, but only after untold hours of repetitive sewing. And there’s no moral incentive for choosing tailoring — or any honorable, upstanding vocation, for that matter. So why be a tailor? In fact, why not prey on the tailors?

True, Ultima Online is many things to many people. Habitués of online salons come looking for intellectual sparring and verbal repartee. Some other people log on in search of intimate but anonymous social relationships. Still others play the game with cunning yet also a discernible amount of self-restraint, getting rich while staying pretty honest. But there’s no avoiding where the real action is: an ever-growing number are playing Ultima Online to kill everything that moves.

All of this had an effect: all signs are that, after the first rush of sales and subscriptions, Ultima Online began to stagnate, mired in bad reviews, ongoing technical problems, and a growing disenchantment with the player-killing and the other barriers to casual fun. Raph Koster admits that “our subscriber numbers, while stratospheric for the day, weren’t keeping up” with sales of the boxed game, because “the losses [of frustrated newbies] were so high.”

Although Origin and EA never published official sales or subscriber numbers, I have found one useful data point from the early days of Ultima Online, in an internal Origin newsletter dated October 30, 1998. As of this date, just after its first anniversary, the game had 90,000 registered users, of whom approximately half logged on on any given day. These numbers are depicted in the article in question as very impressive, as indeed they were in comparison to the likes of Meridian 59 and The Realm. Still, a bit of context never hurts. Ultima Online had sold 100,000 boxed copies in its first three months, yet it didn’t have even that many subscribers after thirteen months, when its total boxed sales were rounding the 200,000 mark. The subscriber-retention rate, in other words, was not great; a lot of those purchased CDs had become coasters in fairly short order.

Nine shards were up in North America at this time, a number that had stayed the same since the previous December. And it’s this number that may be the most telling one of all. It’s true that, since demand was concentrated at certain times of day, Ultima Online was hosting just about all the players it could handle with its current server infrastructure as of October of 1998. But then again, this was by no means all the players it should be able to handle in the abstract: new shards were generally brought into being in response to increasing numbers of subscribers rather than vice versa. The fact that no new North American shards had been opened since December of 1997 becomes very interesting in this light.

I don’t want to overstate my case here: Ultima Online was extremely successful on its own, somewhat experimental terms. We just need to be sure that we understand what those terms were. By no means were its numbers up there with the industry’s biggest hits. As a point of comparison, let’s take Riven, the long-awaited sequel to the mega-hit adventure game Myst. It was released two months after Ultima Online and went on to sell 1 million units in its first year — at least five times the number of boxed entrées to Origin’s virtual world over the same time period, despite being in a genre that was in marked decline in commercial terms. Another, arguably more pertinent point of comparison is Age of Empires, a new entry in the red-hot real-time-strategy genre. Released just one month after Ultima Online, it outsold Origin’s virtual world by more than ten to one over its first year. Judged as a boxed retail game, Ultima Online was a middling performer at best.

Of course, Ultima Online was not just another boxed retail game; the unique thing about it was that each of the 90,000 subscribers it had retained was paying $10 every month, yielding a steady revenue of almost $11 million per year, with none of it having to be shared with any distributor or retailer. That was really, really nice — nice enough to keep Origin’s head above water at a time when the studio didn’t have a whole lot else to point to by way of justifying its ongoing existence to EA. And yet the reality remained that Ultima Online was a niche obsession rather than a mass-market sensation. As so often happens in life, taking the next step forward in commercial terms, not to mention fending off the competition that was soon to appear with budgets and publisher support of which Meridian 59 and The Realm couldn’t have dreamed, would require a degree of compromise with its founding ideals.

Be that as it may, however, one thing at least was now clear: there was real money to be made in the MMORPG space. Shared virtual worlds would soon learn to prioritize entertainment over experimentation. Going forward, there would be less talk about virtual ecologies and societies, and more focus on delivering slickly packaged fun, of the sort that would keep all kinds of players coming back for more — and, most importantly of all, get those subscriber counts rising once more.

I’ll continue to follow the evolution of PMOG, MMORPGs, and Ultima Online in future articles, and maybe see if I can’t invent some more confusing acronyms while I’m at it. But not right away… other subjects beg for attention in the more immediate future.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Braving Britannia: Tales of Life, Love, and Adventure in Ultima Online by Wes Locher, Postmortems: Selected Essays, Volume One by Raph Koster, Online Game Pioneers at Work by Morgan Ramsay, Through the Moongate, Part II by Andrea Contato, Explore/Create by Richard Garriott, MMOs from the Inside Out by Richard Bartle, and Dungeons and Dreamers by Brad King and John Borland. Origin Systems’s internal newsletter Point of Origin of February 20, 1998 and October 30, 1998; Computer Gaming World of February 1998 and November 1998; New York Times of October 20, 1997; Wired of May 1998.

Web sources include a 2018 Game Developers Conference talk by some of the Ultima Online principals, an Ultima Online timeline at UOGuide, and GameSpot‘s vintage reviews of Ultima Online and its first expansion, The Second Age. On the subject of SubSpace, we have histories by Rod Humble and Epinephrine, another vintage GameSpot review, and a Vice article by Emanuel Maiberg.

 
