
Blade Runner

Blade Runner has set me thinking about the notion of a “critical consensus.” Why should we have such a thing at all, and why should it change over time?

Ridley Scott’s 1982 film Blade Runner is an adaptation of Philip K. Dick’s 1968 novel Do Androids Dream of Electric Sheep?, about a police officer cum bounty hunter — a “blade runner” in street slang — of a dystopian near-future whose job is to “retire” android “replicants” of humans whose existence on Earth is illegal. The movie had a famously troubled gestation, full of time and budget overruns, disputes between Scott and his investors, and an equally contentious relationship between the director and his leading man, Harrison Ford. When it was finally finished, the first test audiences were decidedly underwhelmed, such that Scott’s backers demanded that the film be recut, with the addition of a slightly hammy expository voice-over and a cheesy happy-ending epilogue which was cobbled together quickly using leftover footage from, of all movies, Stanley Kubrick’s The Shining.

It didn’t seem to help. The critical consensus on the released version ranged over a continuum from ambivalence to outright hostility. Roger Ebert’s faint praise was typically damning: “I was never really interested in the characters in Blade Runner. I didn’t find them convincing. What impressed me in the film was the special effects, the wonderful use of optical trickery to show me a gigantic imaginary Los Angeles, which in the vision of this movie has been turned into sort of a futuristic Tokyo. It’s a great movie to look at, but a hard one to care about. I didn’t appreciate the predictable story, the standard characters, the cliffhanging clichés… but I do think the special effects make Blade Runner worth going to see.” Pauline Kael was less forgiving of what she saw as a cold, formless, ultimately pointless movie: “If anybody comes around with a test to detect humanoids, maybe Ridley Scott and his associates should hide. With all the smoke in this movie, you feel as if everyone connected with it needs to have his flue cleaned.” Audiences do not always follow the critics’ lead, but in this case they largely did. During its initial theatrical run, Blade Runner fell well short of earning back the $30 million it had cost to make.

Yet remarkably soon after it had disappeared from theaters, its rehabilitation got underway in fannish circles. In 1984, William Gibson published his novel Neuromancer, the urtext of a new “cyberpunk” movement in science fiction that began in printed prose but quickly spiraled out from there into comics, television, and games. Whereas Blade Runner‘s dystopic Los Angeles looked more like Tokyo than any contemporary American city, Gibson’s book actually began in Japan, before moving on to a similarly over-urbanized United States. The two works’ neon-soaked nighttime cityscapes were very much of a piece. The difference was that Gibson added to the equation a computer-enabled escape from reality known as cyberspace, creating a combination that would prove almost irresistibly alluring to science-fiction fans as the computer age around them continued to evolve apace.

Blade Runner‘s rehabilitation spread to the mainstream in 1992, when a “director’s cut” of the film was re-released in theaters, lacking the Captain Obvious voice-over and the tacked-on happy ending but sporting a handful of new scenes that added fresh layers of nuance to the story. Critics — many of them the very same critics who had dismissed the movie a decade earlier — now rushed to praise it as a singular cinematic vision and a science-fiction masterpiece. They found many reasons for its box-office failure on the first go-round, even beyond the infelicitous changes that Ridley Scott had been forced by his backers to make to it. For one thing, it had been unlucky enough to come out just two weeks after E.T. the Extra-Terrestrial, the biggest box-office smash of all time to that point, whose long shadow was as foreboding and unforgiving a place to dwell as any of Blade Runner‘s own urban landscapes. Then, too, the audience was conditioned back then to see Harrison Ford as Han Solo or Indiana Jones — a charming rogue with a heart of gold, not the brooding, morally tormented cop Rick Deckard, who has a penchant for rough sex and a habit of shooting women in the back. In light of all this, surely the critics too could be forgiven for failing to see the film’s genius the first time they were given the chance.

Whether we wish to forgive them or not, I find it fascinating that a single film could generate such polarized reactions only ten years apart in time from people who study the medium for a living. The obvious riposte to my sense of wonder is, of course, that the Blade Runner of 1992 really wasn’t the same film at all as the one that had been seen in 1982. Yet I must confess to considerable skepticism about this as a be-all, end-all explanation. It seems to me that, for all that the voice-over and forced happy ending did the movie as a whole no favors, they were still a long way from destroying the qualities that made Blade Runner distinct.

Some of my skepticism may arise from the fact that I’m just not onboard with the most vaunted aspect of the director’s cut, its subtle but undeniable insinuation that Deckard is himself a replicant with implanted memories, no different from the androids he hunts down and kills. This was not the case in Philip K. Dick’s novel, nor was it the original intention of the film’s scriptwriters. I rather suspect, although I certainly cannot prove it, that even Ridley Scott’s opinion on the subject was more equivocal during the making of the film than it has since become. David Peoples, one of the screenwriters, attributes the genesis of the idea in Scott’s mind to an overly literal reading on his part of a philosophical meditation on free will and the nature of human existence in an early draft of the script. Peoples:

I invented a kind of contemplative voice-over for Deckard. Here, let me read it to you:

“I wondered who designs the ones like me and what choices we really have, and which ones we just think we have. I wondered which of my memories were real and which belonged to someone else. The great Tyrell [the genius inventor and business magnate whose company made the replicants] hadn’t designed me, but whoever had hadn’t done so much better. In my own modest way, I was a combat model.”

Now, what I’d intended with this voice-over was mostly metaphysical. Deckard was supposed to be philosophically questioning himself about what it was that made him so different from Rachael [a replicant with whom he falls in love or lust] and the other replicants. He was supposed to be realizing that, on the human level, they weren’t so different. That Deckard wanted the same things the replicants did. The “maker” he was referring to wasn’t Tyrell. It was supposed to be God. So, basically, Deckard was just musing about what it meant to be human.

But then, Ridley… well, I think Ridley misinterpreted me. Because right about this period of time, he started announcing, “Ah-ha! Deckard’s a replicant! What brilliance!” I was sort of confused by this response, because Ridley kept giving me all this praise and credit for this terrific idea. It wasn’t until many years later, when I happened to be browsing through this draft, that I suddenly realized the metaphysical material I had written could just as easily have been read to imply that Deckard was a replicant, even though it wasn’t what I meant at all. What I had meant was, we all have a maker, and we all have an incept date [a replicant’s equivalent to a date of birth]. We just can’t address them. That’s one of the similarities we had to the replicants. We couldn’t go find Tyrell, but Tyrell was up there somewhere. For all of us.

So, what I had intended as kind of a metaphysical speculation, Ridley had read differently, but now I realize there was nothing wrong with this reading. That confusion was my own fault. I’d written this voice-over so ambiguously that it could indeed have meant exactly what Ridley took it to mean. And that, I think, is how the whole idea of Deckard being a replicant came about.

The problem I have with Deckard being a replicant is that it undercuts the thematic resonance of the story. In the book and the movie, the quality of empathy, or a lack thereof, is described as the one foolproof way to distinguish real from synthetic humans. To establish which is which, blade runners like Deckard use something called the Voight-Kampff test, in which suspects are hooked up to a polygraph-like machine which measures their emotional response to shockingly transgressive statements, starting with stuff like “my briefcase is made out of supple human-baby skin” and getting steadily worse from there. Real humans recoil, intuitively and immediately. Replicants can try to fake the appropriate emotional reaction — might even be programmed to fake it to themselves, such that even they don’t realize what they are — but there is always a split-second delay, which the trained operator can detect.

The central irony of the film is that cops like Deckard are indoctrinated to have absolutely no empathy for the replicants they track down and murder, even as many of the replicants we meet evince every sign of genuinely caring for one another, leading one to suspect that the Voight-Kampff test may not be measuring pure, unadulterated empathy in quite the way everyone seems to think it is. The important transformation that Deckard undergoes, which eventually brings his whole world down around his head, is that of allowing himself to feel the pain and fear of those he hunts. He is a human who rediscovers and re-embraces his own humanity, who finally begins to understand that meting out suffering and death to other feeling creatures is no way to live, no matter how many layers of justification and dogma his actions are couched within.

But in Ridley Scott’s preferred version of the film, the central theme falls apart, to be replaced with psychological horror’s equivalent of a jump scare: “Deckard himself is really a replicant, dude! What a mind fuck, huh?” For this reason, it’s hard for me to see the director’s cut as a holistically better movie than the 1982 cut, which at least leaves some more room for debate about the issue.

This may explain why I’m lukewarm about Blade Runner as a whole, why none of the cuts — and there have been a lot of them by now — quite works for me. As often happens in cases like this one, I find that my own verdict on Blade Runner comes down somewhere between the extremes of then and now. There’s a lot about Roger Ebert’s first hot-take that still rings true to me all these years later. It’s a stunning film in terms of atmosphere and audiovisual composition; I defy anyone to name a movie with a more breathtaking opening shot than the panorama of nighttime Tokyo… er, Los Angeles that opens this one. Yet it’s also a distant and distancing, emotionally displaced film that aspires to a profundity it doesn’t completely earn. I admire many aspects of its craft enormously and would definitely never discourage anyone from seeing it, but I just can’t bring myself to love it as much as so many others do.

The opening shot of Blade Runner the movie.

These opinions of mine will be worth keeping in mind as we move on now to the 1997 computer-game adaptation of Blade Runner. For, much more so than is the case even with most licensed games, your reaction to this game may be difficult to separate from your reaction to the movie.


Thanks to the complicated, discordant circumstances of its birth, Blade Runner had an inordinate number of vested interests even by Hollywood standards, such that a holding company known as The Blade Runner Partnership was formed just to administer them. When said company started to shop the property around to game publishers circa 1994, the first question on everyone’s lips was what had taken them so long. The film’s moody, neon-soaked aesthetic, if not its name, had been seen in games for years by that point, so much so that it had already become something of a cliché. Just among the games I’ve written about on this site, Rise of the Dragon, Syndicate, System Shock, Beneath a Steel Sky, and the Tex Murphy series all spring to mind as owing more than a small debt to the movie. And there are many, many more that I haven’t written about.

Final Fantasy VII is another on the long list of 1990s games that owes more than a little something to Blade Runner. It’s hard to imagine its perpetually dark, polluted, neon-soaked city of Midgar ever coming to exist without the example of Blade Runner’s Los Angeles. Count it as just one more way in which this Japanese game absorbed Western cultural influences and then reflected them back to their point of origin, much as the Beatles once put their own spin on American rock and roll and sold it back to the country of its birth.

Meanwhile the movie itself was still only a cult classic in the 1990s; far more gamers could recognize and enjoy the gritty-cool Blade Runner aesthetic than had actually seen its wellspring. Blade Runner was more of a state of mind than it was a coherent fictional universe in the way of other gaming perennials like Star Trek and Star Wars. Many a publisher therefore concluded that they could have all the Blade Runner they needed without bothering to pay for the name.

Thus the rights holders worked their way down through the hierarchy of publishers, beginning with the prestigious heavy hitters like Electronic Arts and Sierra and continuing into the ranks of the mid-tier imprints, all without landing a deal. Finally, they found an interested would-be partner in the financially troubled Virgin Interactive.

The one shining jewel in Virgin’s otherwise tarnished crown was Westwood Studios, the pioneer of the real-time-strategy genre that was on the verge of becoming one of the two hottest in all of gaming. And one of the founders of Westwood was a fellow named Louis Castle, who listed Blade Runner as his favorite movie of all time. His fandom was such that Westwood probably did more than they really needed to in order to get the deal. Over a single long weekend, the studio’s entire art department pitched in to meticulously recreate the movie’s bravura opening shots of dystopic Los Angeles. It did the trick; the Blade Runner contract was soon given to Virgin and Westwood. It also established, for better or for worse, the project’s modus operandi going forward: a slavish devotion not just to the film’s overall aesthetic but to the granular details of its shots and sets.

The opening shot of Blade Runner the game.

Thanks to the complicated tangle of legal rights surrounding the film, Westwood wasn’t given access to any of its tangible audiovisual assets. Undaunted, they endeavored to recreate almost all of them on the monitor screen for themselves by using pre-rendered 3D backgrounds combined with innovative real-time lighting effects; these were key to depicting the flashing neon and drifting rain and smoke that mark the film. The foreground actors were built from motion-captured human models, then depicted onscreen using voxels, collections of tiny cubes in a 3D space, essentially pixels with an added Z-dimension of depth.
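In case the idea of a voxel is hard to picture, here’s a minimal sketch in Python of the general concept — a toy model and a crude front-on projection of my own invention, emphatically not Westwood’s actual engine:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Voxel:
        x: int
        y: int
        z: int              # depth: smaller z = closer to the "camera"
        color: tuple        # an (r, g, b) triple

    def render_front_view(model, width, height, background=(0, 0, 0)):
        # Flatten a list of Voxels into a 2D pixel buffer: for each screen
        # position, keep the color of the nearest voxel (the smallest z).
        pixels = [[background] * width for _ in range(height)]
        depth = [[float("inf")] * width for _ in range(height)]
        for v in model:
            if 0 <= v.x < width and 0 <= v.y < height and v.z < depth[v.y][v.x]:
                depth[v.y][v.x] = v.z
                pixels[v.y][v.x] = v.color
        return pixels

    # A two-voxel "model": a red cube sitting in front of a blue one.
    frame = render_front_view([Voxel(5, 5, 2, (0, 0, 255)), Voxel(5, 5, 1, (255, 0, 0))], 10, 10)
    assert frame[5][5] == (255, 0, 0)   # the nearer, red voxel wins

Westwood’s renderer layered real-time lighting and motion-captured animation on top of something far cleverer than this, but the kernel of the idea is the same: a sprite whose every element knows how far away it is.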

At least half of what you see in the Blade Runner game is lifted straight from the movie, which Westwood pored over literally frame by frame in order to include even the tiniest details, the sorts of things that no ordinary moviegoer would ever notice. The Westwood crew took a trip from their Las Vegas offices to Los Angeles to measure and photograph the locations where the film had been shot, the better to get it all exactly correct. Even the icy, synth-driven soundtrack for the movie was deconstructed, analyzed, and then mimicked in the game, note by ominous note.

The two biggest names associated with the film, Ridley Scott and Harrison Ford, were way too big to bother with a project like this one, but a surprising number of the other actors agreed to voice their parts and to allow themselves to be digitized and motion-captured. Among them were Sean Young, who had played Deckard’s replicant love interest Rachael; Edward James Olmos, who had played his enigmatic pseudo-partner Gaff; and Joe Turkel, who had played Eldon Tyrell, the twisted genius who invented the replicants. Set designers and other behind-the-scenes personnel were consulted as well.

It wasn’t judged practical to clone the movie’s plot in the same way as its sights and sounds, if for no other reason than the absence of Harrison Ford; casting someone new in the role of Deckard would have been, one senses, more variance than Westwood’s dedication to re-creation would have allowed. Instead they came up with a new story that could play out in the seams of the old one, happening concurrently with the events of the film, in many of the same locations and involving many of the same characters. Needless to say, its thematic concerns too would be the same as those of the film — and, yes, its protagonist cop as well would eventually be given reason to doubt his own humanity. His name was McCoy, another jaded gumshoe transplanted from a Raymond Chandler novel into an equally noirish future. But was he a “real” McCoy?

Westwood promised great things in the press while Blade Runner was in development: a truly open-world game taking place in a living, breathing city, full of characters that went about their own lives and pursued their own agendas, whose response to you in the here and now would depend to a large degree on how you had treated them and their acquaintances and enemies in the past. There would be no fiddly puzzles for the sake of them; this game would expect you to think and act like a real detective, not as the typical adventure-game hero with an inventory full of bizarre objects waiting to be put to use in equally bizarre ways. To keep you on your toes and add replay value — the lack of which was always the adventure genre’s Achilles heel as a commercial proposition — the guilty parties in the case would be randomly determined, so that no two playthroughs would ever be the same. And there would be action elements too; you would have to be ready to draw your gun at almost any moment. “There’s actually very little action in the film,” said Castle years later, “but when it happens, it’s violent, explosive, and deadly. I wanted to make a game where the uncertainty of what’s going to happen makes you quiver with anticipation every time you click the mouse.”

As we’ll soon see, most of those promises would be fulfilled only partially, but that didn’t keep Blade Runner from becoming a time-consuming, expensive project by the standards of its era, taking two years to make and costing about $2 million. It was one of the last times that a major, mainstream American studio swung for the fences with an adventure game, a genre that was soon to be relegated to niche status, with budgets and sales expectations to match.

In fact, Blade Runner’s commercial performance was among the reasons that down-scaling took place. Despite a big advertising push on Virgin Interactive’s part, it got lost in the shuffle among The Curse of Monkey Island, Riven, and Zork: Grand Inquisitor, three other swansongs of the AAA adventure game that all competed for a dwindling market share during the same holiday season of 1997. Reviews were mixed, often expressing a feeling I can’t help but share: what was ultimately the point of so slavishly re-creating another work of art if you weren’t going to add much of anything of your own to it? “The perennial Blade Runner images are here, including the winking woman in the Coca-Cola billboard and vehicles flying over the flaming smokestacks of the industrial outskirts,” wrote GameSpot. “Unfortunately, most of what’s interesting about the game is exactly what was interesting about the film, and not much was done to extend the concepts or explore them any further.” Computer and Video Games magazine aptly called it “more of a companion to the movie than a game.” Most gamers shrugged and moved on to the next title on the shelf; Blade Runner sold just 15,000 copies in the month of its release.[1]

As the years went by, however, a funny thing happened. Blade Runner never faded completely from the collective gamer consciousness like so many other middling efforts did. It continued to be brought up in various corners of the Internet, became a fixture of an “abandonware” scene whose rise preceded that of back-catalog storefronts like GOG.com, became the subject of retrospectives and think pieces on major gaming sites. Finally, in spite of the complications of its licensing deal, it went up for sale on GOG.com in 2019. Then, in 2022, Night Dive Studios released an “enhanced” edition. It seems safe to say today that many more people have played Westwood’s Blade Runner since the millennium than did so before it. The critical consensus surrounding it has shifted as well. As of this writing, Blade Runner is rated by the users of MobyGames as the 51st best adventure game of all time — a ranking that doesn’t sound so impressive at first, until you realize that it’s slightly ahead of such beloved icons of the genre as LucasArts’s Monkey Island 2 and Indiana Jones and the Fate of Atlantis.[2] I trust that I need not belabor the parallels with the reception history of Ridley Scott’s movie. In this respect as well as so many others, the film and the game seem joined at the hip. And the latter wouldn’t have it any other way.


In all my years of writing these histories, I’m not sure I’ve ever come across a game that combines extremes of derivation and innovation in quite the way of Westwood’s Blade Runner. While there is nary an original idea to be found in the fiction, the gameplay has if anything too many of them.

I’ve complained frequently in the past that most alleged mystery games aren’t what they claim to be at all, that they actually solve the mystery for you while you occupy your time with irrelevant lock-and-key puzzles and the like. Louis Castle and his colleagues at Westwood clearly had the same complaints; there are none of those irrelevancies here. Blade Runner really does let you piece together its clues for yourself. You feel like a real cop — or at least a television one — when you, say, pick out the license plate of a car on security-camera footage, then check the number in the database of the near-future’s equivalent to the Department of Motor Vehicles to get a lead. Even as it’s rewarding, the game is also surprisingly forgiving in its investigative aspects, not an adjective that’s frequently applied to adventures of this period. There are a lot of leads to follow, and you don’t need to notice and run down all of them to make progress in your investigation. At its best, then, this game makes you feel smart — one of the main reasons a lot of us play games, if we’re being honest.

Those problems that do exist here arise not from the developers failing to do enough, but rather from their trying to do too much. There’s an impossibly baroque “clues database” that purports to aid you in tying everything together. This experiment in associative, cross-referenced information theory would leave even Ted Nelson scratching his head in befuddlement. Thankfully, it isn’t really necessary to engage with it at all. You can keep the relevant details in your head, or at worst in your trusty real-world notepad, easily enough.

If you can make any sense of this, you’re a better detective than I am.

Features like this one seem to be artifacts of that earlier, even more conceptually ambitious incarnation of Blade Runner that was promoted in the press while the game was still being made.[3] As I noted earlier, this was to have been a game that you could play again and again, with the innocent and guilty parties behind the crime you investigated being different each time. It appears that, under the pressure of time, money, and logistics, that concept got boiled down to randomizing which of the other characters are replicants and which are “real” humans, but not changing their roles in the story in response to their status in any but some fairly cosmetic ways. Then, too, the other characters were supposed to have had a great deal of autonomy, but, again, the finished product doesn’t live up to this billing. In practice, what’s left of this aspiration is more of an annoyance than anything else. While the other characters do indeed move around, they do so more like subway trains on a rigid schedule than independent human actors. When the person you need to speak to isn’t where you go to speak to him, all you can do is go away and return later. This leads to tedious rounds of visiting the same locations again and again, hoping someone new will turn up to jog the plot forward. While this may not be all that far removed from the nature of much real police work, it’s more realism than I for one need.

This was also to have been an adventure game that you could reasonably play without relying on saving and restoring, taking your lumps and rolling with the flow. Early on, the game just about lives up to this ideal. At one point, you chase a suspect into a dark alleyway where a homeless guy happens to be rooting through a dumpster. It’s damnably easy in the heat of the moment to shoot the wrong person. If you do so — thus committing a crime that counts as murder, unlike the “retiring” of a replicant — you have the chance to hide the body and continue on your way; life on the mean streets of Los Angeles is a dirty business, regardless of the time period. Even more impressively, you might stumble upon your victim’s body again much later in the game, popping up out of the murk like an apparition from your haunted conscience. If you didn’t kill the hobo, on the other hand, you might meet him again alive.

But sadly, a lot of this sort of thing as well falls away as the game goes on. The second half is rife with learning-by-death moments that would have done the Sierra of the 1980s proud, all people and creatures jumping out of the shadows and killing you without warning. Hope you have a save file handy, says the game. The joke’s on you!

By halfway through, the game has just about exhausted the movie’s iconic set-pieces and is forced to lean more on its own invention, much though this runs against its core conviction that imitation trumps originality. Perhaps that conviction was justified after all: the results aren’t especially inspiring. What we see are mostly generic sewers, combined with characters who wouldn’t play well in the dodgiest sitcom. The pair of bickering conjoined twins — one smart and urbane, the other crude and rude — is particularly cringe-worthy.

Writers and other artists often talk about the need to “kill your darlings”: to cut out those scenes and phrases and bits and bobs that don’t serve the art, that only serve to gratify the vanity of the artist. This game is full of little darlings that should have died well before it saw release. Some of them are flat-out strange. For example, if you like, you can pre-pick a personality for McCoy: Polite, Normal, (don’t call me) Surly, or Erratic. Doing so removes the conversation menu from the interface; walk up to someone and click on her, and McCoy just goes off on his own tangent. I don’t know why anyone would ever choose to do this, unless it be to enjoy the coprolalia of Erratic McCoy, who jumps from Sheriff Andy Taylor to Dirty Harry and back again at a whipsaw pace, leaving everyone on the scene flummoxed.

Even when he’s ostensibly under your complete control, Detective McCoy isn’t the nimblest cowboy at the intellectual rodeo. Much of the back half of the game degenerates into trying to figure out how and when to intervene to keep him from doing something colossally stupid. When a mobster you’ve almost nailed hands him a drink, you’re reduced to begging him silently: Please, please, do not drink it, McCoy! And of course he does so, and of course it’s yet another Game Over. (After watching the poor trusting schmuck screw up this way several times, you might finally figure out that you have about a two-second window of control to make him draw his gun on the other guy — no other action will do — before he scarfs down the spiked cocktail.)

Bottoms up! (…sigh…)

All my other complaints aside, though, for me this game’s worst failing remains its complete lack of interest in standing on its own as either a piece of fiction or as an aesthetic statement of any stripe. There’s an embarrassingly mawkish, subservient quality that dogs it even as it’s constantly trying to be all cool and foreboding, with all its darkness and its smoke. Its brand of devotion is an aspect of fan culture that I just don’t get.

So, I’m left sitting here contemplating an argument that I don’t think I’ve ever had to make before in the context of game development: that you can actually love something too much to be able to make a good game out of it, that your fandom can blind you as surely as the trees of any forest. This game is doomed, seemingly by design, to play a distant second fiddle to its parent. You can almost hear the chants of “We’re not worthy!” in the background. When you visit Tyrell in his office, you know it can have no real consequences for your story because the resolution of that tycoon’s fate has been reserved for the cinematic story that stars Deckard; ditto your interactions with Rachael and Gaff and others. They exist here at all, one can’t help but sense, only because the developers were so excited at the prospect of having real live Blade Runner actors visit them in their studio that they just couldn’t help themselves. (“We’re not worthy!”) For the player who doesn’t live and breathe the lore of Blade Runner like the developers do, they’re living non sequiturs who have nothing to do with anything else that’s going on.

Even the endings here — there are about half a dozen major branches, not counting the ones where McCoy gets shot or stabbed or roofied midway through the proceedings — are sometimes in-jokes for the fans. One of them is a callback to the much-loathed original ending of the film — a callback that finds a way to be in much worse taste than its inspiration: McCoy can run away with one of his suspects, who happens to be a fourteen-year-old girl who’s already been the victim of adult molestation. Eww!

What part of “fourteen years old and already sexually traumatized” do you not understand, McCoy?

Even the options menu of this game has an in-joke that only fans will get. If you like, you can activate a “designer cut” here that eliminates all of McCoy’s explanatory voice-overs, a callback to the way that Ridley Scott’s director’s cut did away with the ones in the film. The only problem is that in this medium those voice-overs are essential for you to have any clue whatsoever what’s going on. Oh, well… the Blade Runner fans have been served, which is apparently the important thing.

I want to state clearly here that my objections to this game aren’t abstract objections to writing for licensed worlds or otherwise building upon the creativity of others. It’s possible to do great work in such conditions; the article I published just before this one praised The Curse of Monkey Island to the skies for its wit and whimsy, despite that game making absolutely no effort to bust out of the framework set up by The Secret of Monkey Island. In fact, The Curse of Monkey Island too is bursting at the seams with in-jokes and fan service. But it shows how to do those things right: by weaving them into a broader whole such that they’re a bonus for the people who get them but never distract from the experience of the people who don’t. That game illustrates wonderfully how one can simultaneously delight hardcore fans of a property and welcome newcomers into the fold, how a game can be both a sequel and fully-realized in an Aristotelian sense. I’m afraid that this game is an equally definitive illustration of how to do fan service badly, such that it comes across as simultaneously elitist and creatively bankrupt.

Westwood always prided themselves on their technical excellence, and this is indeed a technically impressive game in many respects. But impressive technology is worth little on its own. If you’re a rabid fan of the movie in the way that I am not, I suppose you might be excited to live inside it here and see all those iconic sets from slightly different angles. If you aren’t, though, it’s hard to know what this game is good for. In its case, I think that the first critical consensus had it just about right.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: The book Future Noir: The Making of Blade Runner by Paul M. Sammon; Computer and Video Games of January 1998; PC Zone of May 1999; Next Generation of July 1997; Computer Gaming World of March 1998; Wall Street Journal of January 21 1998; New Yorker of July 1982; Retro Gamer 142.

Online sources include Ars Technica’s interview with Louis Castle, Game Developer‘s interview with Castle, Edge‘s feature on the making of the game, the original Siskel and Ebert review of the movie, an unsourced but apparently authentic interview with Philip K. Dick, and GameSpot’s vintage Blade Runner review.

Blade Runner is available for digital purchase at GOG.com, in both its original edition that I played for this article and the poorly received enhanced edition. Note that the latter actually includes the original game as well as of this writing, and is often cheaper than buying the original alone…

Footnotes
1 Louis Castle has often claimed in later decades that Blade Runner did well commercially, stating at least once that it sold 1 million copies(!). I can’t see how this could possibly have been the case; I’ve learned pretty well over my years of researching these histories what a million-selling game looked like in the 1990s, and can say very confidently that it did not look like this one. Having said that, though, let me also say that I don’t blame him for inflating the figures. It’s not easy to pour your heart and soul into something and not have it do well. So, as the press of real data and events fades into the past, the numbers start to go up. This doesn’t make Castle dishonest so much as it just makes him human.
2 This chart in general is distorted greatly by the factor of novelty; many or most of the highest-ranking games are very recent ones, rated in the first blush of excitement following their release.
3 Louis Castle’s own testimony contradicts this notion as well. He has stated in various interviews that “Blade Runner is as close as I have ever come to realizing a design document verbatim.” I don’t wish to discount his words out of hand, but boy, does this game ever strike me, based on pretty long experience in studying these things, as being full of phantom limbs that never got fully wired into the greater whole. I decided in the end that I had to call it like I see it in this article.
 


The Rise of POMG, Part 4: A World for the Taking

Just as the Ultima Online beta test was beginning, Electronic Arts was initiating the final phase of its slow-motion takeover of Origin Systems. In June of 1997, the mother ship in California sent down two Vice Presidents to take over completely in Texas, integrate Origin well and truly into the EA machine, and end once and for all any semblance of independence for the studio. Neil Young became Origin’s new General Manager on behalf of EA, while Chris Yates became Chief Technical Officer. Both men were industry veterans.

Appropriately enough given that he was about to become the last word on virtual Britannia, Neil Young was himself British. He attributes his career choice to the infamously awful English weather. “There are a lot of people in the games industry that come from the UK,” he says. “I think it’s because the weather is so bad that you don’t have a lot to do, so you either go into a band or teach yourself to program.” He chose the latter course at a time when computer games in Britain were still being sold on cassette tape for a couple of quid. After deciding to forgo university in favor of a programming job at a tiny studio called Imagitec Design in 1988, he “quickly realized there were more gifted engineers,” as he puts it, and “moved into producing.” Having made a name for himself in that role, he was lured to the United States by Virgin Interactive in 1992, then moved on to EA five years later, which organization had hand-picked him for the task of whipping its sometimes wayward and lackadaisical stepchild Origin into fighting shape.

Chris Yates had grown up amidst the opposite of English rain, hailing as he did from the desert gambler’s paradise Las Vegas. He was hired by the hometown studio Westwood Associates in 1988, where he worked as a programmer on games like Eye of the Beholder, Dune II, and Lands of Lore. In 1994, two years after Virgin acquired Westwood, he moved to Los Angeles to join the parent company. There he and Young became close friends as well as colleagues, such that they chose to go to EA together as a unit.

The two were so attractive to EA thanks not least to an unusual project which had occupied some of their time during their last year and a half or so at Virgin. Inspired by Air Warrior, the pioneering massively-multiplayer online flight simulator that had been running on the GEnie commercial online service since the late 1980s, a Virgin programmer named Rod Humble proposed in 1995 that his company invest in something similar, but also a bit simpler and more accessible: a massively-multiplayer version of Asteroids, the 1979 arcade classic whose roots stretched all the way back to Spacewar!, that urtext of videogaming. Neil Young and his friend Chris Yates went to bat for the project: Young making the business case for it as an important experiment that could lead to big windfalls later on, Yates pitching in to offer his exceptional technical expertise whenever necessary. Humble and a colleague named Jeff Paterson completed an alpha version of the game they called SubSpace in time to put it up on the Internet for an invitation-only testing round in December of 1995. Three months later, the server was opened to anyone who cared to download the client — still officially described as a beta version — and have at it.

SubSpace was obviously a very different proposition from the likes of Ultima Online, but it fits in perfectly with this series’s broader interest in persistent online multiplayer gaming (or POMG as I’ve perhaps not so helpfully shortened it). For, make no mistake, the quality of persistence was as key to its appeal as it was to that of such earlier featured players in this series as Kali or Battle.net. SubSpace spawned squads and leagues and zones; it became an entire subculture unto itself, one that lived in and around the actual battles in space. The distinction between it and the games of Kali and Battle.net was that SubSpace was massively — or at least bigly — multiplayer. Whereas an online Diablo session was limited to four participants, SubSpace supported battles involving up to 250 players, sometimes indulging in crazy free-for-alls, more often sorted into two or more teams, each of them flying and fighting in close coordination. It thus quickly transcended Asteroids in its tactical dimensions as well as its social aspects — transcended even other deceptively complex games with the same roots, such as Toys for Bob’s cult classic Star Control.

It remained that way because Virgin didn’t really know what else to do with it. When the few months that had been allocated to the beta test were about to run out, the fans raised such a hue and cry that Virgin gave in and left it up. And so the alleged beta test continued for more than a year, the happy beneficiary of corporate indecision. In one of his last acts before leaving Virgin, Neil Young managed to broker a sponsorship deal with Pepsi Cola, which gave SubSpace some actual advertising and another lease on life as a free-to-play game. During that memorable summer of the Ultima Online beta test, SubSpace was enjoying what one fan history calls its “greatest days” of all: “The population tripled in three months, and now there were easily 1500-plus people playing during peak times.”

With the Pepsi deal about to run out, Virgin finally took SubSpace fully commercial in October of 1997, again just as Ultima Online was doing the same. Alas, it didn’t go so well for SubSpace. Virgin released it as a boxed retail game, with the promise that, once customers had plunked down the cash to buy it, access would be free in perpetuity. This didn’t prevent half or more of the existing user base from leaving the community, even as nowhere near enough new players joined to replace them. Virgin shut down the server in November of 1998; “in perpetuity” had turned out to be a much shorter span of time than anyone had anticipated.

As we’ve seen before in this series, however, the remaining hardcore SubSpace fans simply refused to let their community die. They put up their own servers — Virgin had made the mistake of putting all the code you needed to do so on the same disc as the client — and kept right on space-warring. You can still play SubSpace today, just as you can Meridian 59 and The Realm. A website dedicated to tracking the game’s “population statistics” estimated in 2015 that the community still had between 2000 and 3000 active members, of whom around 300 might be online at any given time; assuming these numbers are to be trusted, a bit of math reveals that those who like the game must really like it, spending 10 percent or more of their lives in it. That same year, fans put their latest version of the game, now known as Subspace Continuum, onto Steam for free. Meanwhile its original father Rod Humble has gone on to a long and fruitful career in POMG, working on EverQuest, The Sims Online, and Second Life among other projects.
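For the curious, the back-of-the-envelope math works like this — a quick sketch using the fan site’s own 2015 estimates, and assuming that the 300-player concurrency holds more or less around the clock:

    # If roughly 300 of the community's 2000 to 3000 members are online at
    # any given moment, then the average member must be logged in for that
    # same fraction of his or her time.
    concurrent_players = 300
    for community_size in (2000, 3000):
        share_of_time = concurrent_players / community_size
        print(f"{community_size} members -> {share_of_time:.0%} of each member's time")
    # Prints:
    # 2000 members -> 15% of each member's time
    # 3000 members -> 10% of each member's time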



But we should return now to the summer of 1997 and to Origin Systems, to which Neil Young and Chris Yates came as some of the few people in existence who could boast not only of ideas about POMG but of genuine commercial experience in the field, thanks to SubSpace. EA hoped this experience would serve them well when it came to Ultima Online.

Which isn’t to say that the latter was the only thing they had on their plates: the sheer diversity of Young’s portfolio as an EA general manager reflects the confusion about what Origin’s identity as a studio should be going forward. There were of course the two perennials, Ultima — meaning for the moment at least Ultima Online — and Wing Commander, which was, as Young says today, “a little lost as a product.” Wing Commander, the franchise in computer gaming during the years immediately prior to DOOM, was becoming a monstrous anachronism by 1997. Shortly after the arrival of Young and Yates, Origin would release Wing Commander: Prophecy, whose lack of the Roman numeral “V” that one expected to see in its name reflected a desire for a fresh start on a more sustainable model in this post-Chris Roberts era, with a more modest budget to go along with more modest cinematic ambitions. But instead of heralding the dawn of a new era, it would prove the franchise’s swan song; it and its 1998 expansion pack would be the last new Wing Commander computer games ever. Their intended follow-up, a third game in the Wing Commander: Privateer spinoff series of more free-form outer-space adventures, would be cancelled.

In addition to Ultima and Wing Commander, EA had chosen to bring under the Origin umbrella two product lines that were nothing like the games for which the studio had always been known. One was a line of military simulations that bore the imprimatur of “Jane’s,” a print publisher which since the turn of the twentieth century had been the source of the definitive encyclopedias of military hardware of all types. The Jane’s simulations were overseen by one Andy Hollis, who had begun making games of this type for MicroProse back in the early 1980s. The other line involved another MicroProse alum — in fact, none other than Sid Meier, whose name had entered the lexicon of many a gaming household by serving as the prefix before such titles as Pirates!, Railroad Tycoon, Civilization, and Colonization. Meier and two other MicroProse veterans had just set up a studio of their own, known as Firaxis Games, with a substantial investment from EA, who planned to release their products under the Origin Systems label. Origin was becoming, in other words, EA’s home for all of its games that were made first and usually exclusively for computers rather than for the consoles that now provided the large majority of EA’s revenues; the studio had, it seemed, more value in the eyes of the EA executive suite as a brand than as a working collective.

Still, this final stage of the transition from independent subsidiary to branch office certainly could have been even more painful than it was. Neil Young and Chris Yates were fully aware of how their arrival would be seen down in Austin, and did everything they could to be good sports and fit into the office culture. Brit-in-Texas Young was the first to come with the fish-out-of-water jokes at his own expense — “I was expecting a flat terrain with lots of cowboys, cacti, and horses, so I was pleasantly surprised,” he said of Austin — and both men rolled up their sleeves alongside Richard Garriott to serve the rest of the company a turkey dinner at Thanksgiving, a longtime Origin tradition.

Neil Young and Chris Yates on the Thanksgiving chow line.

Young and Yates had received instructions from above that Ultima Online absolutely had to ship by the end of September. Rather than cracking the whip, they tried to cajole and josh their way to that milestone as much as possible. They agreed to attend the release party in drag if the deadline was met; then Young went one step farther, promising Starr Long a kiss on the lips. Yates didn’t go that far, but he did agree to grow a beard to commemorate the occasion, even as Richard Garriott, whose upper lip hadn’t seen the sun since he’d graduated from high school, agreed to shave his.

Young and Yates got it done, earning for themselves the status of, if not the unsung heroes of Ultima Online, then at least two among a larger group of same. The core group of ex-MUDders whose dream and love Ultima Online had always been could probably have kept running beta tests for years to come, had not these outsiders stepped in to set the technical agenda. “That meant trading off features with technology choices and decisions every minute of the day,” says Young. He brought in one Rich Vogel, who had set up and run the server infrastructure for Meridian 59 at The 3DO Company, to do the same for Ultima Online. In transforming Origin Systems into a maintainer of servers and a seller of subscriptions, he foreshadowed a transition that would eventually come to the games industry in general, from games as boxed products to gaming as a service. These tasks did not involve the sexy, philosophically stimulating ideas about virtual worlds and societies with which Raph Koster and his closest colleagues spent their time and which will always capture the lion’s share of the attention in articles like this one, but the work was no less essential for all that, and no less of a paradigm shift in its way.

So, the big day came and the deadline was met: Ultima Online shipped on September 24, 1997, three days before Meridian 59 would celebrate its first anniversary. The sleek black box was an end and a beginning at the same time. Young and Yates did their drag show, Starr Long got his kiss, and, most shockingly of all, Richard Garriott revealed his naked upper lip to all and sundry. (Opinions were divided as to whether the mangy stubble which Chris Yates deigned to grow before picking up his razor again really qualified as a beard or not.) And then everyone waited to see what would happen next.

A (semi-)bearded Chris Yates and a rare sight indeed: a clean-shaven Richard Garriott.

EA made 50,000 copies of Ultima Online and shipped them to stores all over the country, accompanied by a marketing campaign that was, as Wired magazine described it, of “Hollywood proportions.” The virtual world garnered attention everywhere, from CNN to The New York Times. These mainstream organs covered it breathlessly as the latest harbinger of humanity’s inevitable cyber-future, simultaneously bracing and unnerving. Flailing about for a way to convey some sense of the virtual world’s scope, The New York Times noted that it would take 38,000 computer monitors — enough to fill a football field — to display it in its entirety at one time. Needless to say, the William Gibson quotes, all “collective hallucinations” and the like, flew thick and fast, as they always did to mark events like this one.

Three weeks after the launch, 38,000 copies of Ultima Online had been sold and EA was spooling up the production line again to make another 65,000. Sales would hit the 100,000 mark within three months of the release. Such numbers were more than gratifying. EA knew that 100,000 copies sold of this game ought to be worth far more to its bottom line than 100,000 copies of any other game would have been, given that each retail sale hopefully represented only the down payment on a long-running subscription at $10 per month. For its publisher, Ultima Online would be the gift that kept on giving.

In another sense, however, the sales figures were a problem. When Ultima Online went officially live, it did so on just three shards: the Atlantic and Pacific shards from the beta test, plus a new Great Lakes one to handle the middle of the country. Origin was left scrambling to open more to meet the deluge of subscribers. Lake Superior came up on October 3, Baja on October 10, Chesapeake on October 16, Napa Valley on November 14, Sonoma on December 13, Catskills on December 22. And still it wasn’t enough.

Origin’s estimates of how many players a single server could reliably support proved predictably overoptimistic. But rather than dial back on the number of players they allowed inside, thereby ensuring that each of them who did get in could have a reasonably enjoyable experience, they kept trying to cover the gap between technical theory and reality by hacking their code on the fly. As a result, Ultima Online became simultaneously the most loved and most hated game in the country. When it all came together, it was magic for many of its players. But truth be told, that didn’t happen anywhere near as often as one might have wished in that first year or so. Extreme lag, inexplicable glitches, dropped connections, and even total server crashes were the more typical order of the day. Of course, with almost everyone who surfed the Web still relying on dial-up modems running over wires that had been designed to carry voices rather than computer data, slowdowns and dropped connections were a reality of daily online life even for those who weren’t attempting to log onto virtual worlds. This created a veneer of plausible deniability, which Origin’s tech-support people, for lack of any other suggestions or excuses to offer, leaned on perhaps a bit too heavily. After all, who could say for sure that the problem any individual player might be having wasn’t downstream from Origin’s poor overtaxed server?

Weaselly excuses like these led to the first great act of civil disobedience by the residents of Britannia, just a few weeks after the launch, when hundreds of players gathered outside Lord British’s castle, stripped themselves naked, broke into the throne room, drank gallons of wine, and proceeded to disgorge all of it onto Richard Garriott’s virtual furniture, whilst chanting in unison their demands for a better, stabler virtual world. The world’s makers were appalled, but also weirdly gratified. What better sign of a budding civic life could there be than a full-on political protest? “We were all watching and thinking it was a grand statement about the project,” says Richard Garriott. “As unhappy as they were about the game, they voiced their unhappiness in the context of the game.” Much of what happened inside Ultima Online during the first year especially had the same quality of being amazing for philosophers of virtual worlds to witness, but stressful for the practical administrators who were trying to turn this one into a sustainable money tree. The rub was that the two categories were combined in the very same people, who were left feeling conflicted to say the least.

The journals of hardcore gaming, hardly known for their stoicism in the face of hype on most days, were ironically more reserved and skeptical than the mainstream press on the subject of Ultima Online, perchance because they were viewing the virtual world less as a harbinger of some collective cyber-future and more as a game that their readers might wish to, you know, actually play. Computer Gaming World wittily titled its scathing review, buried on page 162 and completely unmentioned on the cover of the issue in question, simply “Uh-Oh.” Among the litany of complaints were “numerous and never-ending bugs, horrible lag time, design issues [that] lead to repetitive and time-consuming activities, and [an] unbalanced economy.” The magazine did admit that “Ultima Online could become a truly great game. But we can’t review potential, we can only review concrete product.” Editor-in-chief Johnny L. Wilson, for his part, held out little hope for improvement. “Ultima Online begins with hubris and ends in Greek tragedy,” he said. “The hubris is a result of being unwilling to learn from others’ mistakes. The tragedy is that it could have been so much more.” Randy Farmer, co-creator of the earlier would-be virtual world Habitat, expressed a similar sentiment, saying that “Origin seems to have ignored many of the lessons that our industry has learned in the last ten years of building online worlds. They’re making the same mistakes that first-time virtual-world builders always make.”

The constant crashes and long periods of unexplained down time associated with a service for which people were paying good money constituted a corporate lawyer’s worst nightmare — or a different sort of lawyer’s wet dream. One of these latter named George Schultz began collecting signatures from Origin’s most disgruntled customers within weeks, filing a class-action lawsuit in San Diego at the beginning of March of 1998. Exhibit A was the copy right there on the back of the box, promising “a living, growing world where thousands of real people discover real fantasy and adventure, 24 hours a day, every day of the year,” with all of it taking place “in real time.” This was, claimed Schultz, a blatant case of false advertising. “We’re not trying to tell anyone how to design a good or a bad game,” he said. “What it’s about is holding Origin and EA to the promises they made on the box, in their advertising, and [in] the manual. It’s about the misrepresentations they’ve made. A big problem with the gaming industry is that they think there are some special rules that only apply to them.”

Whatever the truth of that last claim, there was no denying that just about half of the learning curve of Ultima Online was learning to navigate around the countless bugs and technical quirks. For example, Origin took down each shard once per day for a backup and a “therapeutic” reboot that was itself a testament to just what a shaky edifice the software and hardware were. When the server came back up again, it restored the state of the world from the last backup. But said state was a snapshot in time from one hour before the server went down. There was, in other words, an hour every day during which everything you did in virtual Britannia was doomed to be lost; this was obviously not a time to go on any epic, treasure- and experience-point-rich adventures. Yet such things were documented nowhere; one learned them only through the proverbial school of hard knocks.

In their defense, Origin was sailing into completely uncharted waters with Ultima Online. Although there had been online virtual worlds before, dating all the way back to that first MUD of 1978 or 1979, none of them — no, not even Meridian 59 and The Realm — had been as expansive, sophisticated, and most of all popular as these shards of Britannia. Most of the hardware technologies that would give rise to the era of Web 2.0, from DSL in homes to VPS’s in data centers, existed only as blueprints; ditto most of the software. No one had ever made a computer game before that required this much care and feeding after the initial sale. And it wasn’t as if the group entrusted with maintaining the beast was a large one. Almost the entirety of the Ultima IX team which had been parachuted in six months before the launch to just get the world done already was pulled out just as abruptly as soon as it started accepting paying subscribers, leaving behind a crew of maintainers that was little bigger than the original team of ex-MUDders who had labored in obscurity for so long before catching the eye of EA’s management. The idea that maintaining a virtual world might require almost as much manpower and ongoing creative effort as making it in the first place was too high a mental hurdle for even otherwise clever folks like Neil Young and Chris Yates to clear at this point.

Overwhelmed as they were, the maintainers began to rely heavily on unpaid volunteers from the community of players to do much of the day-to-day work of administrating the world, just as was the practice on MUDs. But Ultima Online ran on a vastly larger scale than even the most elaborate MUDs, making it hard to keep tabs on these volunteer overseers. While some were godsends, putting in hours of labor every week to make Britannia a better place for their fellow players, others were corrupted by their powers, manipulating the levers they had to hand to benefit their friends and punish their enemies. Then, too, the volunteer system was another legal quagmire, one that would doubtless have sent EA’s lawyers running screaming from the room if anyone had bothered to ask them about it before it was rolled out; sure enough, it would eventually lead to another lawsuit, this one more extended, serious, and damaging than the first.

In the meanwhile, though, most players did not rally behind the first lawsuit to anything like the degree that George Schultz might have been hoping. The fact was that even the ones who had vomited all over Lord British’s throne had done so because they loved their virtual Britannia and wanted to see it fixed rather than destroyed, as it would likely be if Schultz won the day. The suit concluded in a settlement at the end of 1998. The biggest concession on the part of the defendants was a rather weird one that gave no recompense to any individual inhabitant of virtual Britannia: EA agreed to donate $15,000 to the San Jose Tech Museum of Innovation. Perhaps Schultz thought that it would be able to innovate up a more reliable virtual world.

While many of the technical problems that beset Ultima Online were only to be expected in the context of the times, some of the other obstacles to enjoying the virtual world were more puzzling. First and foremost among these was the ever-present issue of players killing other players, which created so much frustration that George Schultz felt compelled to explicitly wall it off from the breach-of-trust claims that were the basis of his lawsuit: “We’re not getting into whether there should be player-killing.” Given that it had been such a constant theme of life (and death) in virtual Britannia going all the way back to the alpha-testing phase, one might have expected the MUDders to take more steps to address it before the launch. As it was, though, one senses that, having seen so many of their ideas about a virtual ecology and the like not survive contact with real players, having been forced to give up in so many ways on virtual Britannia as a truly self-sustaining, living world, they were determined to make this the scene of their last stand, the hill they would either hold or die on.

Their great white hope was still the one that Richard Garriott had been voicing in interviews since well before the world’s commercial debut: that purely social pressures would act as a constraint on player-killing — that, in short, their world would learn to police itself. In fact, the presence of player-killing might act as a spur to civilization — for, as Raph Koster said, “cultures define and refine themselves through conflict.” They kept trying to implement systems that would nudge this particular culture in the right direction. They decided that, after committing murder five times, a player would be branded with literal scarlet letters: the color of his onscreen name would change from blue to red. Hopefully this would make him a pariah among his peers, while also making it very dangerous for him to enter a town, whose invulnerable computer-controlled guards would attack him on sight. The designers didn’t reckon with the fact that a virtual life is, no matter how much they might wish otherwise, simply not the same as a real life. Some percentage of players, presumably perfectly mild-mannered and law-abiding in the real world, reveled in the role of murderous outlaws online, taking the red letters of their name as a badge of honor rather than shame, the dangers of the cities as a challenge rather than a deterrent. To sneak past the city gates, creep up behind an unsuspecting newbie and stab her in the back, then get out of Dodge before the city watch appeared… now, that was good times. The most-wanted rolls posted outside the guard stations of Britannia became, says Raph Koster, “a high-score table for player killers.”
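The mechanic itself would have amounted to only a handful of lines of server logic. Here’s a rough sketch of the idea in Python; needless to say, Origin’s code looked nothing like this, and every name below is my own invention, with only the five-murder threshold taken from the game itself.

```python
# A hypothetical sketch of the "scarlet letter" reputation rule -- not Origin's actual code.

MURDER_THRESHOLD = 5  # player kills before being branded a murderer

class Reputation:
    def __init__(self) -> None:
        self.murder_count = 0

    def record_player_kill(self) -> None:
        self.murder_count += 1

    @property
    def name_color(self) -> str:
        # Innocents show blue; habitual murderers show red, which also
        # makes the invulnerable town guards attack them on sight.
        return "red" if self.murder_count >= MURDER_THRESHOLD else "blue"

rep = Reputation()
for _ in range(5):
    rep.record_player_kill()
print(rep.name_color)  # "red" -- intended as a badge of shame, worn as a badge of honor
```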

The MUDders’ stubborn inflexibility on this issue — an issue that was by all indications soon costing Ultima Online large numbers of customers — was made all the more inexplicable in the opinion of many players by the fact that it was, in marked contrast to so many of the other problems, almost trivial to address in programming terms. An “invulnerability” flag had long existed, to be applied not only to computer-controlled city guards but to special human-controlled personages such as Lord British to whom the normal laws of virtual time and space did not apply. All Origin had to do was add a few lines of code to automatically turn the flag on when a player walked into designated “safe” spaces. That way, you could have places where those who had signed up mostly in order to socialize could hang out without having to constantly look over their shoulders, along with other places where the hardcore pugilists could pummel one another to their heart’s content. Everyone would be catered to. Problem solved.
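To give a sense of just how small the change really was, here is one way the safe-zone check might have looked. This is purely illustrative Python of my own; the only detail borrowed from the real game is the existence of the invulnerability flag itself.

```python
# A hypothetical sketch of zone-based invulnerability -- not Origin's actual code.

from dataclasses import dataclass

@dataclass
class Zone:
    name: str
    x1: int
    y1: int
    x2: int
    y2: int
    safe: bool = False  # a designated "safe" space where combat is switched off

    def contains(self, x: int, y: int) -> bool:
        return self.x1 <= x <= self.x2 and self.y1 <= y <= self.y2

@dataclass
class Player:
    name: str
    x: int = 0
    y: int = 0
    invulnerable: bool = False  # the flag already used for guards and Lord British

def update_safety(player: Player, zones: list[Zone]) -> None:
    """Turn the invulnerability flag on whenever the player stands in a safe zone."""
    player.invulnerable = any(z.safe and z.contains(player.x, player.y) for z in zones)

# Usage: the tavern is safe ground; the wilderness is not.
zones = [Zone("Britain Tavern", 0, 0, 10, 10, safe=True),
         Zone("Wilderness", 11, 0, 100, 100)]
newbie = Player("Newbie", x=5, y=5)
update_safety(newbie, zones)
print(newbie.invulnerable)  # True -- would-be murderers can do nothing here
```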

But Raph Koster and company refused to take this blindingly obvious step, having gotten it into their heads that to do so would be to betray their most cherished ideals. They kept tinkering around the edges of the problem, looking for a subtler solution that would preserve their world’s simulational autonomy. For example, they implemented a sort of karmic justice system, which dictated that players who had been evil during life would come back from death only after surrendering a portion of their stats and skills. Inevitably, the player killers just took this as another challenge. Just don’t get killed, and you would never have to worry about it.
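Sketched out, the karmic penalty amounted to little more than the following. Again, this is my own illustrative Python, and the ten-percent figure is a guess rather than a documented value.

```python
# A hypothetical sketch of the karmic resurrection penalty -- not Origin's actual code.

def resurrect(stats: dict[str, float], is_evil: bool, penalty: float = 0.10) -> dict[str, float]:
    """Return a character's stats after resurrection.

    Evil characters come back with a portion of every stat and skill shaved off.
    The 10% figure is an illustrative guess, not a documented value.
    """
    if not is_evil:
        return dict(stats)
    return {name: value * (1.0 - penalty) for name, value in stats.items()}

print(resurrect({"strength": 100.0, "swordsmanship": 80.0}, is_evil=True))
# {'strength': 90.0, 'swordsmanship': 72.0} -- a penalty that only bites if you ever die
```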

The end result was to leave the experience of tens of thousands of players in the unworthy hands of a relatively small minority of “griefers,” people who thrived on causing others pain and distress. Like all bullies, they preyed on the weak; their typical victims were the newbies, unschooled in the ways of defense, guiding characters with underwhelming statistics and no arms or armor to speak of. Such new players were, of course, the ones whose level of engagement with the game was most tentative, who were the most likely to just throw up their hands and go find something else to play after they’d been victimized once or twice, depriving Origin of potentially hundreds of dollars in future subscription revenue.

In light of this, it’s strange that no one from EA or Origin overrode the MUDders on this point. For his part, Richard Garriott was adamantly on their side, insisting that Ultima Online simply had to allow player-killing if it wasn’t to become a mockery of itself. It was up to the dissatisfied and victimized residents themselves to band together and turn Britannia into the type of world they wanted to live in; it wasn’t up to Origin to step in and fix their problems for them with a deus ex machina. “When we first launched Ultima Online, we set out to create a world that supported the evil player as a legitimate role,” said Garriott in his rather high-handed way. “Those who have truly learned the lessons of the [single-player] Ultima games should cease their complaining, rise to the challenge, and make Britannia into the place they want it to be.” He liked to tell a story on this subject. (Knowing Garriott’s penchant for embellishment, it probably didn’t happen, or at least didn’t happen quite like this. But that’s not relevant to its importance as allegory.)

One evening, he was wandering the streets of the capital in his Lord British persona, when he heard a woman screaming. Rushing over to her, he was told that a thief had stolen all of her possessions. His spirit of chivalry was awoken; he told her that he would get her things back for her. Together they tracked down the thief and cornered him in a back alley. Lord British demanded that the thief return the stolen goods, and the thief complied. They all went their separate ways. A moment later, the woman cried out once more; the thief had struck again.

This time, Lord British froze the thief with a spell before he could leave the scene of the crime. “I told you not to do that,” he scolded. “What are you doing?”

“Sorry, I won’t do it again,” said the thief as he turned over the goods for a second time.

“If you do that again, I’m going to ban you from the game,” said Lord British.

You might be able to guess what happened next: the thief did it yet again. “I said I was going to ban you, and now I have to,” shouted Lord British, now well and truly incensed. “What’s wrong with you? I told you not to steal from this woman!”

The thief’s answer stopped Garriott in his tracks. “Listen. You created this world, and I’m a thief,” he said, breaking character for the first time. “I steal. That’s what I do. And now you’re going to ban me from the game for playing the role I’m supposed to play? I lied to you before because I’m a thief. The king caught me and told me not to steal. What am I going to do, tell you that as soon as you turn around I’m going to steal again? No! I’m going to lie.”

And Garriott realized that the thief was right. Garriott could do whatever he wished to him as Lord British, the reigning monarch of this world. But if he wished to stay true to all the things he had said in the past about what virtual Britannia was and ought to be, he couldn’t go outside the world to punish him as Richard Garriott, the god of the server looking down from on high.

Some of the questions with which Origin was wrestling resonate all too well today: questions involving the appropriate limits of online free speech — or rather free action, in this case. They are questions with which everyone who has ever opened an Internet discussion up to the public, myself included, has had to engage. When does strongly felt disagreement spill over into bad faith, counterpoint into disruption for the sake of it? And what should we do about it when it does? In Origin’s case, the pivotal philosophical question at hand was where the boundary lay between playing an evil character in good faith in a fantasy world and purposely, willfully trying to cause real pain to other real people sitting behind other real computers. Origin had chosen to embrace a position close to the ground staked out by our self-described “free-speech maximalists” of today. And like them, Origin was learning that the issue is more dangerously nuanced than they had wished to believe.

But there were other sorts of disconnect at play here as well. Garriott’s stern commandment that his world’s inhabitants should “cease their complaining, rise to the challenge, and make Britannia into the place they want it to be” becomes more than a bit rich when we remember that it was being directed toward Origin’s paying customers. Many of them might have replied that it was up to Origin rather than them to make Britannia a place they wanted to be, lest they choose to spend their $10 per month on something else. The living-world dynamic held “as long as everyone is playing the same game,” wrote Amy Jo Kim in an article about Ultima Online and its increasingly vocalized discontents that appeared in Wired magazine in the spring of 1998. “But what happens when players who think they’re attending an online Renaissance Faire find themselves at the mercy of a violent, abusive gang of thugs? In today’s Britannia, it’s not uncommon to stumble across groups of evil players who talk like Snoop Doggy Dogg, dress like gangstas, and act like rampaging punks.” To be sure, some players were fully onboard with the “living-world” policy of (non-)administration. Others, however, had thought, reasonably enough given what they had read on the back of the game’s box, that they were just buying an entertainment product, a place to hang out in a few hours per day or week and have fun, chatting and exploring and killing monsters. They hadn’t signed up to organize police forces or lead political rallies. Nor had they signed up to be the guinea pigs in some highfalutin social experiment. No; they had signed up to play a game.

As it was, Ultima Online was all but impossible to play casually, thanks not only to the murderers skulking in its every nook and cranny but to core systems of the simulation itself. For example, if you saved up until you could afford to build yourself a nice little house, made it just like you wanted it, then failed to log on for a few days, when you did return you’d find that your home had disappeared, razed to make room for some other, more active player to build something. Systems like these pushed players to spend more time online as a prerequisite to having fun when they were there. Some left when the demands of the game conflicted with those of real life, which was certainly the wisest choice. But some others began to spend far more time in virtual Britannia than was really good for them, raising the specter of gaming addiction, a psychological and sociological problem that would only become more prevalent in the post-millennial age.

Origin estimated that the median hardcore player spent a stunning if not vaguely horrifying total of six hours per day in the virtual world. And if the truth be told, many of the non-murderous things with which they were expected to fill those hours do seem kind of boring on the face of it. This is the flip side of making a virtual world that is more “realistic”: most people play games to escape from reality for a while, not to reenact it. With all due respect to our dedicated and talented real-world tailors and bakers, most people don’t dream of spending their free time doing such jobs online. Small wonder so many became player killers instead; at least doing that was exciting and, for some people at any rate, fun. From Amy Jo Kim’s article:

There’s no shortage of realism in this game — the trouble is, many of the nonviolent activities in Ultima Online are realistic to the point of numbingly lifelike boredom. If you choose to be a tailor, you can make a passable living at it, but only after untold hours of repetitive sewing. And there’s no moral incentive for choosing tailoring — or any honorable, upstanding vocation, for that matter. So why be a tailor? In fact, why not prey on the tailors?

True, Ultima Online is many things to many people. Habitués of online salons come looking for intellectual sparring and verbal repartee. Some other people log on in search of intimate but anonymous social relationships. Still others play the game with cunning yet also a discernible amount of self-restraint, getting rich while staying pretty honest. But there’s no avoiding where the real action is: an ever-growing number are playing Ultima Online to kill everything that moves.

All of this had an effect: all signs are that, after the first rush of sales and subscriptions, Ultima Online began to stagnate, mired in bad reviews, ongoing technical problems, and a growing disenchantment with the player-killing and the other barriers to casual fun. Raph Koster admits that “our subscriber numbers, while stratospheric for the day, weren’t keeping up” with sales of the boxed game, because “the losses [of frustrated newbies] were so high.”

Although Origin and EA never published official sales or subscriber numbers, I have found one useful data point from the early days of Ultima Online, in an internal Origin newsletter dated October 30, 1998. As of this date, just after its first anniversary, the game had 90,000 registered users, of whom approximately half logged on on any given day. These numbers are depicted in the article in question as very impressive, as indeed they were in comparison to the likes of Meridian 59 and The Realm. Still, a bit of context never hurts. Ultima Online had sold 100,000 boxed copies in its first three months, yet it didn’t have even that many subscribers after thirteen months, when its total boxed sales were rounding the 200,000 mark. The subscriber-retention rate, in other words, was not great; a lot of those purchased CDs had become coasters in fairly short order.

Nine shards were up in North America at this time, a number that had stayed the same since the previous December. And it’s this number that may be the most telling one of all. It’s true that, since demand was concentrated at certain times of day, Ultima Online was hosting just about all the players it could handle with its current server infrastructure as of October of 1998. But then again, this was by no means all the players it should be able to handle in the abstract: new shards were generally brought into being in response to increasing numbers of subscribers rather than vice versa. The fact that no new North American shards had been opened since December of 1997 becomes very interesting in this light.

I don’t want to overstate my case here: Ultima Online was extremely successful on its own, somewhat experimental terms. We just need to be sure that we understand what those terms were. By no means were its numbers up there with the industry’s biggest hits. As a point of comparison, let’s take Riven, the long-awaited sequel to the mega-hit adventure game Myst. It was released two months after Ultima Online and went on to sell 1 million units in its first year — at least five times the number of boxed entrées to Origin’s virtual world over the same time period, despite being in a genre that was in marked decline in commercial terms. Another, arguably more pertinent point of comparison is Age of Empires, a new entry in the red-hot real-time-strategy genre. Released just one month after Ultima Online, it outsold Origin’s virtual world by more than ten to one over its first year. Judged as a boxed retail game, Ultima Online was a middling performer at best.

Of course, Ultima Online was not just another boxed retail game; the unique thing about it was that each of the 90,000 subscribers it had retained was paying $10 every month, yielding a steady revenue of almost $11 million per year, with none of it having to be shared with any distributor or retailer. That was really, really nice — nice enough to keep Origin’s head above water at a time when the studio didn’t have a whole lot else to point to by way of justifying its ongoing existence to EA. And yet the reality remained that Ultima Online was a niche obsession rather than a mass-market sensation. As so often happens in life, taking the next step forward in commercial terms, not to mention fending off the competition that was soon to appear with budgets and publisher support of which Meridian 59 and The Realm couldn’t have dreamed, would require a degree of compromise with its founding ideals.
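The arithmetic behind that revenue figure is simple enough to verify:

```python
subscribers = 90_000
monthly_fee = 10  # dollars per subscriber per month
annual_revenue = subscribers * monthly_fee * 12
print(f"${annual_revenue:,}")  # $10,800,000 -- "almost $11 million per year"
```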

Be that as it may, one thing at least was now clear: there was real money to be made in the MMORPG space. Shared virtual worlds would soon learn to prioritize entertainment over experimentation. Going forward, there would be less talk about virtual ecologies and societies, and more focus on delivering slickly packaged fun, of the sort that would keep all kinds of players coming back for more — and, most importantly of all, get those subscriber counts rising once more.

I’ll continue to follow the evolution of PMOG, MMORPGs, and Ultima Online in future articles, and maybe see if I can’t invent some more confusing acronyms while I’m at it. But not right away… other subjects beg for attention in the more immediate future.





Sources: the books Braving Britannia: Tales of Life, Love, and Adventure in Ultima Online by Wes Locher, Postmortems: Selected Essays, Volume One by Raph Koster, Online Game Pioneers at Work by Morgan Ramsay, Through the Moongate, Part II by Andrea Contato, Explore/Create by Richard Garriott, MMOs from the Inside Out by Richard Bartle, and Dungeons and Dreamers by Brad King and John Borland. Origin Systems’s internal newsletter Point of Origin of February 20 1998 and October 30 1998; Computer Gaming World of February 1998 and November 1998; New York Times of October 20 1997; Wired of May 1998.

Web sources include a 2018 Game Developers Conference talk by some of the Ultima Online principals, an Ultima Online timeline at UOGuide, and GameSpot‘s vintage reviews of Ultima Online and its first expansion, The Second Age. On the subject of SubSpace, we have histories by Rod Humble and Epinephrine, another vintage GameSpot review, and a Vice article by Emanuel Maiberg.

 


Televising the Revolution

When we finished Broken Sword, the managing director of Virgin [Interactive] called me into his office and showed me a game from Argonaut [Software] called Creature Shock. He said, “These are the games you should be writing, not adventure games. These are the games. This is the future.”

— Charles Cecil, co-founder of Revolution Software

Broken Sword, Revolution Software’s third point-and-click adventure game, was released for personal computers in September of 1996. Three months later, it arrived on the Sony PlayStation console, so that it could be enjoyed on television as well as monitor screens. And therein lies a tale in itself.

Prior to this point, puzzle-based adventure games of the traditional stripe had had a checkered career on the consoles, for reasons as much technical as cultural. They were a difficult fit with the Nintendo Entertainment System (NES), the console du jour in the United States during the latter 1980s, thanks to the small capacity of the cartridges that machine used to host its games, its lack of means for easily storing state so that one could return to a game where one had left off after spending time away from the television screen, and the handheld controllers it used that were so very different from a mouse, joystick, and/or keyboard. Still, these challenges didn’t stop some enterprising studios from making a go of it, tempted as they were by the huge installed base of Nintendo consoles. Over the course of 1988 and 1989, ICOM Simulations managed to port to the NES Deja Vu, Uninvited, and Shadowgate; the last in particular really took off there, doing so well that it is better remembered as a console than a computer game today. In 1990, LucasArts[1] did the same with their early adventure Maniac Mansion; this port too was surprisingly playable, if also rather hilariously Bowdlerized to conform to Nintendo’s infamously strict censorship regime.

But as the 1990s began, “multimedia” was becoming the watchword of adventure makers on computers. By 1993, the era of the multimedia “interactive movie” was in full swing, with games shipping on CD — often multiple CDs — and often boasting not just voice acting but canned video clips of real actors. Such games were a challenge of a whole different order even for the latest generation of 16-bit consoles. Sierra On-Line and several other companies tried mightily to cram their adventure games onto the Sega Genesis,[2] a popular console for which one could purchase a CD drive as an add-on product. In the end, though, they gave it up as technically impossible; the Genesis’s color palette and memory space were just too tiny, its processor just too slow.

But then, along came the Sony PlayStation.

For all that the usual focus of these histories is computer games, I’ve already felt compelled to write at some length about the PlayStation here and there. As I’ve written before, I consider it the third socially revolutionary games console, after the Atari VCS and the Nintendo Entertainment System. Its claim to that status involves both culture and pure technology. Sony marketed the PlayStation to a new demographic: to hip young adults rather than the children and adolescents that Nintendo and its arch-rival Sega had targeted. Meanwhile the PlayStation hardware, with its built-in CD-drive, its 32-bit processor, its 2MB of main memory and 1MB of graphics memory, its audiophile-quality sound system, and its handy memory cards for saving up to 128 K of state at a time, made ambitious long-form gaming experiences easier than ever before to realize on a console. The two factors in combination opened a door to whole genres of games on the PlayStation that had heretofore been all but exclusive to personal computers. Its early years brought a surprising number of these computer ports, such as real-time strategy games like Command & Conquer and turn-based strategy games like X-COM. And we can also add to that list adventure games like Broken Sword.

Their existence was largely thanks to the evangelizing efforts of Sony’s own new PlayStation division, which seldom placed a foot wrong during these salad days. Unlike Nintendo and Sega, who seemed to see computer and console games as existing in separate universes, Sony was eager to bridge the gap between the two, eager to bring a wider variety of games to the PlayStation. And they were equally eager to push their console in Europe, where Nintendo had barely been a presence at all to this point and which even Sega had always treated as a distant third in importance to Japan and North America.

Thus, one day while the Broken Sword project was still in its first year, Revolution Software got a call from Phil Harrison, an old-timer in British games who knew everyone and had done a bit of everything. “Look, I’m working for Sony now and there’s this new console going to be produced called the PlayStation,” he told Charles Cecil, the co-founder and tireless heart and soul of Revolution. “Are you interested in having a look?”

Cecil was. He was indeed.

Thoroughly impressed by the hardware and marketing plans Harrison had shown him, Cecil went to Revolution’s publisher Virgin Interactive to discuss making a version of Broken Sword for the PlayStation as well. “That’s crazy, that’s not going to work at all,” said Virgin according to Cecil himself. Convinced the idea was a non-starter, both technically and commercially, they told him he was free to shop a PlayStation Broken Sword elsewhere for all they cared. So, Cecil returned to his friend Phil Harrison, who brokered a deal for Sony themselves to publish a PlayStation version in Europe as a sort of proof of concept. Revolution worked on the port on the side and on their own dime while they finished the computer game. Sony then shipped this PlayStation version in December of 1996.

Broken Sword on a computer…

…and on the PlayStation, where it’s become more bleary-eyed.

To be sure, it was a compromised creation. Although the PlayStation was a fairly impressive piece of kit by console standards, it left much to be desired when compared to even a mid-range gaming computer. The lovely graphics of the original had to be downgraded to the PlayStation’s lower resolution, even as the console’s relatively slow CD drive and lack of a hard drive for storing frequently accessed data made them painfully sluggish to appear on the television screen; one spent more time waiting for the animated cut scenes to load than watching them, their dramatic impact sometimes being squandered by multiple loading breaks within a scene. Even the voiced dialog could take unnervingly long to unspool from disc. Then, too, pointing and clicking was nowhere near as effortless using a game controller as it was with a mouse. (Sony actually did sell a mouse as an optional peripheral, but few people bought one.) Perhaps most worrisome of all, though, was the nature of the game itself. How would PlayStation gamers react to a cerebral, puzzle-oriented and narrative-driven experience like this?

The answer proved to be, better than some people — most notably those at Virgin — might have expected. Broken Sword‘s Art Deco classicism may have looked a bit out of place in the lurid, anime-bedecked pages of the big PlayStation magazines, but they and their readers generally treated it kindly if somewhat gingerly. Broken Sword sold 400,000 copies on the PlayStation in Europe. Granted, these were not huge numbers in the grand scheme of things. On a console that would eventually sell more than 100 million units, it was hard to find a game that didn’t sell well into the six if not seven (or occasionally eight) figures. By Revolution’s modest standards, however, the PlayStation port made all the difference in the world, selling as it did at least three times as many copies as the computer version despite its ample reasons for shirking side-by-side comparisons. Its performance in Europe was even good enough to convince the American publisher THQ to belatedly pick it up for distribution in the United States as well, where it shifted 100,000 or so more copies. “The PlayStation was good for us,” understates Charles Cecil today.

It was a godsend not least because Revolution’s future as a maker of adventure games for computers was looking more and more doubtful. Multinational publishers like Virgin tended to take the American market as their bellwether, and this did not bode well for Revolution, given that Broken Sword had under-performed there in relation to its European sales. To be sure, there were proximate causes for this that Revolution could point to: Virgin’s American arm, never all that enthused about the game, had given it only limited marketing and saddled it with the terrible alternative title of Circle of Blood, making it sound more like another drop in the ocean of hyper-violent DOOM clones than a cerebral exercise in story-driven puzzle-solving. At the same time, though, it was hard to deny that the American adventure market in general was going soggy in the middle; 1996 had produced no million-plus-selling mega-hit in the genre to stand up alongside 1995’s Phantasmagoria, 1994’s Myst, or 1993’s The 7th Guest. Was Revolution’s sales stronghold of Europe soon to follow the industry’s bellwether? Virgin suspected it was.

So, despite having made three adventure games in a row for Virgin that had come out in the black on the global bottom line, Revolution had to lobby hard for the chance to make a fourth one. “It was frustrating for us,” says Revolution programmer Tony Warriner, “because we were producing good games that reviewed and sold well, but we had to beg for every penny of development cash. There was a mentality within publishing that said you were better off throwing money around randomly, and maybe scoring a surprise big hit, instead of backing steady but profitable games like Broken Sword. But this sums up the problem adventures have always had: they sell, but not enough to turn the publishers on.”

We might quibble with the “always” in Warriner’s statement; there was a time, lasting from the dawn of the industry through the first half of the 1990s, when adventures were consistently among the biggest-selling titles of all on computers. But this was not the case later on. Adventure games became mid-tier niche products from the second half of the 1990s on, capable of selling in consistent but not huge numbers, capable of raking in modest profits but not transformative ones. Similar middling categories had long existed in other mass-media industries, from film to television, books to music, all of which had been mature enough to profitably cater to their niche customers in addition to the heart of the mainstream. The computer-games industry, however, was less adept at doing so.

The problem there boiled down to physical shelf space. The average games shop had a couple of orders of magnitude fewer titles on its shelves at any given time than the average book or record store. Given how scarce retail space was, nobody — not the distributors, not the publishers, certainly not the retailers themselves — was overly enthusiastic about filling it with product that wasn’t in one of the two hottest genres in gaming at the time, the first-person shooter and the real-time strategy. This tunnel vision had a profound effect on the games that were made and sold during the years just before and after the millennium, until the slow rise of digital distribution began to open fresh avenues for more nichey titles once again.

In light of this situation, it’s perhaps more remarkable how many computer games were made between 1995 and 2005 that were not first-person shooters or real-time strategies than the opposite. More dedicated, passionate developers than you might expect found ways to make their cases to the publishers and get their games funded in spite of the remorseless logic of the extant distribution systems.

Revolution Software found a way to be among this group, at least for a while — but Virgin’s acquiescence to a Broken Sword II didn’t come easy. Revolution had to agree to make the sequel in just one year, as compared to the two and a half years they had spent on its predecessor, and for a cost of just £500,000 rather than £1 million. The finished game inevitably reflects the straitened circumstances of its birth. But that isn’t to say that it’s a bad game. Far from it.

Broken Sword II: The Smoking Mirror kicks off six months after the conclusion of the first game. American-in-Paris George Stobbart, that game and this one’s star, has just returned to France after dealing with the death of his father Stateside. There he’s reunited with Nico Collard, the fetching Parisian reporter who helped him last time around and whom George has a definite hankering for, to the extent of referring to her as his “girlfriend”; Nico is more ambiguous about the nature of their relationship. At any rate, an ornately carved and painted stone, apparently Mayan in origin, has come into her possession, and she has asked George to accompany her to the home of an archaeologist who might be able to tell them something about it. Unfortunately, they’re ambushed by thugs as soon as they arrive; Nico is kidnapped, while George is left tied to a chair in a room whose only other inhabitants are a giant poisonous spider and a rapidly spreading fire.

If this game doesn’t kick off with the literal bang of an exploding bomb like last time, it’s close enough. “I believe that a videogame must declare the inciting incident immediately so the player is clear on what their character needs to do and, equally importantly, why,” says Charles Cecil.

With your help, George will escape from his predicament and track down and rescue Nico before she can be spirited out of the country, even as he also retrieves the Mayan stone from the dodgy acquaintance in whose safekeeping she left it and traces their attackers back to Central America. And so George and Nico set off together across the ocean to sun-kissed climes, to unravel another ancient prophecy and prevent the end of the world as we know it for the second time in less than a year.

Broken Sword II betrays its rushed development cycle most obviously in its central conspiracy. For all that the first game’s cabal of Knights Templar was bonkers on the face of it, it was grounded in real history and in a real, albeit equally bonkers classic book of pseudo-history, The Holy Blood and the Holy Grail. Mayans, on the other hand, are the most generic adventure-game movers and shakers this side of Atlanteans. “I was not as interested in the Mayans, if I’m truthful,” admits Charles Cecil. “Clearly human sacrifices and so on are interesting, but they were not on the same level of passion for me as the Knights Templar.”

Lacking the fascination of uncovering a well-thought-through historical mystery, Broken Sword II must rely on its set-piece vignettes to keep its player engaged. Thankfully, these are mostly still strong. Nico eventually gets to stop being the damsel in distress, becoming instead a driving force in the plot in her own right, so much so that you the player control her rather than George for a quarter or so of the game; this is arguably the only place where the second game actually improves on the first, which left Nico sitting passively in her flat waiting for George to call and collect hints from her most of the time. Needless to say, the sexual tension between George and Nico doesn’t get resolved, the writers having learned from television shows like Moonlighting and Northern Exposure that the audience’s interest tends to dissipate as soon as “Will they or won’t they” becomes “They will!” “We could very easily have had them having sex,” says Cecil, “but that would have ruined the relationship between these two people.”

The writing remains consistently strong in the small moments, full of sly humor and trenchant observations. Some fondly remembered supporting characters return, such as Duane and Pearl, the two lovably ugly American tourists you met in Syria last time around, who’ve now opted to take a jungle holiday, just in time to meet George and Nico once again. (Isn’t coincidence wonderful?)

And the game is never less than fair, with occasional deaths to contend with but no dead ends. This manifestation of respect for their players has marked Revolution’s work since Beneath a Steel Sky; they can only be applauded for it, given how many bigger, better-funded studios got this absolutely critical aspect of their craft so very wrong back in the day. The puzzles themselves are pitched perfectly in difficulty for the kind of game this is, being enough to make you stop and think from time to time but never enough to stop you in your tracks.

Broken Sword or Monkey Island?

In the end, then, Broken Sword II suffers only by comparison with Broken Sword I, which does everything it does well just that little bit better. The backgrounds and animation here, while still among the best that the 1990s adventure scene ever produced, aren’t quite as lush as what we saw last time. The series’s Art Deco and Tintin-inspired aesthetic sensibility, seen in no other adventure games of the time outside of the equally sumptuous Last Express, loses some focus when we get to Central America and the Caribbean. Here the game takes on an oddly LucasArts-like quality, what with the steel-drum background music and all the sandy beaches and dark jungles and even a monkey or two flitting around. Everywhere you look, the seams show just a little more than they did last time; the original voice of Nico, for example, has been replaced by that of another actress, making the opening moments of the second game a jarring experience for those who played the first. (Poor Nico would continue to get a new voice with each subsequent game in the series. “I’ve never had a bad Nico, but I’ve never had one I’ve been happy with,” says Cecil.)

But, again, we’re holding Broken Sword II up against some very stiff competition indeed; the first game is a beautifully polished production by any standard, one of the crown jewels of 1990s adventuring. If the sequel doesn’t reach those same heady heights, it’s never less than witty and enjoyable. Suffice to say that Broken Sword II is a game well worth playing today if you haven’t done so already.

It did not, however, sell even as well as its predecessor when it shipped for computers in November of 1997, serving more to justify than disprove Virgin’s reservations about making it in the first place. In the United States, it was released without its Roman numeral as simply Broken Sword: The Smoking Mirror, since that country had never seen a Broken Sword I. Thus even those Americans who had bought and enjoyed Circle of Blood had no ready way of knowing that this game was a sequel to that one. (The names were ironic not least in that the American game called Circle of Blood really did contain a broken sword, while the American game called Broken Sword did not.)

That said, in Europe too, where the game had no such excuses to rely upon, the sales numbers it put up were less satisfactory than before. A PlayStation version was released there in early 1998, but this too sold somewhat less than the first game, whose relative success in the face of its technical infelicities had perchance owed much to the novelty of its genre on the console. It was not so novel anymore: a number of other studios were also now experimenting with computer-style adventure games on the PlayStation, to mixed commercial results.

With Virgin having no interest in a Broken Sword III or much of anything else from Revolution, Charles Cecil negotiated his way out of the multi-game contract the two companies had signed. “The good and the great decided adventures [had] had their day,” he says. Broken Sword went on the shelf, permanently as far as anyone knew, leaving George and Nico in a lovelorn limbo while Revolution retooled and refocused. Their next game would still be an adventure at heart, but it would sport a new interface alongside action elements that were intended to make it a better fit on a console. For better or for worse, it seemed that the studio’s hopes for the future must lie more with the PlayStation than with computers.

Revolution Software was not alone in this; similar calculations were being made all over the industry. Thanks to the fresh technology and fresh ideas of the PlayStation, said industry was entering a new period of synergy and cross-pollination, one destined to change the natures of computer and console games equally. Which means that, for all that this site has always been intended to be a history of computer rather than console gaming, the PlayStation will remain an inescapable presence even here, lurking constantly in the background as both a promise and a threat.


Where to Get It: Broken Sword II: The Smoking Mirror is available as a digital download at GOG.com.





Sources: the book Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene; Retro Gamer 6, 31, 63, 146, and 148; GameFan of February 1998; PlayStation Magazine of February 1998; The Telegraph of January 4 2011. Online sources include Charles Cecil’s interviews with Anthony Lacey of Dining with Strangers, John Walker of Rock Paper Shotgun, Marty Mulrooney of Alternative Magazine Online, and Peter Rootham-Smith of Game Boomers.

Footnotes
1 LucasArts was actually still known as Lucasfilm Games at the time.
2 The Genesis was known as the Mega-Drive in Japan and Europe.
 
 


A Dialog in Real Time (Strategy)

At the end of the 1990s, the two most popular genres in computer gaming were the first-person shooter and the real-time strategy game. They were so dominant that most of the industry’s executives seemed to want to publish little else. And yet at the beginning of the decade neither genre even existed.

The stories of how the two rose to such heady heights are a fascinating study in contrasts, of how influences in media can either go off like an explosion in a TNT factory or like the slow burn of a long fuse. Sometimes something appears and everyone knows instantly that it’s just changed everything; when the Beatles dropped Sgt. Pepper’s Lonely Hearts Club Band in 1967, there was no doubt that the proverbial goalposts in rock music had just been shifted. Other times, though, influence can take years to make itself felt, as was the case for another album of 1967, The Velvet Underground & Nico, about which Brian Eno would later famously say that it “only sold 10,000 copies, but everyone who bought it formed a band.”

Games are the same. Gaming’s Sgt. Pepper was DOOM, which came roaring up out of the shareware underground at the tail end of 1993 to sweep everything from its path, blowing away all of the industry’s extant conventional wisdom about what games would become and what role they would play in the broader culture. Gaming’s Velvet Underground, on the other hand, was the avatar of real-time strategy, which came to the world in the deceptive guise of a sequel in the fall of 1992. Dune II: The Building of a Dynasty sported its Roman numeral because its transnational publisher had gotten its transatlantic cables crossed and accidentally wound up with two separate games based on Frank Herbert’s epic 1965 science-fiction novel: one made in Paris, the other in Las Vegas. The former turned out to be a surprisingly evocative and playable fusion of adventure and strategy game, but it was the latter that would quietly — oh, so quietly in the beginning! — shift the tectonic plates of gaming.

For Dune II, which was developed by Westwood Studios and published by Virgin Games, really was the first recognizable implementation of the genre of real-time strategy as we have come to know it since. You chose one of three warring trading houses to play, then moved through a campaign made up of a series of set-piece scenarios, in which your first goal was always to make yourself an army by gathering resources and using them to build structures that could churn out soldiers, tanks, aircraft, and missiles, all of which you controlled by issuing them fairly high-level orders: “go here,” “harvest there,” “defend this building,” “attack that enemy unit.” Once you thought you were strong enough, you could launch your full-on assault on the enemy — or, if you weren’t quick enough, you might find yourself trying to fend off his attack. What made it so different from most of the strategy games of yore was right there in the name: in the fact that it all played out in real time, at a pace that ranged from the brisk to the frantic, making it a test of your rapid-fire mousemanship and your ability to think on your feet. Bits and pieces of all this had been seen before — perhaps most notably in Peter Molyneux and Bullfrog’s Populous and the Sega Genesis game Herzog Zwei — but Dune II was where it all came together to create a gaming paradigm for the ages.

That said, Dune II was very much a diamond in the rough, a game whose groundbreaking aspirations frequently ran up against the brick wall of its limitations. It’s likely to leave anyone who has ever played almost any other real-time-strategy game seething with frustration. It runs at a resolution of just 320 X 200, giving only the tiniest window into the battlefield; it only lets you select and control one unit at a time, making coordinated attacks and defenses hard to pull off; its scenarios are somewhat rote exercises, differing mainly in the number of enemy hordes they throw against you as you advance through the campaign rather than the nature of the terrain or your objectives. Even its fog of war is wonky: the whole battlefield is blank blackness until one of your units gets within visual range, after which you can see everything that goes on there forevermore, whether any of your units can still lay eyes on it or not. And it has no support whatsoever for the multiplayer free-for-alls that are for many or most players the biggest draw of the genre.
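To make that fog-of-war complaint concrete: later real-time-strategy games track two layers, terrain you have explored and terrain one of your units can currently see, while Dune II effectively collapses them into one. Here is a toy sketch of the distinction, in Python of my own devising rather than anything from Westwood’s code:

```python
# A toy contrast between Dune II's one-way fog of war and the two-layer
# "explored vs. currently visible" model that later became standard.
# Hypothetical code of my own, not Westwood's.

def reveal(explored: set, visible: set, unit_pos: tuple, sight: int) -> None:
    ux, uy = unit_pos
    for x in range(ux - sight, ux + sight + 1):
        for y in range(uy - sight, uy + sight + 1):
            explored.add((x, y))  # Dune II stops here: once seen, always seen
            visible.add((x, y))   # later games also track current line of sight

explored: set = set()
visible: set = set()
reveal(explored, visible, unit_pos=(5, 5), sight=2)

# Each frame, later games clear `visible` and rebuild it from unit positions, so
# activity in explored-but-unwatched territory stays hidden. Dune II never clears
# it, which is exactly the wonkiness described above.
visible.clear()
```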

Certainly Virgin had no inkling that they had a nascent ludic revolution on their hands. They released Dune II with more of an indifferent shrug than a fulsome fanfare, having expended most of their promotional energies on the other Dune, which had come out just a few months earlier. It’s a testimony to the novelty of the gameplay experience that it did as well as it did. It didn’t become a massive hit, but it sold well enough to earn its budget back and then some on the strength of reasonably positive reviews — although, again, no reviewer had the slightest notion that he was witnessing the birth of what would be one of the two hottest genres in gaming six years in the future. Even Westwood seemed initially to regard Dune II as a one-and-done. They wouldn’t release another game in the genre they had just invented for almost three years.

But the gaming equivalent of all those budding bedroom musicians who listened to that Velvet Underground record was also out there in the case of Dune II. One hungry, up-and-coming studio in particular decided there was much more to be done with the approach it had pioneered. And then Westwood themselves belatedly jumped back into the fray. Thanks to the snowball that these two studios got rolling in earnest during the mid-1990s, the field of real-time strategy would be well and truly saturated by the end of the decade, the yin to DOOM‘s yang. This, then, is the tale of those first few years of these two studios’ competitive dialog, over the course of which they turned the real-time strategy genre from a promising archetype into one of gaming’s two biggest, slickest crowd pleasers.


Blizzard is one of the most successful studios in the history of gaming, so much so that it now lends its name to the Activision Blizzard conglomerate, with annual revenues in the range of $7.5 billion. In 1993, however, it was Westwood, flying high off the hit dungeon crawlers Eye of the Beholder and Lands of Lore, that was by far the more recognizable name. In fact, Blizzard wasn’t even known yet as Blizzard.

The company had been founded in late 1990 by Allen Adham and Mike Morhaime, a couple of kids fresh out of university, on the back of a $15,000 loan from Morhaime’s grandmother. They called their venture Silicon & Synapse, setting it up in a hole-in-the-wall office in Costa Mesa, California. They kept the lights on initially by porting existing games from one platform to another for publishers like Interplay — the same way, as it happened, that Westwood had gotten off the ground almost a decade before. And just as had happened for Westwood, Silicon & Synapse gradually won opportunities to make their own games once they had proven themselves by porting those of others. First there was a little auto-racing game for the Super Nintendo called RPM Racing, then a pseudo-sequel to it called Rock ‘n’ Roll Racing, and then a puzzle platformer called The Lost Vikings, which appeared for the Sega Genesis, MS-DOS, and the Commodore Amiga in addition to the Super Nintendo. None of these titles took the world by storm, but they taught Silicon & Synapse what it took to create refined, playable, mass-market videogames from scratch. All three of those adjectives have continued to define the studio’s output for the past 30 years.

It was now mid-1993; Silicon & Synapse had been in business for more than two and a half years already. Adham and Morhaime wanted to do something different — something bigger, something that would be suitable for computers only rather than the less capable consoles, a real event game that would get their studio’s name out there alongside the Westwoods of the world. And here there emerged another of their company’s future trademarks: rather than invent something new from whole or even partial cloth, they decided to start with something that already existed, but make it better than ever before, polishing it until it gleamed. The source material they chose was none other than Westwood’s Dune II, now relegated to the bargain bins of last year’s releases, but a perennial after-hours favorite at the Silicon & Synapse offices. They all agreed as to the feature they most missed in Dune II: a way to play it against other people, like you could its ancestor Populous. The bane of most multiplayer strategy games was their turn-based nature, which left you waiting around half the time while your buddy was playing. Real-time strategy wouldn’t have this problem of downtime.

That became the design brief for Warcraft: Orcs & Humans: remake Dune II but make it even better, and then add a multiplayer feature. And then, of course, actually try to sell the thing in all the ways Virgin had not really tried to sell its inspiration.

To say that Warcraft was heavily influenced by Dune II hardly captures the reality. Most of the units and buildings to hand have a direct correspondent in Westwood’s game. Even the menu of icons on the side of the screen is a virtual carbon copy — or at least a mirror image. “I defensively joked that, while Warcraft was certainly inspired by Dune II, [our] game was radically different,” laughs Patrick Wyatt, the lead programmer and producer on the project. “Our radar mini-map was in the upper left corner of the screen, whereas theirs was in the bottom right corner.”

In the same spirit of change, Silicon & Synapse replaced the desert planet of Arrakis with a fantasy milieu pitting, as the subtitle would suggest, orcs against humans. The setting and the overall look of Warcraft owe almost as much to the tabletop miniatures game Warhammer as the gameplay does to Dune II; a Warhammer license was seriously considered, but ultimately rejected as too costly and potentially too restrictive. Years later, Wyatt’s father would give him a set of Warhammer miniatures he’d noticed in a shop: “I found these cool toys and they reminded me a lot of your game. You might want to have your legal department contact them because I think they’re ripping you off.”

Suffice to say, then, that Warcraft was even more derivative than most computer games. The saving grace was the same that it would ever be for this studio: that they executed their mishmash of influences so well. The squishy, squint-eyed art is stylized like a cartoon, a wise choice given that the game is still limited to a resolution of just 320 X 200, so that photo-realism is simply not on the cards. The overall look of Warcraft has more in common with contemporary console games than the dark, gritty aesthetic that was becoming so popular on computers. The guttural exclamations of the orcs and the exaggerated Monty Python and the Holy Grail-esque accents of the humans, all courtesy of regular studio staffers rather than outside voice actors, become a chorus line as you order them hither and yon, making Dune II seem rather stodgy and dull by comparison. “We felt too many games took themselves too seriously,” says Patrick Wyatt. “We just wanted to entertain people.”

Slavishly indebted though it is to Dune II in all the broad strokes, Warcraft doesn’t neglect to improve on its inspiration in those nitty-gritty details that can make the difference between satisfaction and frustration for the player. It lets you select up to four units and give them orders at the same time by simply dragging a box around them, a quality-of-life addition whose importance is difficult to overstate, one so fundamental that no real-time-strategy game from this point forward would dare not to include it. Many more keyboard shortcuts are added, a less technically impressive addition but one no less vital to the cause of playability when the action starts to heat up. There are now two resources you need to harvest, lumber and gold, in place of Dune II‘s all-purpose spice. Units are now a little more intelligent about interpreting your orders, such that they no longer blithely ignore targets of opportunity, or let themselves get mauled to death without counterattacking just because you haven’t explicitly told them to. Scenario design is another area of marked improvement: whereas every Dune II scenario is basically the same drill, just with ever more formidable enemies to defeat, Warcraft‘s are more varied and arise more logically out of the story of the campaign, including a couple of special scenarios with no building or gathering at all, where you must return a runaway princess to the fold (as the orcs) or rescue a stranded explorer (as the humans).
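Box selection in particular is conceptually simple, which only makes its absence from Dune II sting more in hindsight. A toy sketch of the idea follows; everything here is my own illustration except the four-unit cap, which really was Warcraft’s limit.

```python
# A toy sketch of drag-a-box unit selection, capped at Warcraft's four units.

from typing import NamedTuple

class Unit(NamedTuple):
    name: str
    x: int
    y: int

MAX_SELECTION = 4  # Warcraft's limit; later games raised or removed it

def box_select(units: list[Unit], x1: int, y1: int, x2: int, y2: int) -> list[Unit]:
    """Return up to MAX_SELECTION units inside the dragged rectangle."""
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    inside = [u for u in units if left <= u.x <= right and top <= u.y <= bottom]
    return inside[:MAX_SELECTION]

army = [Unit("grunt1", 2, 3), Unit("grunt2", 4, 5), Unit("peon", 9, 9),
        Unit("grunt3", 3, 4), Unit("grunt4", 5, 2), Unit("grunt5", 4, 4)]
print([u.name for u in box_select(army, 1, 1, 6, 6)])
# ['grunt1', 'grunt2', 'grunt3', 'grunt4'] -- the fifth grunt misses the cut
```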

The orc on the right who’s stroking his “sword” looks so very, very wrong — and this screenshot doesn’t even show the animation…

And, as the cherry on top, there was multiplayer support. Patrick Wyatt finished his first, experimental implementation of it in June of 1994, then rounded up a colleague in the next cubicle over so that they could become the first two people ever to play a full-fledged real-time-strategy game online. “As we started the game, I felt a greater sense of excitement than I’d ever known playing any other game,” he says.

It was just this magic moment, because it was so invigorating to play against a human and know that it wasn’t some stupid AI. It was a player who was smart and doing his absolute best to crush you. I knew we were making a game that would be fun, but at that moment I knew the game would absolutely kick ass.

While work continued on Warcraft, the company behind it was going through a whirlwind of changes. Recognizing at long last that “Silicon & Synapse” was actually a pretty terrible name, Adham and Morhaime changed it to Chaos Studios, which admittedly wasn’t all that much better, in December of 1993. Two months later, they got an offer they couldn’t refuse: Davidson & Associates, a well-capitalized publisher of educational software that was looking to break into the gaming market, offered to buy the freshly christened Chaos for the princely sum of $6.75 million. It was a massive over-payment for what was in all truth a middling studio at best, such that Adham and Morhaime felt they had no choice but to accept, especially after Davidson vowed to give them complete creative freedom. Three months after the acquisition, the founders decided they simply had to find a decent name for their studio before releasing Warcraft, their hoped-for ticket to the big leagues. Adham picked up a dictionary and started leafing through it. He hit pay dirt when his eyes flitted over the word “blizzard.” “It’s a cool name! Get it?” he asked excitedly. And that was that.

So, Warcraft hit stores in time for the Christmas of 1994, with the name of “Blizzard Entertainment” on the box as both its developer and its publisher — the wheels of the latter role being greased by the distributional muscle of Davidson & Associates. It was not immediately heralded as a game that would change everything, any more than Dune II had been; real-time strategy continued to be more of a slowly growing snowball than the ton of bricks to the side of the head that the first-person shooter had been. Computer Gaming World magazine gave Warcraft a cautious four stars out of five, saying that “if you enjoy frantic real-time games and if you don’t mind a linear structure in your strategic challenges, Warcraft is a good buy.” At the same time, the extent of the game’s debt to Dune II was hardly lost on the reviewer: “It’s a good thing for Blizzard that there’s no precedent for ‘look and feel’ lawsuits in computer entertainment.”[1]

Warcraft would eventually sell 400,000 units, bettering Dune II‘s numbers by a factor of four or more. As soon as it became clear that it was doing reasonably well, Blizzard started on a sequel.


Out of everyone who looked at Warcraft, no one did so with more interest — or with more consternation at its close kinship with Dune II — than the folks at Westwood. “When I played Warcraft, the similarities between it and Dune II were pretty… blatant, so I didn’t know what to think,” says the Westwood designer Adam Isgreen. Patrick Wyatt of Blizzard got the impression that his counterparts “weren’t exactly happy” at the slavish copying when they met up at trade shows, though he “reckoned they should have been pleased that we’d taken their game as a base for ours.” Only gradually did it become clear why Warcraft‘s existence was a matter of such concern for Westwood: because they themselves had finally decided to make another game in the style of Dune II.

The game that Westwood was making could easily have wound up looking even more like the one that Blizzard had just released. The original plan was to call it Command & Conquer: Fortress of Stone and to set it in a fantasy world. (Westwood had been calling their real-time-strategy engine “Command & Conquer” since the days of promoting Dune II.) “It was going to have goldmines and wood for building things. Sound familiar?” chuckles Westwood’s co-founder Louis Castle. “There were going to be two factions, humans and faerie folk… pretty fricking close to orcs versus humans.”

Some months into development, however, Westwood decided to change directions, to return to a science-fictional setting closer to that of Dune II. For they wanted their game to be a hit, and it seemed to them that fantasy wasn’t the best guarantee of such a thing: CRPGs were in the doldrums, and the most recent big strategy release with a fantasy theme, MicroProse’s cult-classic-to-be Master of Magic, hadn’t done all that well either. Foreboding near-future stories, however, were all the rage; witness the stellar sales of X-COM, another MicroProse strategy game of 1994. “We felt that if we were going to make something that was massive,” says Castle, “it had to be something that anybody and everybody could relate to. Everybody understands a tank; everybody understands a guy with a machine gun. I don’t have to explain to them what this spell is.” Westwood concluded that they had made the right decision as soon as they began making the switch in software: “Tanks and vehicles just felt better.” The game lost its subtitle to become simply Command & Conquer.

While the folks at Blizzard were plundering Warhammer for their units and buildings, those at Westwood were trolling the Jane’s catalogs of current military hardware and Soldier of Fortune magazine. “We assumed that anything that was talked about as possibly coming was already here,” says Castle, “and that was what inspired the units.” The analogue of Dune II‘s spice — the resource around which everything else revolved — became an awesomely powerful space-borne element come to earth known as Tiberium.

Westwood included most of the shortcuts and conveniences that Blizzard had built into Warcraft, but went one or two steps further more often than not. For example, they also made it possible to select multiple units by dragging a box around them, but in their game there was no limit to the number of units that could be selected in this way. The keyboard shortcuts they added not only let you quickly issue commands to units and buildings, but also jump around the map instantly to custom viewpoints you could define. And up to four players rather than just two could now play together at once over a local network or the Internet, for some true mayhem. Then, too, scenario design was not only more varied than in Dune II but was even more so than in Warcraft, with a number of “guerilla” missions in the campaigns that involved no resource gathering or construction. It’s difficult to say to what extent these were cases of parallel innovation and to what extent they were deliberate attempts to one-up what Warcraft had done. It was probably a bit of both, given that Warcraft was released a good nine months before Command & Conquer, which left Westwood plenty of time to study it.

But other innovations in Command & Conquer were without any precedent. The onscreen menus could now be toggled on and off, for instance, a brilliant stroke that gave you a better view of the battlefield when you really needed it. Likewise, Westwood differentiated the factions in the game in a way that had never been done before. Whereas the different houses in Dune II and the orcs and humans in Warcraft corresponded almost unit for unit, the factions in Command & Conquer reflected sharply opposing military philosophies, demanding markedly different styles of play: the establishment Global Defense Initiative had slow, strong, and expensive units, encouraging a methodical approach to building up and husbanding your forces, while the terroristic Brotherhood of Nod had weaker but faster and cheaper minions better suited to madcap kamikaze rushes than carefully orchestrated combined-arms operations.

Yet the most immediately obvious difference between Command & Conquer and Warcraft was all the stuff around the game. Warcraft had been made on a relatively small budget with floppy disks in mind. It sported only a brief opening cinematic, after which scenario briefings consisted of nothing but scrolling text and a single voice over a static image. Command & Conquer, by contrast, was made for CD-ROM from the outset, by a studio with deeper pockets that had invested a great deal of time and energy into both 3D animation and full-motion video, that trendy art of incorporating real-world actors and imagery into games. The much more developed story line of Command & Conquer is forwarded by little between-mission movies that, if not likely to make Steven Spielberg nervous, are quite well-done for what they are, featuring as they do mostly professional performers — such as a local Las Vegas weatherman playing a television-news anchorman — who were shot by a real film crew in Westwood’s custom-built blue-screen studio. Westwood’s secret weapon here was Joseph Kucan, a veteran theater director and actor who oversaw the film shoots and personally played the charismatic Nod leader Kane so well that he became the very face of Command & Conquer in the eyes of most gamers, arguably the most memorable actual character ever associated with a genre better known for its hordes of generic little automatons. Louis Castle reckons that at least half of Command & Conquer‘s considerable budget went into the cut scenes.

The game was released with high hopes in August of 1995. Computer Gaming World gave it a pretty good review, four stars out of five: “The entertainment factor is high enough and the action fast enough to please all but the most jaded wargamers.”

The gaming public would take to it even more than that review might imply. But in the meantime…


As I noted in an earlier article, numbered sequels weren’t really commonplace for strategy games prior to the mid-1990s. Blizzard had originally imagined Warcraft as a strategy franchise of a different stripe: each game bearing the name would take the same real-time approach into a completely different milieu, as SSI was doing at the time with their “5-Star General” series of turn-based strategy games that had begun with Panzer General and continued with the likes of Fantasy General and Star General. But Blizzard soon decided to make their sequel a straight continuation of the first game, an approach to which real-time strategy lent itself much more naturally than more traditional styles of strategy game; the set-piece story of a campaign could, after all, always be continued using all the ways that Hollywood had long since discovered for keeping a good thing going. The only snafu was that either the orcs or the humans could presumably have won the war in the first game, depending on which side the player chose. No matter: Blizzard decided the sequel would be more interesting if the orcs had been the victors and ran with that.

Which isn’t to say that building upon its predecessor’s deathless fiction was ever the real point of Warcraft II: Tides of Darkness. Blizzard knew now that they had a competitor in Westwood, and were in any case eager to add to the sequel all of the features and ideas that time had not allowed them to include in the first game. There would be waterways and boats to sail on them, along with oil, a third resource, one that could only be mined at sea. Both sides would get new units to play with, while elves, dwarves, trolls, ogres, and goblins would join the fray as allies of one of the two main racial factions. The interface would be tweaked with another welcome shortcut: selecting a unit and right-clicking somewhere would cause it to carry out the most logical action there without having to waste time choosing from a menu. (After all, if you selected a worker unit and sent him to a goldmine, you almost certainly wanted him to start collecting gold. Why should you have to tell the game the obvious in some more convoluted fashion?)
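To make the right-click idea concrete, here is a small Python sketch of how such a context-sensitive default order might be dispatched. It is a guess at the general shape of the logic, not Blizzard's implementation; all of the attribute names (occupant, is_enemy, can_harvest, and so on) are hypothetical.

```python
# Hypothetical sketch of Warcraft II-style context-sensitive right-clicking:
# pick the "most logical" order for whatever the player clicked on.

def default_order(unit, clicked):
    thing = clicked.occupant  # whatever sits on the clicked tile, or None
    if thing is None:
        return ("move", clicked.tile)      # empty ground: just walk there
    if thing.is_enemy:
        return ("attack", thing)           # hostile unit or building
    if unit.can_harvest and thing.is_resource:
        return ("harvest", thing)          # a peon sent to a goldmine starts mining
    if unit.carrying and thing.is_own_depot:
        return ("deliver", thing)          # drop off gold or lumber
    return ("move", clicked.tile)          # anything else: just move there
```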

But perhaps the most vital improvement was in the fog of war. The simplistic implementations of same seen in the first Warcraft and Command & Conquer were inherited from Dune II: areas of the map that had been seen once by any of your units were revealed permanently, even if said units went away or were destroyed. Blizzard now made it so that you would see only a back-dated snapshot of areas currently out of your units’ line of sight, reflecting what was there the last time one of your units had eyes on them. This innovation, no mean feat of programming on the part of Patrick Wyatt, brought a whole new strategic layer to the game. Reconnaissance suddenly became something you had to think about all the time, not just once.
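As a rough illustration of what that entails, here is a minimal Python sketch of a "last seen" fog of war, assuming hypothetical unit and world objects and simplifying each unit's sight area to a square. It shows only the core idea described above, keeping a stale per-tile snapshot alongside the live map; it is not Wyatt's actual implementation.

```python
# Minimal sketch of a "last seen" fog of war. Not Blizzard's code; the unit
# and world objects here (sight, x, y, tile()) are hypothetical stand-ins.

UNSEEN, REMEMBERED, VISIBLE = 0, 1, 2


class FogOfWar:
    def __init__(self, width, height):
        self.width, self.height = width, height
        # Current visibility state of every tile.
        self.state = [[UNSEEN] * width for _ in range(height)]
        # Stale copy of what each tile looked like when last seen.
        self.snapshot = [[None] * width for _ in range(height)]

    def update(self, units, world):
        # Demote everything that was visible last frame to a remembered snapshot...
        for row in self.state:
            for x in range(self.width):
                if row[x] == VISIBLE:
                    row[x] = REMEMBERED
        # ...then re-reveal whatever lies within some unit's sight radius,
        # refreshing the snapshot for those tiles only.
        for u in units:
            r = u.sight
            for y in range(max(0, u.y - r), min(self.height, u.y + r + 1)):
                for x in range(max(0, u.x - r), min(self.width, u.x + r + 1)):
                    self.state[y][x] = VISIBLE
                    self.snapshot[y][x] = world.tile(x, y)

    def tile_to_draw(self, x, y, world):
        # Visible tiles show live data, remembered tiles show the stale
        # snapshot, and unseen tiles show nothing at all.
        if self.state[y][x] == VISIBLE:
            return world.tile(x, y)
        if self.state[y][x] == REMEMBERED:
            return self.snapshot[y][x]
        return None
```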

Other improvements were not so conceptually groundbreaking, but no less essential for keeping ahead of the Joneses (or rather the Westwoods). For example, Blizzard raised the screen-resolution stakes, from 320 X 200 to 640 X 480, even as they raised the number of people who could play together online from Command & Conquer‘s four to eight. And, while there was still a limit on the number of units you could select at one time using Blizzard’s engine, that limit at least got raised from the first Warcraft‘s four to nine.

The story and its presentation, however, didn’t get much more elaborate than last time out. While Westwood was hedging its bets by keeping one foot in the “interactive movie” space of games like Wing Commander III, Blizzard was happy to “just” make Warcraft a game. The two series were coming to evince very distinct personalities and philosophies, just as gamers were sorting themselves into opposing groups of fans — with a large overlap of less partisan souls in between them, of course.

Released in December of 1995, Warcraft II managed to shake Computer Gaming World free of some of its last reservations about the burgeoning genre of real-time strategy, garnering four and a half stars out of five: “If you enjoy fantasy gaming, then this is a sure bet for you.” It joined Command & Conquer near the top of the bestseller lists, becoming the game that well and truly made Blizzard a name to be reckoned with, a peer in every sense with Westwood.

Meanwhile, and despite the sometimes bitter rivalry between the two studios and their fans, Command & Conquer and Warcraft II together made real-time strategy into a commercial juggernaut. Both games became sensations, with no need to shrink from comparison to even DOOM in terms of their sales and impact on the culture of gaming. Each eventually sold more than 3 million copies, numbers that even the established Westwood, much less the upstart Blizzard, had never dreamed of reaching before, enough to enshrine both games among the dozen or so most popular computer games of the entire 1990s. More than three years after real-time strategy’s first trial run in Dune II, the genre had arrived for good and all. Both Westwood and Blizzard rushed to get expansion packs of additional scenarios for their latest entries in the genre to market, even as dozens of other developers dropped whatever else they were doing in order to make real-time-strategy games of their own. Within a couple of years, store shelves would be positively buckling under the weight of their creations — some good, some bad, some more imaginative, some less so, but all rendered just a bit anonymous by the sheer scale of the deluge. And yet even the most also-ran of the also-rans sold surprisingly well, which explained why they just kept right on coming. Not until well into the new millennium would the tide begin to slacken.


With Command & Conquer and Warcraft II, Westwood and Blizzard had arrived at an implementation of real-time strategy that even the modern player can probably get on with. Yet there is one more game that I just have to mention here because it’s so loaded with a quality that the genre is known for even less than its characters: that of humor. Command & Conquer: Red Alert is as hilarious as it is unexpected, the only game of this style that’s ever made me laugh out loud.

Red Alert was first envisioned as a scenario pack that would move the action of its parent game to World War II. But two things happened as work progressed on it: Westwood decided it was different enough from the first game that it really ought to stand alone, and, as designer Adam Isgreen says, “we found straight-up history really boring for a game.” What they gave us instead of straight-up history is bat-guano insane, even by the standards of videogame fictions.

We’re in World War II, but in a parallel timeline, because Albert Einstein — why him? I have no idea! — chose to travel back in time on the day of the Trinity test of the atomic bomb and kill Adolf Hitler. Unfortunately, all that’s accomplished is to make world conquest easier for Joseph Stalin. Now Einstein is trying to save the democratic world order by building ever more powerful gadgets for its military. Meanwhile the Soviet Union is experimenting with the more fantastical ideas of Nikola Tesla, which in this timeline actually work. So, the battles just keep getting crazier and crazier as the game wears on, with teleporters sending units jumping instantly from one end of the map to the other, Tesla coils zapping them with lightning, and a fetching commando named Tanya taking out entire cities all by herself when she isn’t chewing the scenery in the cut scenes. Those actually display even better production values than the ones in the first game, but the script has become pure, unadulterated camp worthy of Mel Brooks, complete with a Stalin who ought to be up there singing and dancing alongside Der Führer in Springtime for Hitler. Even our old friend Kane shows up for a cameo. It’s one of the most excessive spectacles of stupidity I’ve ever seen in a game… and one of the funniest.

Joseph Stalin gets rough with an underling. When you don’t have the Darth Vader force grip, you have to do things the old-fashioned way…

Up there at the top is the killer commando Tanya, who struts across the battlefield with no regard for proportion.

Released in the dying days of 1996, Red Alert didn’t add that much that was new to the real-time-strategy template, technically speaking; in some areas such as fog of war, it still lagged behind the year-old Warcraft II. Nonetheless, it exudes so much joy that it’s by far my favorite of the games I’ve written about today. If you ask me, it would have been a better gaming world had the makers of at least a few of the po-faced real-time-strategy games that followed looked here for inspiration. Why not? Red Alert too sold in the multiple millions.






(Sources: the book Stay Awhile and Listen, Book I by David L. Craddock; Computer Gaming World of January 1995, March 1995, December 1995, March 1996, June 1996, September 1996, December 1996, March 1997, June 1997, and July 1997; Retro Gamer 48, 111, 128, and 148; The One of January 1993; the short film included with the Command & Conquer: The First Decade game collection. Online sources include Patrick Wyatt’s recollections at his blog Code of Honor, Dan Griliopoulos’s collection of interviews with Westwood alumni at Funambulism, Soren Johnson’s interview with Louis Castle for his Designer’s Notes podcast, and Richard Moss’s real-time-strategy retrospective for Ars Technica.

Warcraft: Orcs & Humans and Warcraft II: Tides of Darkness are available as digital purchases at GOG.com. The first Command & Conquer and Red Alert are available in remastered versions as a bundle from Steam.)

Footnotes

1 This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.


Broken Sword: The Shadow of the Templars

The games of Revolution Software bore the stamp of the places in which they were conceived. Work on Beneath a Steel Sky, the company’s breakthrough graphic adventure, began in Hull, a grim postindustrial town in the north of England, and those environs were reflected in the finished product’s labyrinths of polluted streets and shuttered houses. But by the time Revolution turned to the question of a follow-up, they had upped stakes for the stately city of York. “We’re surrounded by history here,” said Revolution co-founder Tony Warriner. “York is a very historical city.” Charles Cecil, Revolution’s chief motivating force in a creative sense, felt inspired to make a very historical game.

The amorphous notion began to take a more concrete form after he broached the idea over dinner one evening to Sean Brennan, his main point of contact at Revolution’s publisher Virgin Interactive. Brennan said that he had recently struggled through Umberto Eco’s infamously difficult postmodern novel Foucault’s Pendulum, an elaborate satire of the conspiratorial view of history which is so carefully executed that its own conspiracy theories wind up becoming more convincing than most good-faith examples of the breed. Chasing a trail of literally and figuratively buried evidence across time and space… it seemed ideal for an adventure game. Why not do something like that? Perhaps the Knights Templar would make a good starting point. Thus was born Broken Sword: The Shadow of the Templars.



Our respectable books of history tell us that the Knights Templar was a rich and powerful but relatively brief-lived chivalric order of the late Middle Ages in Europe. It was founded in 1119 and torn up root and branch by a jealous King Philip IV of France and Pope Clement V in 1312. After that, it played no further role in history. Or did it?

People have been claiming for centuries that the order wasn’t really destroyed at all, that it just went underground in one sense or another. Meanwhile other conspiracy theories — sometimes separate from, sometimes conjoined with the aforementioned — have posited that the Knights left a fabulous hidden treasure behind somewhere, which perchance included even the Holy Grail of Arthurian legend.

In the 1960s, the old stories were revived and adapted into a form suitable for modern pop culture by a brilliant French fabulist named Pierre Plantard, who went so far as to plant forged documents in his homeland’s Bibliothèque Nationale. Three Anglo authors ingeniously expanded upon his deceptions — whether they were truly taken in by them or merely saw them as a moneymaking opportunity is unclear — in 1982 in the book The Holy Blood and the Holy Grail. It connected the Knights Templar to another, more blasphemous conspiracy theory: that Jesus Christ had not been celibate as stated in the New Testament, nor had his physical form actually died on the cross. He had rather run away with Mary Magdalene and fathered children with her, creating a secret bloodline that has persisted to the present day. The Knights Templar were formed to guard the holy bloodline, a purpose they continue to fulfill. Charles Cecil freely admits that it was The Holy Blood and the Holy Grail that really got his juices flowing.

It isn’t hard to see why. It’s a rare literary beast: a supposedly nonfiction book full of patent nonsense that remains thoroughly entertaining to read even for the person who knows what a load of tosh it all is. In his review of it back in 1982, Anthony Burgess famously wrote that “it is typical of my unregenerable soul that I can only see this as a marvelous theme for a novel.” Many others have felt likewise over the years since. If Umberto Eco’s unabashedly intellectual approach doesn’t strike your fancy, you can always turn to The Da Vinci Code, Dan Brown’s decidedly more populist take on the theme from 2003 — one of the most successful novels of the 21st century, the founder of a veritable cottage industry of sequels, knock-offs, and cinematic adaptations. (Although Brown himself insists that he didn’t use The Holy Blood and the Holy Grail for a crib sheet when writing his novel, pretty much no one believes him.)

For all their convoluted complexity, conspiracy theories are the comfort food of armchair historians. They state that the sweeping tides of history are not the result of diffuse, variegated, and ofttimes unease-inducing social and political impulses, but can instead all be explained by whatever shadowy cabal they happen to be peddling. It’s a clockwork view of history, A leading to B leading to C, which conveniently absolves us and our ancestors who weren’t pulling the strings behind the scenes of any responsibility for the state of the world. I’ve often wondered if the conspiratorial impulse in modern life stems at least in part from our current obsession with granular data, our belief that all things can be understood if we can just collect enough bits and bytes and analyze it all rigorously enough. Such an attitude makes it dangerously easy to assemble the narratives we wish to be true out of coincidental correlations. The amount of data at our fingertips, it seems to me, has outrun our wisdom for making use of it.

But I digress. As Burgess, Eco, and Brown all well recognized, outlandish conspiracy theories can be outrageously entertaining, and are harmless enough if we’re wise enough not to take them seriously. Add Charles Cecil to that list as well: “I was convinced a game set in the modern day with this history that resonated from Medieval times would make a very compelling subject.”

As he began to consider how to make a commercial computer game out of the likes of The Holy Blood and the Holy Grail, Cecil realized that he needed to stay well away from the book’s claims about Jesus Christ; the last thing Revolution Software or Virgin Interactive needed was to become the antichrist in the eyes of scandalized Christians all over the world. So, he settled on a less controversial vision of the Knights Templar, centering on their alleged lost treasure — a scavenger hunt was, after all, always a good fit for an adventure game — and a fairly nondescript conspiracy eager to get their hands on it for a spot of good old world domination for the sake of it.

Cecil and some of his more committed fans have occasionally noted some surface similarities between his game and The Da Vinci Code, which was published seven years later, and hinted that Dan Brown may have been inspired by the game as well as by The Holy Blood and the Holy Grail. In truth, though, the similarities would appear to be quite natural for fictions based on the same source material.

Indeed, I’ve probably already spent more time on the historical backstory of Broken Sword here than it deserves, considering how lightly it skims the surface of the claims broached in The Holy Blood and the Holy Grail and elsewhere. Suffice to say that the little bit of it that does exist here does a pretty good job of making you feel like you’re on the trail of a mystery ancient and ominous. And that, of course, is all it really needs to do.



In addition to being yet another manifestation of pop-culture conspiracy theorizing, Broken Sword was a sign of the times for the industry that produced it. Adventure games were as big as they would ever get in 1994, the year the project was given the green light by Virgin. Beneath a Steel Sky had gotten good reviews and was performing reasonably well in the marketplace, and Virgin was willing to invest a considerable sum to help Revolution take their next game to the proverbial next level, to compete head to head with Sierra and LucasArts, the titans of American adventure gaming. Broken Sword‘s final production cost would touch £1 million, making it quite probably the most expensive game yet made in Britain.

Having such a sum of money at their disposal transformed Revolution’s way of doing business. Some 50 different people in all contributed to Broken Sword, a five-fold increase over the staff hired for Beneath a Steel Sky. Artist Dave Gibbons, whose distinctive style had done so much to make the previous game stand out from the pack, was not among them, having moved on to other endeavors. But that was perhaps for the best; Gibbons was a comic-book artist, adept at crafting striking static images. Broken Sword, on the other hand, would have lots of motion, would be more of an interactive cartoon than an interactive comic.

To capture that feel, Charles Cecil went to Dublin, Ireland, where the animator Don Bluth ran the studio behind such films as The Land Before Time, All Dogs Go to Heaven, and Thumbelina. There he met one Eoghan Cahill, who had been working with Bluth for years, and got a hasty education on what separates the amateurs from the professionals in the field. Cecil:

I have to say, I didn’t take layout all that seriously. But he asked me about layout, and I showed him some of the stuff we were working on. And he looked at me and said, “This is not good enough.” I felt rather hurt. He said, “You need to see my stuff and you need to employ me.” So I had a look at his stuff, and it was so beautiful.

I said, “I think I really do need to employ you.” And indeed, he came to work at Revolution as a layout artist.

Although Don Bluth himself had nothing to do with the game, Broken Sword is as marked by the unique sensibility he inculcated in his artists as Beneath a Steel Sky is by that of Dave Gibbons. The opening movie is a bravura sequence by any standard, a tribute not only to the advantages of Super VGA graphics and CD-ROM — Revolution’s days of catering to more limited machines like the Commodore Amiga were now behind them — but to the aesthetic sophistication which Cahill brought to the project. Broken Sword‘s “pixel art,” as the kids call it today, remains mouth-wateringly luscious to look upon, something which most certainly cannot be said of the jaggy 3D productions of the mid-1990s.

The view with which the intro movie begins is a real one from the bell tower of Notre Dame Cathedral.

It’s worth dwelling on this movie a bit, for it does much to illustrate how quickly both Revolution and the industry to which they belonged were learning and expanding their horizons. Consider the stirring score by the noted film, television, and theater composer and conductor Barrington Pheloung, which is played by a real orchestra on real instruments — a growing trend in games in general at the time, which would have been unimaginable just a few years earlier for both technical and budgetary reasons.

Then, too, consider the subtle sophistication of the storytelling techniques that are employed here, from the first foreshadowing voice-over — the only dialog in the whole sequence — to the literal bang that finishes it. Right after the movie ends, you take control amidst the chaos on the sidewalk that follows the explosion. Assuming you aren’t made of the same stuff as that Notre Dame gargoyle, you’re already thoroughly invested at this point in figuring out what the heck just happened. The power of an in medias res opening like this one to hook an audience was well known to William Shakespeare, but has tended to elude many game developers. Charles Cecil:

There are two ways to start a game. You can give lots of background about a character and what he or she is doing or you can start in a way that is [in] the player’s control, and that’s what I wanted. I thought that since the player controlled the character and associated with him, I could afford to start a game without giving away a great deal about the character. So in the first scene, I didn’t want a long exposition. George is drawn into the plot unwillingly, having been caught up in an explosion, and he wants to do the right thing in finding out what was behind it.

All told, the jump in the quality of storytelling and writing from Beneath a Steel Sky to Broken Sword is as pronounced as the audiovisual leap. Beneath a Steel Sky isn’t really a poorly written game in comparison to others of its era, but the script at times struggles to live up to Dave Gibbons’s artwork. It bears the telltale signs of a writer not quite in control of his own material, shifting tones too jarringly and lapsing occasionally into awkward self-referential humor when it ought to be playing it straight.

None of that is the case with Broken Sword. This game’s writers know exactly where they want to go and have the courage of their conviction that they can get there. This is not to say that it’s dour — far from it; one of the greatest charms of the game is that it never takes itself too seriously, never forgets that it is at bottom just an exercise in escapist entertainment.

Remarkably, the improvement in this area isn’t so much a credit to new personnel as to the usual suspects honing their craft. Revolution’s games were always the vision of Charles Cecil, but, as he admits, he’s “not the world’s greatest writer.” Therefore he had relied since the founding of Revolution on one Dave Cummins to turn his broad outlines into a finished script. For Broken Sword, Cummins was augmented by a newcomer named Jonathan Howard, but the improvement in the writing cannot be down to his presence alone. The veterans at Revolution may have become harder to spot amidst the sea of new faces, but they were working as hard as anyone to improve, studying how film and television were put together and then applying the lessons to the game — but sparingly and carefully, mind you. Cecil:

When Broken Sword came out, we were riding on the back of these interactive movies. They were a disaster. The people knocking them out were being blinded; they wanted to rub shoulders with movie stars and producers, and the gaming elements were lost. They were out of touch with games. Of course, I am interested in film script-writing and I felt then and still do that there can be parallels with games. I felt we needed to learn from the movies with Broken Sword, but not mimic them. It was my intention to make Broken Sword cinematic — with great gameplay.

Revolution may have had global ambitions for Broken Sword, but it’s a deeply British game at heart, shot through with sly British humor. To properly appreciate any of that, however, we really need to know what the game is actually about, beyond the Knights Templar and international conspiracies of evil in the abstract.



Broken Sword‘s protagonist is an American abroad with the pitch-perfect name of George Stobbart, who is winningly portrayed in the original game and all four of its official sequels to date by voice actor Rolf Saxon. George is a painfully earnest everyman — or at least every-American — who in an earlier era might have been played on the silver screen by Jimmy Stewart. He wanders through the game’s foreign settings safely ensconced in the impenetrable armor of his nationality, a sight recognizable to any observer of Americans outside their natural habitat. To my mind the funniest line in the entire script comes when he’s accosted by an overzealous French police constable brandishing a pistol. “Don’t shoot!” he yells. “I’m an American!” Whole volumes of sociology and history could be written by way of unpacking those five words…

Anyway, as we saw in the movie above, the vacationing George is sitting in a Parisian café when a killer clown bombs the place to smithereens, in what seems to have been a deliberate — and unfortunately successful — act of murder against one particular patron. Earnest fellow that he is, George takes it upon himself to solve the crime, which proves to be much more than a random act of street violence. As he slowly peels the onion of the conspiracy behind it all, he has occasion to visit Ireland, Syria, Spain, and Scotland in addition to roaming the length and breadth of Paris, the home base for his investigations. And why does Paris feature so prominently? Well, it was close enough to Britain to make it easy for Revolution to visit in the name of research, but still held a powerful romantic allure for an Englishman of Cecil’s generation. “England was very poor in the 1960s and 1970s, and London was gray and drab,” he says. “Paris was smart. People walked differently and they wore brighter clothes. You sat in restaurants and ate amazing food. The mythology of Paris [in] Broken Sword came from that imagery of my younger days.”

George’s companion — constantly in research, from time to time in adventure, and potentially in romance — is one Nico, a French reporter with a sandpaper wit whom he meets at the scene of the bombing. She was originally created by the game’s writers to serve a very practical purpose, a trick that television and movie scriptwriters have been employing forever: in acting as a diegetic sounding board for George, she becomes a handy way to keep the player oriented and up to date with the ramifications of his latest discoveries, helping the player to keep a handle on what becomes a very complex mystery. In this sense, then, her presence is another sign of how Revolution’s writers were mastering their craft. “It meant we didn’t need to have lengthy one-man dialogs or 30 minutes of cut scenes,” says Charles Cecil.

The sexual tension between the oft-bickering pair — that classic “will they or won’t they?” dilemma — was initially a secondary consideration. It’s actually fairly understated in this first game, even as Nico herself is less prominent than she would later become; she spends the bulk of the game sitting in her apartment conducting vaguely defined “inquiries,” apparently by telephone, and waiting for another visit from George.[1]

So much for the characters. Now, back to the subject of humor:

There’s the time when George tells Nico that he’s just visited the costume shop whence he believes the bomber to have rented his clown suit. “Yeah, I like it. What are you supposed to be?” she asks. Da-dum-dum!

“I didn’t hire a costume,” answers our terminally earnest protagonist. “These are my clothes and you know it.”

And then there’s Nico and (a jealous) George’s discussion with a French historian about Britain’s status during the time of the Roman Empire. “To the Romans, the Mediterranean was the center of the universe,” says the historian. “Britain was a remote, unfriendly place inhabited by blue-painted savages.”

“It hasn’t changed much,” says Nico. Da-dum-dum-dum!

“Well, they’ve stopped painting themselves blue,” says our straight man George.

“Except when they go to a football match,” deadpans Nico. Da-dum-dum-dum-dum!

You get the idea. I should say that all of this is made funnier by the performances of the voice cast, who are clearly having a grand old time turning their accents up to eleven. (Like so many Anglosphere productions, Broken Sword seems to think that everyone speaks English all the time, just in funny ways and with a light salting of words like bonjour and merci.)

And yet — and this is the truly remarkable part — the campiness of it all never entirely overwhelms the plot. The game is capable of creating real dramatic tension and a palpable sense of danger from time to time. It demands to be taken seriously at such junctures; while you can’t lock yourself out of victory without knowing it, you can die. The game walks a tenuous tightrope indeed between drama and comedy, but it very seldom loses its balance.


It wasn’t easy being a writer of geopolitical thrillers in the 1990s, that period of blissful peace and prosperity in the West after the end of the Cold War and before the War on Terror, the resurgence of authoritarianism, a global pandemic, and a widespread understanding of the magnitude of the crisis of global warming. Where exactly was one to find apocalyptic conflicts in such a milieu? It’s almost chilling to watch this clip today. What seemed an example of typically absurd videogame evil in 1996 feels disturbingly relevant today — not the Knights Templar nonsense, that is, but all the real-world problems that are blamed on it. If only it was as simple as stamping out a single cabal of occultists…

It’s hard to reconcile Broken Sword‘s Syria, a place where horror exists only in the form of Knights Templar assassins, a peddler of dodgy kebobs, and — most horrifying of all — an American tourist in sandals and knee socks, with the reality of the country of today. The civil war that is now being fought there has claimed the lives of more than half a million people and shattered tens of millions more.

With Nico in her Parisian flat.

Wars and governments may come and go, but the pub life of Ireland is eternal.

A villa in Spain with a connection to the Knights Templar and a grouchy gardener whom George will need to outwit.

Amidst ruins of a Scottish castle fit for a work of Romantic art, on the cusp of foiling the conspirators’ nefarious plot.



Revolution spent an inordinate amount of time — fully two and a half years — honing their shot at the adventure-game big leagues. They were silent for so long that some in the British press consigned them to the “where are they now?” file. “Whatever happened to Revolution Software?” asked PC Zone magazine in January of 1996. “Two releases down the line, they seem to have vanished.”

Alas, by the time Broken Sword was finally ready to go in the fall of 1996, the public’s ardor for the adventure genre had begun to dissipate. Despite a slew of high-profile, ambitious releases, 1996 had yet to produce a million-selling hit like the previous year’s Phantasmagoria, or like Myst the year before that. Especially in the United States, the industry’s focus was shifting to 3D action-oriented games, which not only sold better but were cheaper and faster to make than adventure games. In what some might call a sad commentary on the times, Virgin’s American arm insisted that the name of Broken Sword be changed to Circle of Blood. “They wanted it to be much more ‘bloody’ sounding,” says Charles Cecil.

For all of its high production values, the game was widely perceived by the American gaming press as a second-tier entry in a crowded field plagued by flagging enthusiasm. Computer Gaming World‘s review reads as a more reserved endorsement than the final rating of four stars out of five might imply. “The lengthy conversations often drag on before getting to the point,” wrote the author. If you had told her that Broken Sword — or rather Circle of Blood, as she knew it — would still be seeing sequels published in the second decade after such adventure standard bearers as King’s Quest and Gabriel Knight had been consigned to the videogame history books, she would surely have been shocked to say the least.

Ah, yes, Gabriel Knight… the review refers several times to that other series of adventure games masterminded by Sierra’s Jane Jensen. Even today, Gabriel Knight still seems to be the elephant in the room whenever anyone talks about Broken Sword. And on the surface, there really are a lot of similarities between the two. Both present plots that are, for all their absurdity, extrapolations on real history; both are very interested in inculcating a sense of place in their players; both feature a male protagonist and a female sidekick who develop feelings for one another despite their constant bickering, and whose rapport their audience developed feelings for to such an extent that they encouraged the developers to make the sidekick into a full-fledged co-star. According to one line of argument in adventure-game fandom, Broken Sword is a thinly disguised knock-off of Gabriel Knight. (The first game of Sierra’s series was released back in 1993, giving Revolution plenty of time to digest it and copy it.) Many will tell you that the imitation is self-evidently shallower and sillier than its richer inspiration.

But it seems to me that this argument is unfair, or at least incomplete. To begin with, the whole comparison feels more apt if you’ve only read about the games in question than if you’ve actually played them. Leaving aside the fraught and ultimately irrelevant question of influence — for the record, Charles Cecil and others from Revolution do not cite Gabriel Knight as a significant influence — there is a difference in craft that needs to be acknowledged. The Gabriel Knight games are fascinating to me not so much for what they achieve as for what they attempt. They positively scream out for critical clichés about reaches exceeding grasps; they’re desperate to elevate the art of interactive storytelling to some sort of adult respectability, but they never quite figure out how to do that while also being playable, soluble adventure games.

Broken Sword aims lower, yes, but hits its mark dead-center. From beginning to end, it oozes attention to the details of good game design. “We had to be very careful, and so we went through lots of [puzzles], seeing which ones would be fun,” says Charles Cecil. “These drive the story on, providing rewards as the player goes along, so we had to get them right.” One seldom hears similar anecdotes from the people who worked on Sierra’s games.

This, then, is the one aspect of Broken Sword I haven’t yet discussed: it’s a superb example of classic adventure design. Its puzzles are tricky at times, but never unclued, never random, evincing a respect for its player that was too often lost amidst the high concepts of games like Gabriel Knight.

Of course, if you dislike traditional adventure games on principle, Broken Sword will not change your mind. As an almost defiantly traditionalist creation, it resolves none of the fundamental issues with the genre that infuriate so many. The puzzles it sets in front of you seldom have much to do with the mystery you’re supposed to be unraveling. In the midst of attempting to foil a conspiracy of world domination, you’ll expend most of your brainpower on such pressing tasks as luring an ornery goat out of an Irish farmer’s field and scouring a Syrian village for a kebob seller’s lucky toilet brush. (Don’t ask!) Needless to say, most of the solutions George comes up with are, although typical of an adventure game, ridiculous, illegal, and/or immoral in any other context. The only way to play them is for laughs.

And this, I think, is what Broken Sword understands about the genre that Gabriel Knight does not. The latter’s puzzles are equally ridiculous (and too often less soluble), but the game tries to play it straight, creating cognitive dissonances all over the place. Broken Sword, on the other hand, isn’t afraid to lean into the limitations of its chosen genre and turn them into opportunities — opportunities, that is, to just be funny. Having made that concession, if concession it be, it finds that it can still keep its overarching plot from degenerating into farce. It’s a pragmatic compromise that works.

I like to think that the wisdom of its approach has been more appreciated in recent years, as even the more hardcore among us have become somewhat less insistent on adventure games as deathless interactive art and more willing to just enjoy them for what they are. Broken Sword may have been old-school even when it was a brand-new game, but it’s no musty artifact today. It remains as charming, colorful, and entertaining as ever, an example of a game whose reach is precisely calibrated to its grasp.

(Sources: the books The Holy Blood and the Holy Grail by Michael Baigent, Richard Leigh, and Henry Lincoln and Grand Thieves and Tomb Raiders: How British Video Games Conquered the World by Magnus Anderson and Rebecca Levene; Retro Gamer 31, 63, 146, and 148; PC Zone of January 1996; Computer Gaming World of February 1997. Online sources include Charles Cecil’s interviews with Anthony Lacey of Dining with Strangers, John Walker of Rock Paper Shotgun, Marty Mulrooney of Alternative Magazine Online, and Peter Rootham-Smith of Game Boomers.

Broken Sword: The Shadow of the Templars is available for digital purchase as a “director’s cut” whose additions and modifications are of dubious benefit. Luckily, the download includes the original game, which is well worth the purchase price in itself.)

Footnotes

1 It’s telling that, when Revolution recently produced a “director’s cut” of the game for digital distribution, the most obvious additions were a pair of scenes where the player gets to control Nico directly, giving at least the impression that she has a more active role in the plot. Sadly, one of these takes place before the bombing in the Parisian café, rather spoiling that dramatically perfect — and perfectly dramatic — in medias res opening.
