Trinity Postscript: Selling Tragedy

Like A Mind Forever Voyaging, Trinity seemed destined to become a casualty of an industry that just wasn’t equipped to appreciate what it was trying to do. Traditional game-review metrics like “fun” or “value for money” only cheapened it, while reviewers lacked the vocabulary to even begin to really address its themes. Most were content to simply mention, in passing and often with an obvious unease, that those themes were present. In Computer Gaming World, for instance, Scorpia said that it was “not for the squeamish,” that it would require of the player “some unpleasant actions,” and that it was “overall a serious game, not a light-hearted one,” before moving on to the firmer ground of puzzle hints. And that was downright thoughtful in comparison to Shay Addams’s review for Questbusters, which tried in a weird and clunky way to be funny in all the ways that Trinity doesn’t: “It blowed up real good!” runs the review’s tagline, and the review goes on to ask whether they’ll be eating “fission chips” in Kensington Gardens after the missiles drop. (Okay, that one’s dumb enough to be worth a giggle…) But the review’s most important point is that Trinity is “mainly a game” again after the first Interactive Fiction Plus title, A Mind Forever Voyaging, had so disappointed: “The puzzles are back!”

Even Infocom themselves weren’t entirely sure how to sell or even how to talk about Trinity. The company’s creative management had been unstintingly supportive of Brian Moriarty while he was making the game, but “marketing,” as he said later, “was a little more concerned/disturbed. They didn’t quite know what to make of it.” The matrix of genres didn’t have a slot for “Historical Tragedy.” In the end they slapped a “Fantasy” label on it, although it doesn’t take a long look at Trinity and the previous games to wear that label — the Zork and Enchanter series — to realize that one of these things is not quite like the others.

Moriarty admits to “a few tiffs” with marketing over Trinity, but he was a reasonable guy who also understood that Infocom needed to sell their games and that, while the occasional highbrow press from the likes of The New York Times Book Review had been nice and all, the traditional adventure-game market was the only place they had yet succeeded in consistently doing that. Thus in interviews and other promotions for Trinity he did an uncomfortable dance, trying to talk seriously about the game and the reasons he wrote it while also trying not to scare away people just looking for a fun text adventure. The triangulations can be a bit excruciating: “It isn’t a gloomy game, but it does have a dark undertone to it. It’s not like it’s the end of the world.” (Actually, it is.) Or: “It’s kind of a dark game, but it’s also, I like to think, kind of a fun game too.” (With a ringing endorsement like “I like to think it’s kind of a fun game,” how could anyone resist?)

Trinity’s commercial saving grace proved to be a stroke of serendipity having nothing to do with any of its literary qualities. The previous year Commodore had released what would prove to be their last 8-bit computer, the Commodore 128. Despite selling quite well, the machine had attracted very little software support. The cause, ironically, was also the reason it had done so well in comparison to the Plus/4, Commodore’s previous 8-bit machine. The 128, you see, came equipped with a “64 Mode” in which it was 99.9 percent compatible with the Commodore 64. Forced to choose between a modest if growing 128 user base and the massive 64 user base through which they could also rope in all those 128 users, almost all publishers, with too many incompatible machines to support already, made the obvious choice.

Infocom’s Interactive Fiction Plus system was, however, almost unique in the entertainment-software industry in running on the 128 in its seldom-used (at least for games) native mode. And all those new 128 owners were positively drooling for a game that actually took advantage of the capabilities of their shiny new machines. A Mind Forever Voyaging and Trinity arrived simultaneously on the Commodore 128 when the Interactive Fiction Plus interpreter was ported to that platform in mid-1986. But the puzzleless A Mind Forever Voyaging was a bit too outré for most gamers’ tastes. Plus it was older, and thus not getting the press or the shelf space that Trinity was. Trinity, on the other hand, fit the bill of “game I can use to show off my 128” just well enough, even for 128 users who might otherwise have had little interest in an all-text adventure game. Infocom’s sales were normally quite evenly distributed across the large range of machines they supported, but Trinity’s were decidedly lopsided in favor of the Commodore 128. Those users’ numbers were enough to push Trinity to the vicinity of 40,000 in sales, not a blockbuster — especially by the standards of Infocom’s glory years — but enough to handily outdo not just A Mind Forever Voyaging but even more traditional recent games like Spellbreaker. Like the Cold War Trinity chronicles, it could have been much, much worse.

 

Trinity

During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was, to the extent that he was known at all, considered a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the National Association of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against whom most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan had taken office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives. Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. The Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against humanity,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily have led to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became one of Infocom’s last substantial hits. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine that allowed twice the content of the normal system at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.

The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually if not inordinately long development cycle for an Infocom game, stretching from Moriarty’s first foray into Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim determination that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, I would guess that a majority of hardcore Infocom fans would likely name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored — when they’re explored at all — in all that thoughtful of a way. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me at an odd disconnect from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything the 18 games not named A Mind Forever Voyaging that preceded it had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all really that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity’s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a wandering passive observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As for atomic history, I find I’ve amassed a library almost as extensive as Trinity’s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)

 
 


The Rise of POMG, Part 1: It Takes a Village…

No one on their deathbed ever said, “I wish I had spent more time alone with my computer!”

— Dani Bunten Berry

If you ever want to feel old, just talk to the younger generation.

A few years ago now, I met the kids of a good friend of mine for the very first time: four boys between the ages of four and twelve, all more or less crazy about videogames. As someone who spends a lot of his time and earns a lot of his income writing about games, I arrived at their house with high expectations attached.

Alas, I’m afraid I proved a bit of a disappointment to them. The distance between the musty old games that I knew and the shiny modern ones that they played was just too far to bridge; shared frames of reference were tough to come up with. This was more or less what I had anticipated, given how painfully limited I already knew my knowledge of modern gaming to be. But one thing did genuinely surprise me: it was tough for these youngsters to wrap their heads around the very notion of a game that you played to completion by yourself and then put on the shelf, much as you might a book. The games they knew, from Roblox to Fortnite, were all social affairs that you played online with friends or strangers, and that ended only when you got sick of them or your peer group moved on to something else. Games that you played alone, without at the very least leaderboards and achievements on hand to measure yourself against others, were utterly alien to them. It was quite a reality check for me.

So, I immediately started to wonder how we had gotten to this point — a point not necessarily better or worse than the sort of gaming that I knew growing up and am still most comfortable with, just very different. This series of articles should serve as the beginning of an answer to that complicated question. Their primary focus is not so much how computer games went multiplayer, nor even how they first went online; those things are in some ways the easy, obvious parts of the equation. It’s rather how games did those things persistently — i.e., permanently, so that each session became part of a larger meta-game, if you will, embedded in a virtual community. Or perhaps the virtual community is embedded in the game. It all depends on how you look at it, and which precise game you happen to be talking about. Whichever way, it has left folks like me, whose natural tendency is still to read games like books with distinct beginnings, middles, and ends, anachronistic iconoclasts in the eyes of the youthful mainstream.

Which, I hasten to add, is perfectly okay; I’ve always found the ditch more fun than the middle of the road anyway. Still, sometimes it’s good to know how the other 90 percent lives, especially if you claim to be a gaming historian…



“Persistent online multiplayer gaming” (POMG, shall we say?) is a mouthful to be sure, but it will have to do for lack of a better descriptor of the phenomenon that has created such a divide between myself and my friend’s children.  It’s actually older than you might expect, having first come to be in the 1970s on PLATO, a non-profit computer network run out of the University of Illinois but encompassing several other American educational institutions as well. Much has been written about this pioneering network, which uncannily presaged in so many of its particulars what the Internet would become for the world writ large two decades later. (I recommend Brian Dear’s The Friendly Orange Glow for a book-length treatment.) It should suffice for our purposes today to say that PLATO became host to, among other online communities of interest, an extraordinarily vibrant gaming culture. Thanks to the fact that PLATO games lived on a multi-user network rather than standalone single-user personal computers, they could do stuff that most gamers who were not lucky enough to be affiliated with a PLATO-connected university would have to wait many more years to experience.

The first recognizable single-player CRPGs were born on PLATO in the mid-1970s, inspired by the revolutionary new tabletop game known as Dungeons & Dragons. They were followed by the first multiplayer ones in amazingly short order. Already in 1975’s Moria (a completely different game from the 1983 single-player roguelike that bore the same name), players met up with their peers online to chat, brag, and sell or trade loot to one another. When they were ready to venture forth to kill monsters, they could do so in groups of up to ten, pooling their resources and sharing the rewards. A slightly later PLATO game called Oubliette implemented the same basic concept in an even more sophisticated way. The degree of persistence of these games was limited by a lack of storage capacity — the only data that was saved between sessions were the statistics and inventory of each player’s character, with the rest of the environment being generated randomly each time out — but they were miles ahead of anything available for the early personal computers that were beginning to appear at the same time. Indeed, Wizardry, the game that cemented the CRPG’s status as a staple genre on personal computers in 1981, was in many ways simply a scaled-down version of Oubliette, with the multiplayer party replaced by a party of characters that were all controlled by the same player.

Chester Bolingbroke, better known online as The CRPG Addict, plays Moria. Note the “Group Members” field at bottom right. Chester is alone here, but he could be adventuring with up to nine others.

A more comprehensive sort of persistence arrived with the first Multi-User Dungeon (MUD), developed by Roy Trubshaw and Richard Bartle, two students at the University of Essex in Britain, and first deployed there in a nascent form in late 1978 or 1979. A MUD borrowed the text-only interface and presentation of Will Crowther and Don Woods’s seminal game of Adventure, but the world it presented was a shared, fully persistent one between its periodic resets to a virgin state, chockablock with other real humans to interact with and perhaps fight. “The Land,” as Bartle dubbed his game’s environs, expanded to more than 600 rooms by the early 1980s, even as its ideas and a good portion of its code were used to set up other, similar environments at many more universities.

In the meanwhile, the first commercial online services were starting up in the United States. By 1984, you could, for the price of a substantial hourly fee, dial into the big mainframes of services like CompuServe using your home computer. Once logged in there, you could socialize, shop, bank, make travel reservations, read newspapers, and do much else that most people wouldn’t begin to do online until more than a decade later — including gaming. For example, CompuServe offered MegaWars, a persistent grand-strategy game of galactic conquest whose campaigns took groups of up to 100 players four to six weeks to complete. (Woe betide the ones who couldn’t log in for some reason of an evening in the midst of that marathon!) You could also find various MUDs, as well as Island of Kesmai, a multiplayer CRPG boasting most of the same features as PLATO’s Oubliette in a genuinely persistent world rather than a perpetually regenerated one. CompuServe’s competitor GEnie had Air Warrior, a multiplayer flight simulator with bitmapped 3D graphics and sound effects to rival any of the contemporaneous single-player simulators on personal computers. For the price of $11 per hour, you could participate in grand Air Warrior campaigns that lasted three weeks each and involved hundreds of other subscribers, organizing and flying bombing raids and defending against the enemy’s attacks on their own lines. In 1991, America Online put up Neverwinter Nights (not the same game as the 2002 BioWare CRPG of the same name), which did for the “Gold Box” line of licensed Dungeons & Dragons CRPGs what MUD had done for Adventure and Air Warrior had done for flight simulators, transporting the single-player game into a persistent multiplayer space.

All of this stuff was more or less incredible in the context of the times. At the same time, though, we mustn’t forget that it was strictly the purview of a privileged elite, made up of those with login credentials for institutional-computing networks or money in their pockets to pay fairly exorbitant hourly fees to feed their gaming habits. So, I’d like to back up now and tell a different story of POMG — one with more of a populist thrust, focusing on what was actually attainable by the majority of people out there, the ones who neither had access to a university’s mainframe nor could afford to spend hundreds of dollars per month on a hobby. Rest assured that the two narratives will meet before all is said and done.



POMG came to everyday digital gaming in the reverse order of the words that make up the acronym: first games were multiplayer, then they went online, and then these online games became persistent. Let’s try to unpack how that happened.

From the very start, many digital games were multiplayer, optionally if not unavoidably so. Spacewar!, the program generally considered the first fully developed graphical videogame, was exclusively multiplayer from its inception in the early 1960s. Ditto Pong, the game that launched Atari a decade later, and with it a slow-building popular craze for electronic games, first in public arcades and later in living rooms. Multiplayer here was not so much a matter of design intention as of technological affordances. Pong was an elaborate state machine built from hardwired discrete logic rather than a full-blown programmable computer, relying on dedicated circuitry rather than software to do its “thinking.” It was more than hard enough just to get a couple of paddles and a ball moving around on the screen of a gadget like this; a computerized opponent was a bridge too far.

Very quickly, however, programmable microprocessors entered the field, changing everyone’s cost-benefit analyses. Building dual controls into an arcade cabinet was expensive, and the end result tended to take up a lot of space. The designers of arcade classics like Asteroids and Galaxian soon realized that they could replace the complications of a human opponent with hordes of computer-controlled enemies, flying in rudimentary, partially randomized patterns. Bulky multiplayer machines thus became rarer and rarer in arcades, replaced by slimmer, more standardized single-player cabinets. After all, if you wanted to compete with your friends in such games, there was still a way to do so: you could each play a round against the computerized enemies and compare your scores afterward.

While all of this was taking shape, the Trinity of 1977 — the Radio Shack TRS-80, Apple II, and Commodore PET — had ushered in the personal-computing era. The games these early microcomputers played were sometimes ports or clones of popular arcade hits, but just as often they were more cerebral, conceptually ambitious affairs where reflexes didn’t play as big — or any — role: flight simulations, adventure games, war and other strategy games. The last were often designed to be played optimally or even exclusively against another human, largely for the same reason Pong had been made that way: artificial intelligence was a hard thing to implement under any circumstances on an 8-bit computer with as little as 16 K of memory, and it only got harder when you were asking said artificial intelligence to formulate a strategy for Operation Barbarossa rather than to move a tennis racket around in front of a bouncing ball. Many strategy-game designers in these early days saw multiplayer options almost as a necessary evil, a stopgap until the computer could fully replace the human player, thus alleviating that eternal problem of the war-gaming hobby on the tabletop: the difficulty of finding other people in one’s neighborhood who were able and willing to play such weighty, complex games.

At least one designer, however, saw multiplayer as a positive advantage rather than a kludge — in fact, as the way the games of the future by all rights ought to be. “When I was a kid, the only times my family spent together that weren’t totally dysfunctional were when we were playing games,” remembered Dani Bunten Berry. From the beginning of her design career in 1979, when she made an auction game called Wheeler Dealers for the Apple II (Wheeler Dealers and all of her other games mentioned in this article were credited to Dan Bunten, the name under which she lived until 1992), multiplayer was her priority. In fact, she was willing to go to extreme lengths to make it possible; in addition to a cassette tape containing the software, Wheeler Dealers shipped with a custom-made hardware add-on, the only method she could come up with to let four players bid at once. Such experiments culminated in M.U.L.E., one of the first four games ever published by Electronic Arts, a deeply, determinedly social game of economics and, yes, auctions for Atari and Commodore personal computers that many people, myself included, still consider her unimpeachable masterpiece.

A M.U.L.E. auction in progress.

And yet it was Seven Cities of Gold, her second game for Electronic Arts, that became a big hit. Ironically, it was also the first she had ever made with no multiplayer option whatsoever. She was learning to her chagrin that games meant to be played together on a single personal computer were a hard sell; such machines were typically found in offices and bedrooms, places where people went to isolate themselves, not in living rooms or other spaces where they went to be together. She decided to try another tack, thereby injecting the “online” part of POMG into our discussion.

In 1988, Electronic Arts published Berry’s Modem Wars, a game that seems almost eerily prescient in retrospect, anticipating the ludic zeitgeist of more than a decade later with remarkable accuracy. It was a strategy game played in real time (although not quite a real-time strategy of the resource-gathering and army-building stripe that would later be invented by Dune II and popularized by Warcraft and Command & Conquer). And it was intended to be played online against another human sitting at another computer, connected to yours by the gossamer thread of a peer-to-peer modem hookup over an ordinary telephone line. Like most of Berry’s games, it didn’t sell all that well, being a little too far out in front of the state of her nation’s telecommunications infrastructure.

Nevertheless, she continued to push her agenda of computer games as ways of being entertained together rather than alone over the years that followed. She never did achieve the breakout hit she craved, but she inspired countless other designers with her passion. She died far too young in 1998, just as the world was on the cusp of embracing her vision on a scale that even she could scarcely have imagined. “It is no exaggeration to characterize her as the world’s foremost authority on multiplayer computer games,” said Brian Moriarty when he presented Dani Bunten Berry with the first ever Game Developers Conference Lifetime Achievement Award two months before her death. “Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes [representing her eleven published games] the prototypes of the defining art form of the 21st century.” Let this article and the ones that will follow it, written well into said century, serve as partial proof of the truth of his words.

Danielle Bunten Berry, 1949-1998.

For by the time Moriarty spoke them, other designers had been following the trails she had blazed for quite some time, often with much more commercial success. A good early example is Populous, Peter Molyneux’s strategy game in real time (although, again, not quite a real-time strategy) that was for most of its development cycle strictly a peer-to-peer online multiplayer game, its offline single-player mode being added only during the last few months. An even better, slightly later one is DOOM, John Carmack and John Romero’s game of first-person 3D mayhem, whose star attraction, even more so than its sadistic single-player levels, was the “deathmatch” over a local-area network. Granted, these testosterone-fueled, relentlessly zero-sum contests weren’t quite the same as what Berry was envisioning for gaming’s multiplayer future near the end of her life; she wished passionately for games with a “people orientation,” directed toward “the more mainstream, casual players who are currently coming into the PC market.” Still, as the saying goes, you have to start somewhere.

But there is once more a caveat to state here about access, or rather the lack thereof. Being built for local networks only — i.e., networks that lived entirely within a single building or at most a small complex of them — DOOM deathmatches were out of reach on a day-to-day basis for those who didn’t happen to be students or employees at institutions with well-developed data-processing departments and permissive or oblivious authority figures. Outside of those ivory towers, this was the era of the “LAN party,” when groups of gamers would all lug their computers over to someone’s house, wire them together, and go at it over the course of a day or a weekend. These occasions went on to become treasured memories for many of their participants, but they achieved that status precisely because they were so sporadic and therefore special.

And yet DOOM’s rise corresponded with the transformation of the Internet from an esoteric tool for the technological elite to the most flexible medium of communication ever placed at the disposal of the great unwashed, thanks to a little invention out of Switzerland called the World Wide Web. What if there was a way to move DOOM and other games like it from a local network onto this one, the mother of all wide-area networks? Instead of deathmatching only with your buddy in the next cubicle, you would be able to play against somebody on another continent if you liked. Now wouldn’t that be cool?

The problem was that local-area networks ran over a protocol known as IPX, while the Internet ran on a completely different one called TCP/IP. Whoever could bridge that gap in a reasonably reliable, user-friendly way stood to become a hero to gamers all over the world.



Jay Cotton discovered DOOM in the same way as many another data-processing professional: when it brought down his network. He was employed at the University of Georgia at the time, and was assigned to figure out why the university’s network kept buckling under unprecedented amounts of spurious traffic. He tracked the cause down to DOOM, the game that half the students on campus seemed to be playing more than half the time. More specifically, the problem was caused by a bug, which was patched out of existence by John Carmack as soon as he was informed. Problem solved. But Cotton stuck around to play, the warden seduced by the inmates of the asylum.

He was soon so much better at the game than anyone else on campus that he was getting a bit bored. Looking for worthier opponents, he stumbled across a program called TCPSetup, written by one Jake Page, which was designed to translate IPX packets into TCP/IP ones and vice versa on the fly, “tricking” DOOM into communicating across the vast Internet. It was cumbersome to use and extremely unreliable, but on a good day it would let you play DOOM over the Internet for brief periods of time at least, an amazing feat by any standard. Cotton would meet other players on an Internet chat channel dedicated to the game, they’d exchange IP addresses, and then they’d have at it — or try to, depending on the whims of the Technology Gods that day.
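The core trick behind TCPSetup is easy enough to sketch in modern terms, even without touching real IPX frames. What follows is a minimal, purely illustrative relay in Python — nothing like the actual TCPSetup or Kali code, which had to hook the IPX driver under MS-DOS — showing the general tunneling idea: packets that a LAN-only game emits locally get picked up on one machine, forwarded across the Internet inside UDP datagrams, and handed back to the game on the other end as if they had never left the local network. The addresses and port numbers here are placeholders, and the “game side” is simulated as just another local UDP socket.

import socket
import select

GAME_SIDE = ("127.0.0.1", 21356)     # placeholder: where the local game's packets arrive
PEER_BRIDGE = ("203.0.113.7", 21357) # placeholder: the remote player's copy of the bridge

def run_bridge():
    game_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    game_sock.bind(GAME_SIDE)                   # traffic from the local copy of the game
    net_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    net_sock.bind(("0.0.0.0", PEER_BRIDGE[1]))  # traffic arriving from the remote bridge

    game_addr = None
    while True:
        readable, _, _ = select.select([game_sock, net_sock], [], [])
        for sock in readable:
            data, addr = sock.recvfrom(2048)
            if sock is game_sock:
                game_addr = addr                    # remember the game's return address
                net_sock.sendto(data, PEER_BRIDGE)  # forward the frame across the Internet
            elif game_addr is not None:
                game_sock.sendto(data, game_addr)   # hand the peer's frames back to the game

if __name__ == "__main__":
    run_bridge()

Each player would run such a bridge alongside the game, pointed at the other’s IP address — which is exactly why the ritual of swapping addresses over a chat channel was unavoidable until services like Kali automated it away.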

On August 22, 1994, Cotton received an email from a fellow out of the University of Illinois — yes, PLATO’s old home — whom he’d met and played in this way (and beaten, he was always careful to add). His name was Scott Coleman. “I have some ideas for hacking TCPSetup to make it a little easier. Care to do some testing later?” Coleman wrote. “I’ve already emailed Jake [Page] on this, but he hasn’t responded (might be on vacation or something). If he approves, I’m hoping some of these ideas might make it into the next release of TCPSetup. In the meantime, I want to do some experimenting to see what’s feasible.”

Jake Page never did respond to their queries, so Cotton and Coleman just kept beavering away on their own, eventually rewriting TCPSetup entirely to create iDOOM, a more reliable and far less fiddly implementation of the same concept, with support for three- or four-player deathmatches instead of just one-on-one duels. It took off like a rocket; the pair were bombarded with feature requests, most notably to make iDOOM work with other IPX-only games as well. In January of 1995, they added support for Heretic, one of the most popular of the first wave of so-called “DOOM clones.” They changed their program’s name to “iFrag” to reflect the fact that it was now about more than just DOOM.

Having come this far, Cotton and Coleman soon made the conceptual leap that would transform their software from a useful tool to a way of life for a time for many, many thousands of gamers. Why not add support for more games, they asked themselves, not in a bespoke way as they had been doing to date, but in a more sustainable one, by turning their program into a general-purpose IPX-to-TCP/IP bridge, suitable for use with the dozens of other multiplayer games out there that supported only local-area networks out of the box. And why not make their tool into a community while they were at it, by adding an integrated chat service? In addition to its other functions, the program could offer a list of “servers” hosting games, which you could join at the click of a button; no more trolling for opponents elsewhere on the Internet, then laboriously exchanging IP addresses and meeting times and hoping the other guy followed through. This would be instant-gratification online gaming. It would also provide a foretaste at least of persistent online multiplayer gaming; as people won matches, they would become known commodities in the community, setting up a meta-game, a sporting culture of heroes and zeroes where folks kept track of win-loss records and where everybody clamored to hear the results when two big wheels faced off against one another.
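The other half of the idea — the chat channel and the one-click list of game servers — amounts to a shared directory that everyone can query. Here is a deliberately tiny, hypothetical lobby server in Python, in no way Kali’s real protocol, just to make the concept concrete: hosts announce the game they are running, and would-be players ask for the current list so they can pick one to join.

import json
import socketserver

GAMES = {}  # host address -> {"game": ..., "players": ...}

class LobbyHandler(socketserver.StreamRequestHandler):
    def handle(self):
        request = json.loads(self.rfile.readline())
        if request["action"] == "host":
            # Register the sender as hosting a game.
            GAMES[self.client_address[0]] = {"game": request["game"],
                                             "players": request.get("players", 1)}
            self.wfile.write(b'{"status": "registered"}\n')
        elif request["action"] == "list":
            # Send back every known host so the client can pick one to join.
            self.wfile.write((json.dumps(GAMES) + "\n").encode())

if __name__ == "__main__":
    with socketserver.TCPServer(("0.0.0.0", 22222), LobbyHandler) as lobby:
        lobby.serve_forever()

Add a nickname to each entry and a running tally of wins and losses, and you already have the seed of the persistent reputation game — the heroes and zeroes — described above.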

Cotton and Coleman renamed their software for the third time in less than nine months, calling it Kali, a name suggested by Coleman’s Indian-American girlfriend (later his wife). “The Kali avatar is usually depicted with swords in her hands and a necklace of skulls from those she has killed,” says Coleman, “which seemed appropriate for a deathmatch game.” Largely at the behest of Cotton, always the more commercially-minded of the pair, they decided to make Kali shareware, just like DOOM itself: multiplayer sessions would be limited to fifteen minutes at a time until you coughed up a $20 registration fee. Cotton went through the logistics of setting up and running a business in Georgia while Coleman did most of the coding in Illinois. (Rather astonishingly, Cotton and Coleman had still never met one another face to face in 2013, when gaming historian David L. Craddock conducted an interview with them that has been an invaluable source of quotes and information for this article.)

Kali certainly wasn’t the only solution in this space; a commercial service called DWANGO had existed since December of 1994, with the direct backing of John Carmack and John Romero, whose company id Software collected 20 percent of its revenue in return for the endorsement. But DWANGO ran over old-fashioned direct-dial-up connections rather than the Internet, meaning you had to pay long-distance charges to use it if you weren’t lucky enough to live close to one of its host computers. On top of that, it charged $9 for just five hours of access per month, with the fees escalating from there. Kali, by contrast, was available to you forever for as many hours per month as you liked after you plunked down your one-time fee of $20.

So, Kali was popular right from its first release on April 26, 1995. Yet it was still an awkward piece of software for the casual user despite the duo’s best efforts, being tied to MS-DOS, whose support for TCP/IP relied on a creaky edifice of third-party tools. The arrival of Windows 95 was a godsend for Kali, as it was for computer gaming in general, making the hobby accessible in a way it had never been before. The so-called “Kali95” was available by early 1996, and things exploded from there. Kali struck countless gamers with all the force of a revelation; who would have dreamed that it could be so easy to play against another human online? Lloyd Case, for example, wrote in Computer Gaming World magazine that using Kali for the first time was “one of the most profound gaming experiences I’ve had in a long time.” Reminiscing seventeen years later, David L. Craddock described how “using Kali for the first time was like magic. Jumping into a game and playing with other people. It blew my fourteen-year-old mind.” In late 1996, the number of registered Kali users ticked past 50,000, even as quite possibly just as many or more were playing with cracked versions that bypassed the simplistic serial-number-registration process. First-person-shooter deathmatches abounded, but you could also play real-time strategies like Command & Conquer and Warcraft, or even the Links golf simulation. Computer Gaming World gave Kali a special year-end award for “Online-Enabling Technology.”

Kali for Windows 95.

Competitors were rushing in at a breakneck pace by this time, some of them far more conventionally “professional” than Kali, whose origin story was, as we’ve seen, as underground and organic as that of DOOM itself. The most prominent of the venture-capital-funded startups were MPlayer (co-founded by Brian Moriarty of Infocom and LucasArts fame, and employing Dani Bunten Berry as a consultant during the last months of her life) and the Total Entertainment Network, better known as simply TEN. In contrast to Kali’s one-time fee, they, like DWANGO before them, relied on subscription billing: $20 per month for MPlayer, $15 per month for TEN. Despite slick advertising and countless other advantages that Kali lacked, neither would ever come close to overtaking its scruffy older rival, which had price as well as oodles of grass-roots goodwill on its side. Jay Cotton:

It was always my belief that Kali would continue to be successful as long as I never got greedy. I wanted everyone to be so happy with their purchase that they would never hesitate to recommend it to a friend. [I would] never charge more than someone would be readily willing to pay. It also became a selling point that Kali only charged a one-time fee, with free upgrades forever. People really liked this, and it prevented newcomers (TEN, Heat [a service launched in 1997 by Sega of America], MPlayer, etc.) from being able to charge enough to pay for their expensive overheads.

Kali was able to compete with TEN, MPlayer, and Heat because it already had a large established user base (more users equals more fun) and because it was much, much cheaper. These new services wanted to charge a subscription fee, but didn’t provide enough added benefit to justify the added expense.

It was a heady rush indeed, although it would also prove a short-lived one; Kali’s competitors would all be out of business within a year or so of the turn of the millennium. Kali itself stuck around after that, but as a shadow of what it had been, strictly a place for old-timers to reminisce and play the old hits. “I keep it running just out of habit,” said Jay Cotton in 2013. “I make just enough money on website ads to pay for the server.” It still exists today, presumably as a result of the same force of habit.

One half of what Kali and its peers offered was all too obviously ephemeral from the start: as the Internet went mainstream, developers inevitably began building TCP/IP support right into their games, eliminating the need for an external IPX-to-TCP/IP bridge. (For example, Quake, id Software’s much-anticipated follow-up to DOOM, did just this when it finally arrived in 1996.) But the other half of what they offered was community, which may have seemed a more durable sort of benefit. As it happened, though, one clever studio did an end-run around them here as well.
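For the technically curious, the basic trick behind such a bridge is easy to sketch: capture the packets a game broadcasts on the local IPX network and relay them over Internet sockets to distant peers, whose own bridges re-broadcast them locally, so that every copy of the game believes all the players are sitting on the same LAN. The toy Python sketch below illustrates only that forwarding idea; the ports, the peer list, and the framing are invented for the example, and Kali’s real MS-DOS implementation, which hooked the IPX driver itself, was of course far more involved.

```python
# A toy illustration of the IPX-to-Internet "bridge" idea: game traffic
# broadcast on the local LAN is relayed to remote peers over UDP, and
# traffic arriving from those peers is re-broadcast locally, so the game
# believes every player is on the same network segment. Ports, addresses,
# and framing are invented for this sketch.

import select
import socket

LAN_PORT = 8900          # hypothetical port the game broadcasts on locally
TUNNEL_PORT = 8901       # hypothetical port used between bridge endpoints
REMOTE_PEERS = [("203.0.113.7", TUNNEL_PORT)]   # other players' bridges

MY_ADDRESSES = {"127.0.0.1", socket.gethostbyname(socket.gethostname())}

lan = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
lan.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
lan.bind(("", LAN_PORT))

wan = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
wan.bind(("", TUNNEL_PORT))

while True:
    ready, _, _ = select.select([lan, wan], [], [])
    for sock in ready:
        data, addr = sock.recvfrom(2048)
        if sock is lan:
            if addr[0] in MY_ADDRESSES:
                continue            # ignore our own re-broadcasts to avoid loops
            # A local game packet: relay it to every remote player's bridge.
            for peer in REMOTE_PEERS:
                wan.sendto(data, peer)
        else:
            # A packet from a remote player: re-broadcast it on the LAN
            # so the local game picks it up as ordinary local traffic.
            lan.sendto(data, ("255.255.255.255", LAN_PORT))
```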



The folks at Blizzard Entertainment, the small studio and publisher that was fast coming to rival id Software for the title of the hottest name in gaming, were enthusiastic supporters of Kali in the beginning, to the point of hand-tweaking Warcraft II, their mega-hit real-time-strategy game, to run optimally over the service. They were rewarded by seeing it surpass even DOOM to become the most popular game on Kali. But as they were polishing their new action-CRPG Diablo for release in 1996, Mike O’Brien, a Blizzard programmer, suggested that they launch their own service that would do everything Kali did in terms of community, albeit for Blizzard’s games alone. And then he additionally suggested that they make it free, gambling that knowledge of its existence would sell enough games for them at retail to offset its maintenance costs. Blizzard’s unofficial motto had long been “Let’s be awesome,” reflecting their determination to sell exactly the games that real hardcore gamers were craving, honed to a perfect finish, and to always give them that little bit extra. What better way to be awesome than by letting their customers effortlessly play and socialize online, and to do so for free?

The idea was given an extra dollop of urgency by the fact that Westwood Studios, the maker of Warcraft‘s chief competitor Command & Conquer, had introduced a service called Westwood Chat that could launch people directly into a licensed version of Monopoly. (Shades of Dani Bunten Berry’s cherished childhood memories…) At the moment it supported only Monopoly, a title that appealed to a very different demographic from the hardcore crowd who favored Blizzard’s games, but who knew how long that would last?[4]

So, when Diablo shipped in the last week of 1996, it included something called Battle.net, a one-click chat, matchmaking, and multiplayer service. Battle.net made everything easier than it had ever been before. It would even automatically patch your copy of the game to the latest version when you logged on, pioneering the “software as a service” model in gaming that has become everyday life in our current age of Steam. “It was so natural,” says Blizzard executive Max Schaefer. “You didn’t think about the fact that you were playing with a dude in Korea and a guy in Israel. It’s really a remarkable thing when you think about it. How often are people casually matched up in different parts of the world?” The answer to that question, of course, was “not very often” in the context of 1997. Today, it’s as normal as computers themselves, thanks to groundbreaking initiatives like this one. Blizzard programmer Jeff Strain:

We believed that in order for it [Battle.net] to really be embraced and adopted, that accessibility had to be there. The real catch for Battle.net was that it was inside-out rather than outside-in. You jumped right into the game. You connected players from within the game experience. You did not alt-tab off into a Web browser to set up your games and have the Web browser try to pass off information or something like that. It was a service designed from Day One to be built into actual games.

The combination of Diablo and Battle.net brought a new, more palpable sort of persistence to online gaming. Players of DOOM or Warcraft II might become known as hotshots on services like Kali, but their reputation conferred no tangible benefit once they entered a game session. A DOOM deathmatch or a Warcraft II battle was a one-and-done event, one that everyone started on an equal footing and exited again within an hour or so, with nothing but memories and perhaps bragging rights to show for what had transpired.

Diablo, however, was different. Although less narratively and systemically ambitious than many of its recent brethren, it was nevertheless a CRPG, a genre all about building up a character over many gaming sessions. Multiplayer Diablo retained this aspect: the first time you went online, you had to pick one of the three pre-made first-level characters to play, but after that you could keep bringing the same character back to session after session, with all of the skills and loot she had already collected. Suddenly the link between the real people in the chat rooms and their avatars that lived in the game proper was much more concrete. Many found it incredibly compelling. People started to assume the roles of their characters even when they were just hanging out in the chat rooms, started in some very real sense to live the game.

But it wasn’t all sunshine and roses. Battle.net became a breeding ground of the toxic behaviors that have continued to dog online gaming to this day, a social laboratory demonstrating what happens when you take a bunch of hyper-competitive, rambunctious young men and give them carte blanche to have at it any way they wish with virtual swords and spells. The service was soon awash with “griefers,” players who would join others on their adventures, ostensibly as their allies in the dungeon, then literally stab them in the back when they least expected it, killing their characters and running off with all of their hard-won loot. The experience could be downright traumatizing for the victims, who had thought they were joining up with friendly strangers simply to have fun together in a cool new game. “Going online and getting killed was so scarring,” acknowledges David Brevik, Diablo‘s original creator. “Those players are still feeling a little bit apprehensive.”

To make matters worse, many of the griefers were also cheaters. Diablo had been born and bred a single-player game; multiplayer had been a very late addition. This had major ramifications. Diablo stored all the information about the character you played online on your local hard drive rather than the Battle.net server. Learn how to modify this file, and you could create a veritable god for yourself in about ten minutes, instead of the dozens of hours it would take playing the honest way. “Trainers” — programs that could automatically do the necessary hacking for you — spread like wildfire across the Internet. Other folks learned to hack the game’s executable files themselves. Most infamously, they figured out ways to attack other players while they were still in the game’s above-ground town, supposedly a safe space reserved for shopping and healing. Battle.net as a whole took on a siege mentality, as people who wanted to play honorably and honestly learned to lock the masses out with passwords that they exchanged only with trusted friends. This worked after a fashion, but it was also a betrayal of the core premise and advantage of Battle.net, the ability to find a quick pick-up game anytime you wanted one. Yet there was nothing Blizzard could do about it without rewriting the whole game from the ground up. They would eventually do this — but they would call the end result Diablo II. In the meanwhile, it was a case of player beware.
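The root of the vulnerability is architectural, and easy to state in miniature: when the authoritative record of a character lives on the player’s own disk and the server simply believes whatever it is told, every player effectively has write access to the rules; only once the record moves server-side can those rules be enforced. The Python sketch below illustrates that distinction generically; it reflects nothing of Blizzard’s actual file formats or protocols, and the names in it are invented.

```python
# A generic illustration of client-authoritative versus server-authoritative
# character storage. None of this reflects Blizzard's real data formats or
# network protocol; it only shows where the trust boundary sits.

from dataclasses import dataclass

@dataclass
class Character:
    name: str
    level: int
    gold: int

# Client-authoritative (the situation described above): the save lives on the
# player's disk, and the server accepts whatever the client reports. Anyone
# who can edit the file can "earn" a god-tier character in minutes.
def join_game_client_authoritative(reported: Character) -> Character:
    return reported                      # the server has no way to dispute it

# Server-authoritative: the server keeps the only copy, and the client merely
# refers to it by account and character name.
SERVER_DB = {("alice", "Rogue01"): Character("Rogue01", 12, 3400)}

def join_game_server_authoritative(account: str, name: str) -> Character:
    return SERVER_DB[(account, name)]    # tampering with local files changes nothing
```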

It’s important to understand that, for all that it resembled what would come later all too much from a sociological perspective, multiplayer Diablo was still no more persistent than Moria and Oubliette had been on the old PLATO network: each player’s character was retained from session to session, but nothing about the state of the world. Each world, or instance of the game, could contain a maximum of four human players, and disappeared as soon as the last player left it, leaving as its legacy only the experience points and items its inhabitants had collected from it while it existed. Players could and did kill the demon Diablo, the sole goal of the single-player game, one that usually required ten hours or more of questing to achieve, over and over again in the online version. In this sense, multiplayer Diablo was a completely different game from single-player Diablo, replacing the simple quest narrative of the latter with a social meta-game of character-building and player-versus-player combat.

For lots and lots of people, this was lots and lots of fun; Diablo was hugely popular despite all of the exploits it permitted — indeed, for some players perchance, because of them. It became one of the biggest computer games of the 1990s, bringing online gaming to the masses in a way that even Kali had never managed. Yet there was still a ways to go to reach total persistence, to bring a permanent virtual world to life. Next time, then, we’ll see how mainstream commercial games of the 1990s sought to achieve a degree of persistence that the first MUD could boast of already in 1979. These latest virtual worlds, however, would attempt to do so with all the bells and whistles and audiovisual niceties that a new generation of gamers raised on multimedia and 3D graphics demanded. An old dog in the CRPG space was about to learn a new trick, creating in the process a new gaming acronym that’s even more of a mouthful than POMG.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Stay Awhile and Listen Volumes 1 and 2 by David L. Craddock, Masters of Doom by David Kushner, and The Friendly Orange Glow by Brian Dear; Retro Gamer 43, 90, and 103; Computer Gaming World of September 1996 and May 1997; Next Generation of March 1997. Online sources include “The Story of Battle.net” by Wes Fenlon at PC Gamer, Dan Griliopoulos’s collection of interviews about Command & Conquer, Brian Moriarty’s speech honoring Dani Bunten Berry from the 1998 Game Developers Conference, and Jay Cotton’s history of Kali on the DOOM II fan site. Plus some posts on The CRPG Addict, to which I’ve linked in the article proper.

Footnotes

1 The PLATO Moria was a completely different game from the 1983 single-player roguelike that bore the same name.
2 Not the same game as the 2002 Bioware CRPG of the same name.
3 Wheeler Dealers and all of her other games that are mentioned in this article were credited to Dan Bunten, the name under which she lived until 1992.
4 Westwood Chat would indeed evolve eventually into Westwood Online, with full support for Command & Conquer, but that would happen only after Blizzard had rolled out their own service.
 
 


Putting the “J” in the RPG, Part 1: Dorakue!


Fair warning: this article includes some plot spoilers of Final Fantasy I through VI.

The videogame industry has always run on hype, but the amount of it that surrounded Final Fantasy VII in 1997 was unparalleled in its time. This new game for the Sony PlayStation console was simply inescapable. The American marketing teams of Sony and Square Corporation, the game’s Japanese developer and publisher, had been given $30 million with which to elevate Final Fantasy VII to the same status as the Super Marios of the world. They plastered Cloud, Aerith, Tifa, Sephiroth, and the game’s other soon-to-be-iconic characters onto urban billboards, onto the sides of buses, and into the pages of glossy magazines like Rolling Stone, Playboy, and Spin. Commercials for the game aired round the clock on MTV, during NFL games and Saturday Night Live, even on giant cinema screens in lieu of more traditional coming-attractions trailers. “They said it couldn’t be done in a major motion picture,” the stentorian announcer intoned. “They were right!” Even if you didn’t care a whit about videogames, you couldn’t avoid knowing that something pretty big was going down in that space.

And if you did care… oh, boy. The staffs of the videogame magazines, hardly known for their sober-mindedness in normal times, worked themselves up to positively orgasmic heights under Square’s not-so-gentle prodding. GameFan told its readers that Final Fantasy VII would be “unquestionably the greatest entertainment product ever created.”

The game is ridiculously beautiful. Analyze five minutes of gameplay in Final Fantasy VII and witness more artistic prowess than most entire games have. The level of detail is absolutely astounding. These graphics are impossible to describe; no words are great enough. Both map and battle graphics are rendered to a level of detail completely unprecedented in the videogame world. Before Final Fantasy VII, I couldn’t have imagined a game looking like this for many years, and that’s no exaggeration. One look at a cut scene or call spell should handily convince you. Final Fantasy VII looks so consistently great that you’ll quickly become numb to the power. Only upon playing another game will you once again realize just how fantastic it is.

But graphics weren’t all that the game had going for it. In fact, they weren’t even the aspect that would come to most indelibly define it for most of its players. No… that thing was, for the very first time in a mainstream console-based videogame with serious aspirations of becoming the toppermost of the poppermost, the story.

I don’t have any room to go into the details, but rest assured that Final Fantasy VII possesses the deepest, most involved story line ever in an RPG. There’s few games that have literally caused my jaw to drop at plot revelations, and I’m most pleased to say that Final Fantasy VII doles out these shocking, unguessable twists with regularity. You are constantly motivated to solve the latest mystery.

So, the hype rolled downhill, from Square at the top to the mass media, then on to the hardcore gamer magazines, and finally to ordinary owners of PlayStations. You would have to have been an iconoclastic PlayStation owner indeed not to be shivering with anticipation as the weeks counted down toward the game’s September 7 release. (Owners of other consoles could eat their hearts out; Final Fantasy VII was a PlayStation exclusive.)

Just last year, a member of an Internet gaming forum still fondly recalled how

the lead-up for the US launch of this game was absolutely insane, and, speaking personally, it is the most excited about a game I think I had ever been in my life, and nothing has come close since then. I was only fifteen at the time, and this game totally overtook all my thoughts and imagination. I had never even played a Final Fantasy game before, and I didn’t even like RPGs, yet I would spend hours reading and rereading all the articles from all the gaming magazines I had, inspecting all the screenshots and being absolutely blown away at the visual fidelity I was witnessing. I spent multiple days/hours with my Sony Discman listening to music and drawing the same artwork that was in all the mags. It was literally a genre- and generation-defining game.

Those who preferred to do their gaming on personal computers rather than consoles might be excused for scoffing at all these breathless commentators who seemed to presume that Final Fantasy VII was doing something that had never been done before. If you spent your days playing Quake, Final Fantasy VII‘s battle graphics probably weren’t going to impress you overmuch; if you knew, say, Toonstruck, even the cut scenes might strike you as pretty crude. And then, too, computer-based adventure games and RPGs had been delivering well-developed long-form interactive narratives for many years by 1997, most recently with a decidedly cinematic bent more often than not, with voice actors in place of Final Fantasy VII‘s endless text boxes. Wasn’t Final Fantasy VII just a case of console gamers belatedly catching on to something computer gamers had known all along, and being forced to do so in a technically inferior fashion at that?

Well, yes and no. It’s abundantly true that much of what struck so many as so revelatory about Final Fantasy VII really wasn’t anywhere near as novel as they thought it was. At the same time, though, the aesthetic and design philosophies which it applied to the abstract idea of the RPG truly were dramatically different from the set of approaches favored by Western studios. They were so different, in fact, that the RPG genre in general would be forever bifurcated in gamers’ minds going forward, as the notion of the “JRPG” — the Japanese RPG — entered the gaming lexicon. In time, the label would be applied to games that didn’t actually come from Japan at all, but that evinced the set of styles and approaches so irrevocably cemented in the Western consciousness under the label of “Japanese” by Final Fantasy VII.

We might draw a parallel with what happened in music in the 1960s. The Beatles, the Rolling Stones, and all the other Limey bands who mounted the so-called “British Invasion” of their former Colonies in 1964 had all spent their adolescence steeped in American rock and roll. They took those influences, applied their own British twist to them, then sold them back to American teenagers, who screamed and fainted in the concert halls like Final Fantasy VII fans later would in the pages of the gaming magazines, convinced that the rapture they were feeling was brought on by something genuinely new under the sun — which in the aggregate it was, of course. It took the Japanese to teach Americans how thrilling and accessible — even how emotionally moving — the gaming genre they had invented could truly be.



The roots of the JRPG can be traced back not just to the United States but to a very specific place and time there: to the American Midwest in the early 1970s, where and when Gary Gygax and Dave Arneson, a pair of stolid grognards who would have been utterly nonplussed by the emotional histrionics of a Final Fantasy VII, created a “single-unit wargame” called Dungeons & Dragons. I wrote quite some years ago on this site that their game’s “impact on the culture at large has been, for better or for worse, greater than that of any single novel, film, or piece of music to appear during its lifetime.” I almost want to dismiss those words now as the naïve hyperbole of a younger self. But the thing is, I can’t; I have no choice but to stand by them. Dungeons & Dragons really was that earthshaking, not only in the obvious ways — it’s hard to imagine the post-millennial craze for fantasy in mass media, from the Lord of the Rings films to Game of Thrones, ever taking hold without it — but also in subtler yet ultimately more important ones, in the way it changed the role we play in our entertainments from that of passive spectators to active co-creators, making interactivity the watchword of an entire age of media.

The early popularity of Dungeons & Dragons coincided with the rise of accessible computing, and this proved a potent combination. Fans of the game with access to PLATO, a groundbreaking online community rooted in American universities, moved it as best they could onto computers, yielding the world’s first recognizable CRPGs. Then a couple of PLATO users named Robert Woodhead and Andrew Greenberg made a game of this type for the Apple II personal computer in 1981, calling it Wizardry. Meanwhile Richard Garriott was making Ultima, a different take on the same broad concept of “Dungeons & Dragons on a personal computer.”

By the time Final Fantasy VII stormed the gates of the American market so triumphantly in 1997, the cultures of gaming in the United States and Japan had diverged so markedly that one could almost believe they had never had much of anything to do with one another. Yet in these earliest days of digital gaming — long before the likes of the Nintendo Entertainment System, when Japanese games meant only coin-op arcade hits like Space Invaders, Pac-Man, and Donkey Kong in the minds of most Americans — there was in fact considerable cross-pollination. For Japan was the second place in the world after North America where reasonably usable, pre-assembled, consumer-grade personal computers could be readily purchased; the Japanese Sharp MZ80K and Hitachi MB-6880 trailed the American Trinity of 1977 — the Radio Shack TRS-80, Apple II, and Commodore PET — by less than a year. If these two formative cultures of computing didn’t talk to one another, whom else could they talk to?

Thus pioneering American games publishers like Sierra On-Line and Brøderbund forged links with counterparts in Japan. A Japanese company known as Starcraft became the world’s first gaming localizer, specializing in porting American games to Japanese computers and translating their text into Japanese for the domestic market. As late as the summer of 1985, Roe R. Adams III could write in Computer Gaming World that Sierra’s sprawling twelve-disk-side adventure game Time Zone, long since written off at home as a misbegotten white elephant, “is still high on the charts after three years” in Japan. Brøderbund’s platformer Lode Runner was even bigger, having swum like a salmon upstream in Japan, being ported from home computers to coin-op arcade machines rather than the usual reverse. It had even spawned the world’s first e-sports league, whose matches were shown on Japanese television.

At that time, the first Wizardry game and the second and third Ultima had only recently been translated and released in Japan. And yet if Adams was to be believed,[1] both games already

have huge followings. The computer magazines cover Lord British [Richard Garriott’s nom de plume] like our National Inquirer would cover a television star. When Robert Woodhead of Wizardry fame was recently in Japan, he was practically mobbed by autograph seekers. Just introducing himself in a computer store would start a near-stampede as people would run outside to shout that he was inside.

Robert Woodhead with Japanese Wizardry fans.

The Wizardry and Ultima pump had been primed in Japan by a game called The Black Onyx, created the year before in their image for the Japanese market by an American named Henk Rogers.[2] But his game was quickly eclipsed by the real deals that came directly out of the United States.

Wizardry in particular became a smashing success in Japan, even as a rather lackadaisical attitude toward formal and audiovisual innovation on the part of its masterminds was already condemning it to also-ran status against Ultima and its ilk in the United States. It undoubtedly helped that Wizardry was published in Japan by ASCII Corporation, that country’s nearest equivalent to Microsoft, with heaps of marketing clout and distributional muscle to bring to bear on any challenge. So, while the Wizardry series that American gamers knew petered out in somewhat anticlimactic fashion in the early 1990s after seven games,[3] it spawned close to a dozen Japanese-exclusive titles later in that decade alone, plus many more after the millennium, such that the franchise remains to this day far better known by everyday gamers in Japan than it is in the United States. Robert Woodhead himself spent two years in Japan in the early 1990s working on what would have been a Wizardry MMORPG, if it hadn’t proved to be just too big a mouthful for the hardware and telecommunications infrastructure at his disposal.

Box art helps to demonstrate Wizardry‘s uncanny legacy in Japan. Here we see the original 1981 American release of the first game.

And here we have a Japan-only Wizardry from a decade later, self-consciously echoing a foreboding, austere aesthetic that had become more iconic in Japan than it had ever been in its home country. (American Wizardry boxes from the period look nothing like this, being illustrated in a more conventional, colorful epic-fantasy style.)

Much of the story of such cultural exchanges inevitably becomes a tale of translation. In its original incarnation, the first Wizardry game had had the merest wisp of a plot. In this as in all other respects it was a classic hack-and-slash dungeon crawler: work your way down through ten dungeon levels and kill the evil wizard, finito. What background context there was tended to be tongue-in-cheek, more Piers Anthony than J.R.R. Tolkien; the most desirable sword in the game was called the “Blade of Cuisinart,” for Pete’s sake. Wizardry‘s Japanese translators, however, took it all in with wide-eyed earnestness, missing the winking and nodding entirely. They saw a rather grim, austere milieu a million miles away from the game that Americans knew — a place where a Cuisinart wasn’t a stainless-steel food processor but a portentous ancient warrior clan.

When the Japanese started to make their own Wizardry games, they continued in this direction, to almost hilarious effect if one knew the source material behind their efforts; it rather smacks of the post-apocalyptic monks in A Canticle for Leibowitz making a theology for themselves out of the ephemeral advertising copy of their pre-apocalyptic forebears. A franchise that had in its first several American releases aspired to be about nothing more than killing monsters for loot — and many of them aggressively silly monsters at that — gave birth to audio CDs full of po-faced stories and lore, anime films and manga books, a sprawling line of toys and miniature figures, even a complete tabletop RPG system. But, lest we Westerners begin to feel too smug about all this, know that the same process would eventually come to work in reverse in the JRPG field, with nuanced Japanese writing being flattened out and flat-out misunderstood by clueless American translators.

The history of Wizardry in Japan is fascinating by dint of its sheer unlikeliness, but the game’s importance on the global stage actually stems more from the Japanese games it influenced than from the ones that bore the Wizardry name right there on the box. For Wizardry, along with the early Ultima games, happened to catch the attention of Koichi Nakamura and Yuji Horii, a software-development duo who had already made several games together for a Japanese publisher called Enix. “Horii-san was really into Ultima, and I was really into Wizardry,” remembers Nakamura. This made sense. Nakamura was the programmer of the pair, naturally attracted to Wizardry‘s emphasis on tactics and systems. Horii, on the other hand, was the storytelling type, who wrote for manga magazines in addition to games, and was thus drawn to Ultima‘s quirkier, more sprawling world and its spirit of open-ended exploration. The pair decided to make their own RPG for the Japanese market, combining what they each saw as the best parts of Wizardry and Ultima.

Yuji Horii in the 1980s. Little known outside his home country, he is a celebrity inside its borders. In his book on Japanese videogame culture, Chris Kohler calls him a Steven Spielberg-like figure there, in terms both of name recognition and the style of entertainment he represents.

This was interesting, but not revolutionary in itself; you’ll remember that Henk Rogers had already done essentially the same thing in Japan with The Black Onyx before Wizardry and Ultima ever officially arrived there. Nevertheless, the choices Nakamura and Horii made as they set about their task give them a better claim to the title of revolutionaries on this front than Rogers enjoys. They decided that making a game that combined the best of Wizardry and Ultima really did mean just that: it did not mean, that is to say, throwing together every feature of each which they could pack in and calling it a day, as many a Western developer might have. They decided to make a game that was simpler than either of its inspirations, much less the two of them together.

Their reasons for doing so were artistic, commercial, and technical. In the realm of the first, Horii in particular just didn’t like overly complicated games; he was the kind of player who would prefer never to have to glance at a manual, whose ideal game intuitively communicated to you everything you needed to know in order to play it. In the realm of the second, the pair was sure that the average Japanese person, like the average person in most countries, felt the same as Horii; even in the United States, Ultima and Wizardry were niche products, and Nakamura and Horii had mass-market ambitions. And in the realm of the third, they were sharply limited in how much they could put into their RPG anyway, because they intended it for the Nintendo Famicom console, where their entire game — code, data, graphics, and sound — would have to fit onto a 64 K cartridge in lieu of floppy disks and would have to be steerable using an eight-button controller in lieu of a keyboard. Luckily, Nakamura and Horii already had experience with just this sort of simplification. Their most recent output had been inspired by the adventure games of American companies like Sierra and Infocom, but had replaced those games’ text parsers with controller-friendly multiple-choice menus.

In deciding to put American RPGs through the same wringer, they established one of the core attributes of the JRPG sub-genre: generally speaking, these games were and would remain simpler than their Western counterparts, which sometimes seemed to positively revel in their complexity as a badge of honor. Another attribute emerged fully-formed from the writerly heart of Yuji Horii. He crafted an unusually rich, largely linear plot for the game. Rather than being a disadvantage, he thought linearity would make this new style of console game “more accessible to consumers”: “We really focused on ensuring people would be able to experience the fun of the story.”

He called upon his friends at the manga magazines to help him illustrate his tale with large, colorful figures in that distinctly Japanese style that has become so immediately recognizable all over the world. At this stage, it was perhaps more prevalent on the box than in the game itself, the Famicom’s graphical fidelity being what it was. Nonetheless, another precedent that has held true in JRPGs right down to the present day was set by the overall visual aesthetic of this, the canonical first example of the breed. Ditto its audio aesthetic, which took the form of a memorable, melodic, eminently hummable chip-tune soundtrack. “From the very beginning, we wanted to create a warm, inviting world,” says Horii.

Dragon Quest. Ultima veterans will almost expect to meet Lord British on his throne somewhere. With its overhead view and its large over-world full of towns to be visited, Dragon Quest owed even more to Ultima than it did to Wizardry — unsurprisingly so, given that the former was the American RPG which its chief creative architect Yuji Horii preferred.

Dragon Quest was released on May 27, 1986. Console gamers — not only those in Japan, but anywhere on the globe — had never seen anything like it. Playing this game to the end was a long-form endeavor that could stretch out over weeks or months; you wrote down an alphanumeric code it provided to you on exit, then entered this code when you returned to the game in order to jump back to wherever you had left off.

That said, the fact that the entire game state could be packed into a handful of numbers and letters does serve to illustrate just how simple Dragon Quest really was at bottom. By the standards of only a few years later, much less today, it was pretty boring. Fighting random monsters wasn’t so much a distraction from the rest of the game as the only thing available to do; the grinding was the game. In 2012, critic Nick Simberg wondered at “how willing we were to sit down on the couch and fight the same ten enemies over and over for hours, just building up gold and experience points”; he compared Dragon Quest to “a child’s first crayon drawing, stuck with a magnet to the fridge.”
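It is easy to see in outline how such a password system works: the handful of values worth preserving are packed into a bit field, a checksum is appended so that a mistyped code is rejected rather than silently accepted, and the result is spelled out in a fixed alphabet of unambiguous characters. The Python sketch below shows the general technique only; the fields, the alphabet, and the checksum are invented for illustration and bear no relation to Dragon Quest’s actual scheme.

```python
# A toy password-save scheme: pack a few fields into bits, add a checksum,
# and spell the result in a fixed 32-character alphabet. The fields and
# alphabet here are invented; Dragon Quest's real encoding differed.

ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"   # 32 symbols -> 5 bits each

def encode(experience, gold, has_key):
    payload = (experience & 0xFFFF) << 17 | (gold & 0xFFFF) << 1 | int(has_key)
    checksum = sum(payload.to_bytes(5, "big")) & 0xFF   # catches most typos
    state = (payload << 8) | checksum
    chars = []
    for _ in range(9):                  # 41 bits fit in nine 5-bit symbols
        chars.append(ALPHABET[state & 0x1F])
        state >>= 5
    return "".join(reversed(chars))

def decode(password):
    state = 0
    for ch in password:
        state = (state << 5) | ALPHABET.index(ch)
    checksum, payload = state & 0xFF, state >> 8
    if checksum != sum(payload.to_bytes(5, "big")) & 0xFF:
        raise ValueError("mistyped password")
    return payload >> 17, (payload >> 1) & 0xFFFF, bool(payload & 1)

code = encode(1200, 560, True)          # the nine characters the player writes down
print(code, decode(code))               # round-trips to (1200, 560, True)
```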

And yet, as the saying goes, you have to start somewhere. Japanese gamers were amazed and entranced, buying 1 million copies of Dragon Quest in its first six months, over 2 million copies in all. And so a new sub-genre was born, inspired by American games but indelibly Japanese in a way The Black Onyx had not been. Many or most of the people who played and enjoyed Dragon Quest had never even heard of its original wellspring Dungeons & Dragons.

We all know what happens when a game becomes a hit on the scale of Dragon Quest. There were sequels — two within two years of the first game, then three more in the eight years after them, as the demands of higher production values slowed down Enix’s pace a bit. Wizardry was big in Japan, but it was nothing compared to Dragon Quest, which sold 2.4 million copies in its second incarnation, followed by an extraordinary 3.8 million copies in its third. Middle managers and schoolmasters alike learned to dread the release of a new entry in the franchise, as about half the population of Japan under a certain age would invariably call in sick that day. When Enix started bringing out the latest games on non-business days, a widespread urban legend said this had been done in accordance with a decree from the Japanese Diet, which demanded that “henceforth Dragon Quest games are to be released on Sunday or national holidays only”; the urban legend wasn’t true, but the fact that so many people in Japan could so easily believe it says something in itself. Just as the early American game Adventure lent its name to an entire genre that followed it, the Japanese portmanteau word for “Dragon Quest” — Dorakue — became synonymous with the RPG in general there, such that when you told someone you were “playing dorakue” you might really be playing one of the series’s countless imitators.

Giving any remotely complete overview of these dorakue games would require dozens of articles, along with someone to write them who knows far more about them than I do. But one name is inescapable in the field. I refer, of course, to Final Fantasy.


Hironobu Sakaguchi in 1991.

Legend has it that Hironobu Sakaguchi, the father of Final Fantasy, chose that name because he thought that the first entry in the eventual franchise would be the last videogame he ever made. A former professional musician with numerous and diverse interests, Sakaguchi had been working for the Japanese software developer and publisher Square for a few years already by 1987, designing and programming Famicom action games that he himself found rather banal and that weren’t even selling all that well. He felt ready to do something else with his life, was poised to go back to university to try to figure out what that thing ought to be. But before he did so, he wanted to try something completely different at Square.

Another, less dramatic but probably more accurate version of the origin story has it that Sakaguchi simply liked the way the words “final” and “fantasy” sounded together. At any rate, he convinced his managers to give him half a dozen assistants and six months to make a dorakue game.[4]

 

Final Fantasy I.

The very first Final Fantasy may not have looked all that different from Dragon Quest at first glance — it was still a Famicom game, after all, with all the audiovisual limitations that implies — but it had a story line that was more thematically thorny and logistically twisted than anything Yuji Horii might have come up with. As it began, you found yourself in the midst of a quest to save a princess from an evil knight, which certainly sounded typical enough to anyone who had ever played a dorakue game before. In this case, however, you completed that task within an hour, only to learn that it was just a prologue to the real plot. In his book-length history and study of the aesthetics of Japanese videogames, Chris Kohler detects an implicit message here: “Final Fantasy is about much more than saving the princess. Compared to the adventure that is about to take place, saving a princess is merely child’s play.” In fact, only after the prologue was complete did the opening credits finally roll, thus displaying another consistent quality of Final Fantasy: its love of unabashedly cinematic drama.

Still, for all that it was more narratively ambitious than what had come before, the first Final Fantasy can, like the first Dragon Quest, seem a stunted creation today. Technical limitations meant that you still spent 95 percent of your time just grinding for experience. “Final Fantasy may have helped build the genre, but it didn’t necessarily know exactly how to make it fun,” acknowledges Aidan Moher in his book about JRPGs. And yet when it came to dorakue games in the late 1980s, it seemed that Sakaguchi’s countrymen were happy to reward even the potential for eventual fun. They made Final Fantasy the solid commercial success that had heretofore hovered so frustratingly out of reach of its creator; it sold 400,000 copies. Assured that he would never have to work on a mindless action game again, Sakaguchi agreed to stay on at Square to build upon its template.

Final Fantasy II, which was released exactly one year after the first game in December of 1988 and promptly doubled its sales, added more essential pieces to what would become the franchise’s template. Although labelled and marketed as a sequel, its setting, characters, and plot had no relation to what had come before. Going forward, it would remain a consistent point of pride with Sakaguchi to come up with each new Final Fantasy from whole cloth, even when fans begged him for a reunion with their favorite places and people. In a world afflicted with the sequelitis that ours is, he can only be commended for sticking to his guns.

In another sense, though, Final Fantasy II was notable for abandoning a blank slate rather than embracing it. For the first time, its players were given a pre-made party full of pre-made personalities to guide rather than being allowed to roll their own. Although they could rename the characters if they were absolutely determined to do so — this ability would be retained as a sort of vestigial feature as late as Final Fantasy VII — they were otherwise set in stone, the better to serve the needs of the set-piece story Sakaguchi wanted to tell. This approach, which many players of Western RPGs did and still do regard as a betrayal of one of the core promises of the genre, would become commonplace in JRPGs. Few contrasts illustrate so perfectly the growing divide between these two visions of the RPG: the one open-ended and player-driven, sometimes to a fault; the other tightly scripted and story-driven, again sometimes to a fault. In a Western RPG, you write a story for yourself; in a JRPG, you live a story that someone else has already written for you.

Consider, for example, the two lineages’ handling of mortality. If one of your characters dies in battle in a Western RPG, it might be difficult and expensive, or in some cases impossible, to restore her to life; in this case, you either revert to an earlier saved state or you just accept her death as another part of the story you’re writing and move on to the next chapter with an appropriately heavy heart. In a JRPG, on the other hand, death in battle is never final; it’s almost always easy to bring a character who gets beat down to zero hit points back to life. What are truly fatal, however, are pre-scripted deaths, the ones the writers have deemed necessary for storytelling purposes. Final Fantasy II already contained the first of these; years later, Final Fantasy VII would be host to the most famous of them all, a death so shocking that you just have to call it that scene and everyone who has ever played the game will immediately know what you’re talking about. To steal a phrase from Graham Nelson, the narrative always trumps the crossword in JRPGs; they happily override their gameplay mechanics whenever the story they wish to tell demands it, creating an artistic and systemic discontinuity that’s enough to make Aristotle roll over in his grave. Yet a huge global audience of players are not bothered at all by it — not if the story is good enough.

But we’ve gotten somewhat ahead of ourselves; the evolution of the 1980s JRPG toward the modern-day template came in fits and starts rather than as a linear progression. Final Fantasy III, which was released in 1990, actually returned to a player-generated party, and yet the market failed to punish it for its conservatism. Far from it: it sold 1.4 million copies.

Final Fantasy IV, on the other hand, chose to double down on the innovations Final Fantasy II had deployed, and sold in about the same numbers as Final Fantasy III. Released in July of 1991, it provided you with not just a single pre-made party but an array of characters who moved in and out of your control as the needs of the plot dictated, thereby setting yet another longstanding precedent for the series going forward. Ditto the nature of the plot, which leaned into shades of gray as never before. Chris Kohler:

The story deals with mature themes and complex characters. In Final Fantasy II, the squeaky-clean main characters were attacked by purely evil dark knights; here, our main character is a dark knight struggling with his position, paid to kill innocents, trying to reconcile loyalty to his kingdom with his sense of right and wrong. He is involved in a sexual relationship. His final mission for the king turns out to be a mass murder: the “phantom monsters” are really just a town of peaceful humans whose magic the corrupt king has deemed dangerous. (Note the heavy political overtones.)

Among Western RPGs, only the more recent Ultima games had dared to deviate so markedly from the absolute-good-versus-absolute-evil tales of everyday heroic fantasy. (In fact, the plot of Final Fantasy IV bears a lot of similarities to that of Ultima V…)

Ever since Final Fantasy IV, the series has been filled with an inordinate number of moody young James Deans and long-suffering Natalie Woods who love them.

Final Fantasy IV was also notable for introducing an “active-time battle system,” a hybrid between the turn-based systems the series had previously employed and real-time combat, designed to provide some of the excitement of the latter without completely sacrificing the tactical affordances of the former. (In a nutshell, if you spend too long deciding what to do when it’s your turn, the enemies will jump in and take another turn of their own while you dilly-dally.) It too would remain a staple of the franchise for many installments to come.
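In outline, the system is simple to describe: every combatant has a gauge that fills in real time at a rate tied to its speed, a full gauge earns a turn, and, crucially, the clock keeps running while the player stares at the command menu. The little Python sketch below captures that core loop and nothing more; the names and numbers are invented, and it makes no claim to reflect Square’s actual implementation.

```python
# A bare-bones "active-time battle" loop: gauges fill in real time according
# to each combatant's speed, and a full gauge earns a turn. Because the clock
# is not paused while the player decides, dawdling lets the enemies' gauges
# refill and grants them extra attacks. All names and numbers are invented.

import time

class Combatant:
    def __init__(self, name, speed, is_player=False):
        self.name, self.speed, self.is_player = name, speed, is_player
        self.gauge = 0.0                      # acts when this reaches 100

battle = [Combatant("Hero", speed=35, is_player=True),
          Combatant("Goblin", speed=25),
          Combatant("Imp", speed=20)]

last = time.monotonic()
for _ in range(100):                          # a few seconds of simulated battle
    now = time.monotonic()
    elapsed, last = now - last, now
    for c in battle:
        c.gauge = min(100.0, c.gauge + c.speed * elapsed * 10)
    for c in battle:
        if c.gauge >= 100.0:
            if c.is_player:
                # In "active" mode a real game would open the command menu here
                # WITHOUT pausing this loop, so the other gauges keep filling
                # while the player deliberates.
                print(f"{c.name} is ready -- choose a command!")
            else:
                print(f"{c.name} attacks while you deliberate.")
            c.gauge = 0.0
    time.sleep(0.05)
```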

Final Fantasy V, which was released in December of 1992, was, like Final Fantasy III, something of a placeholder or even a retrenchment, dialing back on several of the fourth game’s innovations. It sold almost 2.5 million copies.

Both the fourth and fifth games had been made for the Super Famicom, Nintendo’s 16-bit successor to its first console, and sported correspondingly improved production values. But most JRPG fans agree that it was with the sixth game — the last for the Super Famicom — that all the pieces finally came together into a truly frictionless whole. Indeed, a substantial and vocal minority will tell you that Final Fantasy VI rather than its immediate successor is the best Final Fantasy ever, balanced perfectly between where the series had been and where it was going.

Final Fantasy VI abandoned conventional epic-fantasy settings for a steampunk milieu out of Jules Verne. As we’ll see in a later article, Final Fantasy VII‘s setting would deviate even more from the norm. This creative restlessness is one of the series’s best traits, standing it in good stead in comparison to the glut of nearly indistinguishably Tolkienesque Western RPGs of the 1980s and 1990s.

From its ominous opening-credits sequence on, Final Fantasy VI strained for a gravitas that no previous JRPG had approached, and arguably succeeded in achieving it at least intermittently. It played out on a scale that had never been seen before; by the end of the game, more than a dozen separate characters had moved in and out of your party. Chris Kohler identifies the game’s main theme as “love in all its forms — romantic love, parental love, sibling love, and platonic love. Sakaguchi asks the player, what is love and where can we find it?”

Before that scene in Final Fantasy VII, Hironobu Sakaguchi served up a shocker of equal magnitude in Final Fantasy VI. Halfway through the game, the bad guys win despite your best efforts and the world effectively ends, leaving your party wandering through a post-apocalyptic World of Ruin like the characters in a Harlan Ellison story. The effect this had on some players’ emotions could verge on traumatizing — heady stuff for a videogame on a console still best known worldwide as the cuddly home of Super Mario. For many of its young players, Final Fantasy VI was their first close encounter on their own recognizance — i.e., outside of compulsory school assignments — with the sort of literature that attempts to move beyond tropes to truly, thoughtfully engage with the human condition.

It’s easy for an old, reasonably well-read guy like me to mock Final Fantasy VI‘s highfalutin aspirations, given that they’re stuffed into a game that still resolves at the granular level into bobble-headed figures fighting cartoon monsters. And it’s equally easy to scoff at the heavy-handed emotional manipulation that has always been part and parcel of the JRPG; subtle the sub-genre most definitely is not. Nonetheless, meaningful literature is where you find it, and the empathy it engenders can only be welcomed in a world in desperate need of it. Whatever else you can say about Final Fantasy and most of its JRPG cousins, the messages these games convey are generally noble ones, about friendship, loyalty, and the necessity of trying to do the right thing in hard situations, even when it isn’t so easy to even figure out what the right thing is. While these messages are accompanied by plenty of violence in the abstract, it is indeed abstracted — highly stylized and, what with the bifurcation between game and story that is so prevalent in the sub-genre, often oddly divorced from the games’ core themes.

Released in April of 1994, Final Fantasy VI sold 2.6 million copies in Japan. By this point the domestic popularity of the Final Fantasy franchise as a whole was rivaled only by that of Super Mario and Dragon Quest; two of the three biggest gaming franchises in Japan, that is to say, were dorakue games. In the Western world, however, the picture was quite different.

In the United States, the first-generation Nintendo Famicom was known as the Nintendo Entertainment System, the juggernaut of a console that rescued videogames in the eyes of the wider culture from the status of a brief-lived fad to that of a long-lived entertainment staple, on par with movies in terms of economics if not cachet. Yet JRPGs weren’t a part of that initial success story. The first example of the breed didn’t even reach American shores until 1989. It was, appropriately enough, the original Dragon Quest, the game that had started it all in Japan; it was renamed Dragon Warrior for the American market, due to a conflict with an old American tabletop RPG by the name of Dragonquest whose trademarks had been acquired by the notoriously litigious TSR of Dungeons & Dragons fame. Enix did make some efforts to modernize the game, such as replacing the password-based saving system with a battery that let you save your state to the cartridge itself. (This same method had been adopted by Final Fantasy and most other post-Dragon Quest JRPGs on the Japanese market as well.) But American console gamers had no real frame of reference for Dragon Warrior, and even the marketing geniuses of Nintendo, which published the game itself in North America, struggled to provide them one. With cartridges piling up in Stateside warehouses, they were reduced to giving away hundreds of thousands of copies of Dragon Warrior to the subscribers of Nintendo Power magazine. For some of these, the game came as a revelation seven years before Final Fantasy VII; for most, it was an inscrutable curiosity that was quickly tossed aside.

Final Fantasy I, on the other hand, received a more encouraging reception in the United States when it reached there in 1990: it sold 700,000 copies, 300,000 more than it had managed in Japan. Nevertheless, with the 8-bit Nintendo console reaching the end of its lifespan, Square didn’t bother to export the next two games in the series. It did export Final Fantasy IV for the Super Famicom — or rather the Super Nintendo Entertainment System, as it was known in the West. The results were disappointing in light of the previous game’s reception, so much so that Square didn’t export Final Fantasy V.[5] This habit of skipping over parts of the series led to a confusing state of affairs whereby the American Final Fantasy II was the Japanese Final Fantasy IV and the American Final Fantasy III was the Japanese Final Fantasy VI. The latter game shifted barely one-fourth as many copies in the three-times larger American marketplace as it had in Japan — not disastrous numbers, but still less than the first Final Fantasy had managed.

The heart of the problem was translation, in both the literal sense of the words on the screen and a broader cultural sense. Believing with some justification that the early American consoles from Atari and others had been undone by a glut of substandard product, Nintendo had long made a science out of the polishing of gameplay, demanding that every prospective release survive an unrelenting testing gauntlet before it was granted the “Nintendo Seal of Quality” and approved for sale. But the company had no experience or expertise in polishing text to a similar degree. In most cases, this didn’t matter; most Nintendo games contained very little text anyway. But RPGs were the exception. The increasingly intricate story lines which JRPGs were embracing by the early 1990s demanded good translations by native speakers. What many of them actually got was something very different, leaving even those American gamers who wanted to fall in love baffled by the Japanese-English-dictionary-derived word salads they saw before them. And then, too, many of the games’ cultural concerns and references were distinctly Japanese, such that even a perfect translation might have left Americans confused. It was, one might say, the Blade of Cuisinart problem in reverse.

To be sure, there were Americans who found all of the barriers to entry into these deeply foreign worlds to be more bracing than intimidating, who took on the challenge of meeting the games on their own terms, often emerging with a lifelong passion for all things Japanese. At this stage, though, they were the distinct minority. In Japan and the United States alike, the conventional wisdom through the mid-1990s was that JRPGs didn’t and couldn’t sell well overseas; this was regarded as a fact of life as fundamental as the vagaries of climate. (Thanks to this belief, none of the mainline Final Fantasy games to date had been released in Europe at all.) It would take Final Fantasy VII and a dramatic, controversial switch of platforms on the part of Square to change that. But once those things happened… look out. The JRPG would conquer the world yet.


Where to Get It: Remastered and newly translated versions of the Japanese Final Fantasy I, II, III, IV, V, and VI are available on Steam. The Dragon Quest series has been converted to iOS and Android apps, just a search away on the Apple and Google stores.



Did you enjoy this article? If so, please think about pitching in to help me make many more like it. You can pledge any amount you like.


Sources: the books Pure Invention: How Japan Made the Modern World by Matt Alt, Power-Up: How Japanese Video Games Gave the World an Extra Life by Chris Kohler, Fight, Magic, Items: The History of Final Fantasy, Dragon Quest, and the Rise of Japanese RPGs in the West by Aidan Moher, and Atari to Zelda: Japan’s Videogames in Global Contexts by Mia Consalvo. GameFan of September 1997; Retro Gamer 69, 108, and 170; Computer Gaming World of September 1985 and December 1992.

Online sources include Polygon‘s authoritative “Final Fantasy 7: An Oral History”; “The Long Life of the Original Wizardry” by guest poster Alex on The CRPG Addict blog; “Wizardry: Japanese Franchise Outlook” by Sam Derboo at Hardcore Gaming 101, plus an interview with Robert Woodhead, conducted by Jared Petty at the same site; “Wizardry‘s Wild Ride from West to East” at VentureBeat; “The Secret History of AnimEigo” at that company’s homepage; Robert Woodhead’s slides from a presentation at the 2022 KansasFest Apple II convention; a post on tabletop Wizardry at the Japanese Tabletop RPG blog; and “Dragon Warrior: Aging Disgracefully” by Nick Simberg at (the now-defunct) DamnLag.

Footnotes

1 Adams was not an entirely disinterested observer. He was already working with Robert Woodhead on Wizardry IV, and had in fact accompanied him to Japan in this capacity.
2 A man with an international perspective if ever there was one, Rogers would later go on to fame and fortune as the man who brought Tetris out of the Soviet Union.
3 It would be briefly revived for one final game, the appropriately named Wizardry 8, in 2001.
4 In another unexpected link between East and West, one of his most important assistants became Nasir Gebelli, an Iranian who had fled his country’s revolution for the United States in 1979 and become a game-programming rock star on the Apple II. After the heyday of the lone-wolf bedroom auteur began to fade there, Doug Carlston, the head of Brøderbund, brokered a job for him with his friends in Japan. There he maximized the Famicom’s potential in the same way he had that of the Apple II, despite not speaking a word of Japanese when he arrived. (“We’d go to a restaurant and no matter what he’d order — spaghetti or eggs — they’d always bring out steak,” Sakaguchi laughs.) Gebelli would program the first three Final Fantasy games almost all by himself.
5 Square did release a few spinoff games under the Final Fantasy label in the United States and Europe as another way of testing the Western market: Final Fantasy Legend and Final Fantasy Adventure for the Nintendo Game Boy handheld console, and Final Fantasy: Mystic Quest for the Super Nintendo. Although none of them were huge sellers, the Game Boy titles in particular have their fans even today.
 


 


A Dialog in Real Time (Strategy)

At the end of the 1990s, the two most popular genres in computer gaming were the first-person shooter and the real-time strategy game. They were so dominant that most of the industry’s executives seemed to want to publish little else. And yet at the beginning of the decade neither genre even existed.

The stories of how the two rose to such heady heights are a fascinating study in contrasts, of how influences in media can either go off like an explosion in a TNT factory or like the slow burn of a long fuse. Sometimes something appears and everyone knows instantly that it’s just changed everything; when the Beatles dropped Sgt. Pepper’s Lonely Hearts Club Band in 1967, there was no doubt that the proverbial goalposts in rock music had just been shifted. Other times, though, influence can take years to make itself felt, as was the case for another album of 1967, The Velvet Underground & Nico, about which Brian Eno would later famously say that it “only sold 10,000 copies, but everyone who bought it formed a band.”

Games are the same. Gaming’s Sgt. Pepper was DOOM, which came roaring up out of the shareware underground at the tail end of 1993 to sweep everything from its path, blowing away all of the industry’s extant conventional wisdom about what games would become and what role they would play in the broader culture. Gaming’s Velvet Underground, on the other hand, was the avatar of real-time strategy, which came to the world in the deceptive guise of a sequel in the fall of 1992. Dune II: The Building of a Dynasty sported its Roman numeral because its transnational publisher had gotten its transatlantic cables crossed and accidentally wound up with two separate games based on Frank Herbert’s epic 1965 science-fiction novel: one made in Paris, the other in Las Vegas. The former turned out to be a surprisingly evocative and playable fusion of adventure and strategy game, but it was the latter that would quietly — oh, so quietly in the beginning! — shift the tectonic plates of gaming.

For Dune II, which was developed by Westwood Studios and published by Virgin Games, really was the first recognizable implementation of the genre of real-time strategy as we have come to know it since. You chose one of three warring houses to play, then moved through a campaign made up of a series of set-piece scenarios, in which your first goal was always to make yourself an army by gathering resources and using them to build structures that could churn out soldiers, tanks, aircraft, and missiles, all of which you controlled by issuing them fairly high-level orders: “go here,” “harvest there,” “defend this building,” “attack that enemy unit.” Once you thought you were strong enough, you could launch your full-on assault on the enemy — or, if you weren’t quick enough, you might find yourself trying to fend off his attack. What made it so different from most of the strategy games of yore was right there in the name: in the fact that it all played out in real time, at a pace that ranged from the brisk to the frantic, making it a test of your rapid-fire mousemanship and your ability to think on your feet. Bits and pieces of all this had been seen before — perhaps most notably in Peter Molyneux and Bullfrog’s Populous and the Sega Genesis game Herzog Zwei — but Dune II was where it all came together to create a gaming paradigm for the ages.

That said, Dune II was very much a diamond in the rough, a game whose groundbreaking aspirations frequently ran up against the brick wall of its limitations. It’s likely to leave anyone who has ever played almost any other real-time-strategy game seething with frustration. It runs at a resolution of just 320 X 200, giving only the tiniest window into the battlefield; it only lets you select and control one unit at a time, making coordinated attacks and defenses hard to pull off; its scenarios are somewhat rote exercises, differing mainly in the number of enemy hordes they throw against you as you advance through the campaign rather than the nature of the terrain or your objectives. Even its fog of war is wonky: the whole battlefield is blank blackness until one of your units gets within visual range, after which you can see everything that goes on there forevermore, whether any of your units can still lay eyes on it or not. And it has no support whatsoever for the multiplayer free-for-alls that are for many or most players the biggest draw of the genre.

Certainly Virgin had no inkling that they had a nascent ludic revolution on their hands. They released Dune II with more of an indifferent shrug than a fulsome fanfare, having expended most of their promotional energies on the other Dune, which had come out just a few months earlier. It’s a testimony to the novelty of the gameplay experience that it did as well as it did. It didn’t become a massive hit, but it sold well enough to earn its budget back and then some on the strength of reasonably positive reviews — although, again, no reviewer had the slightest notion that he was witnessing the birth of what would be one of the two hottest genres in gaming six years in the future. Even Westwood seemed initially to regard Dune II as a one-and-done. They wouldn’t release another game in the genre they had just invented for almost three years.

But the gaming equivalent of all those budding bedroom musicians who listened to that Velvet Underground record was also out there in the case of Dune II. One hungry, up-and-coming studio in particular decided there was much more to be done with the approach it had pioneered. And then Westwood themselves belatedly jumped back into the fray. Thanks to the snowball that these two studios got rolling in earnest during the mid-1990s, the field of real-time strategy would be well and truly saturated by the end of the decade, the yin to DOOM‘s yang. This, then, is the tale of those first few years of these two studios’ competitive dialog, over the course of which they turned the real-time strategy genre from a promising archetype into one of gaming’s two biggest, slickest crowd pleasers.


Blizzard Entertainment is one of the most successful studios in the history of gaming, so much so that it now lends its name to the Activision Blizzard conglomerate, with annual revenues in the range of $7.5 billion. In 1993, however, it was Westwood, flying high off the hit dungeon crawlers Eye of the Beholder and Lands of Lore, that was by far the more recognizable name. In fact, Blizzard wasn’t even known yet as Blizzard.

The company had been founded in late 1990 by Allen Adham and Mike Morhaime, a couple of kids fresh out of university, on the back of a $15,000 loan from Morhaime’s grandmother. They called their venture Silicon & Synapse, setting it up in a hole-in-the-wall office in Costa Mesa, California. They kept the lights on initially by porting existing games from one platform to another for publishers like Interplay — the same way, as it happened, that Westwood had gotten off the ground almost a decade before. And just as had happened for Westwood, Silicon & Synapse gradually won opportunities to make their own games once they had proven themselves by porting those of others. First there was a little auto-racing game for the Super Nintendo called RPM Racing, then a pseudo-sequel to it called Rock ‘n’ Roll Racing, and then a puzzle platformer called The Lost Vikings, which appeared for the Sega Genesis, MS-DOS, and the Commodore Amiga in addition to the Super Nintendo. None of these titles took the world by storm, but they taught Silicon & Synapse what it took to create refined, playable, mass-market videogames from scratch. All three of those adjectives have continued to define the studio’s output for the past 30 years.

It was now mid-1993; Silicon & Synapse had been in business for more than two and a half years already. Adham and Morhaime wanted to do something different — something bigger, something that would be suitable for computers only rather than the less capable consoles, a real event game that would get their studio’s name out there alongside the Westwoods of the world. And here there emerged another of their company’s future trademarks: rather than invent something new from whole or even partial cloth, they decided to start with something that already existed, but make it better than ever before, polishing it until it gleamed. The source material they chose was none other than Westwood’s Dune II, now relegated to the bargain bins of last year’s releases, but a perennial after-hours favorite at the Silicon & Synapse offices. They all agreed as to the feature they most missed in Dune II: a way to play it against other people, like you could its ancestor Populous. The bane of most multiplayer strategy games was their turn-based nature, which left you waiting around half the time while your buddy was playing. Real-time strategy wouldn’t have this problem of downtime.

That became the design brief for Warcraft: Orcs & Humans: remake Dune II but make it even better, and then add a multiplayer feature. And then, of course, actually try to sell the thing in all the ways Virgin had not really tried to sell its inspiration.

To say that Warcraft was heavily influenced by Dune II hardly captures the reality. Most of the units and buildings to hand have a direct correspondent in Westwood’s game. Even the menu of icons on the side of the screen is a virtual carbon copy — or at least a mirror image. “I defensively joked that, while Warcraft was certainly inspired by Dune II, [our] game was radically different,” laughs Patrick Wyatt, the lead programmer and producer on the project. “Our radar mini-map was in the upper left corner of the screen, whereas theirs was in the bottom right corner.”

In the same spirit of change, Silicon & Synapse replaced the desert planet of Arrakis with a fantasy milieu pitting, as the subtitle would suggest, orcs against humans. The setting and the overall look of Warcraft owe almost as much to the tabletop miniatures game Warhammer as the gameplay does to Dune II; a Warhammer license was seriously considered, but ultimately rejected as too costly and potentially too restrictive. Years later, Wyatt’s father would give him a set of Warhammer miniatures he’d noticed in a shop: “I found these cool toys and they reminded me a lot of your game. You might want to have your legal department contact them because I think they’re ripping you off.”

Suffice to say, then, that Warcraft was even more derivative than most computer games. The saving grace was the same that it would ever be for this studio: that they executed their mishmash of influences so well. The squishy, squint-eyed art is stylized like a cartoon, a wise choice given that the game is still limited to a resolution of just 320 X 200, so that photo-realism is simply not on the cards. The overall look of Warcraft has more in common with contemporary console games than the dark, gritty aesthetic that was becoming so popular on computers. The guttural exclamations of the orcs and the exaggerated Monty Python and the Holy Grail-esque accents of the humans, all courtesy of regular studio staffers rather than outside voice actors, become a chorus line as you order them hither and yon, making Dune II seem rather stodgy and dull by comparison. “We felt too many games took themselves too seriously,” says Patrick Wyatt. “We just wanted to entertain people.”

Slavishly indebted though it is to Dune II in all the broad strokes, Warcraft doesn’t neglect to improve on its inspiration in those nitty-gritty details that can make the difference between satisfaction and frustration for the player. It lets you select up to four units and give them orders at the same time by simply dragging a box around them, a quality-of-life addition whose importance is difficult to overstate, one so fundamental that no real-time-strategy game from this point forward would dare not to include it. Many more keyboard shortcuts are added, a less technically impressive addition but one no less vital to the cause of playability when the action starts to heat up. There are now two resources you need to harvest, lumber and gold, in place of Dune II‘s all-purpose spice. Units are now a little more intelligent about interpreting your orders, such that they no longer blithely ignore targets of opportunity, or let themselves get mauled to death without counterattacking just because you haven’t explicitly told them to. Scenario design is another area of marked improvement: whereas every Dune II scenario is basically the same drill, just with ever more formidable enemies to defeat, Warcraft‘s are more varied and arise more logically out of the story of the campaign, including a couple of special scenarios with no building or gathering at all, where you must return a runaway princess to the fold (as the orcs) or rescue a stranded explorer (as the humans).
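(For the programmers in the audience, here’s a rough Python sketch of how simple — and how consequential — that drag-box selection really is. The Unit type, its field names, and the coordinates are my own invented stand-ins for illustration; nothing below comes from Blizzard’s actual engine.)

```python
from dataclasses import dataclass

# Hypothetical sketch of drag-box unit selection with Warcraft's four-unit cap.

@dataclass
class Unit:
    x: int
    y: int
    is_friendly: bool = True

def units_in_drag_box(units, x1, y1, x2, y2, limit=4):
    """Return up to `limit` friendly units inside the rectangle the player dragged."""
    left, right = sorted((x1, x2))
    top, bottom = sorted((y1, y2))
    selected = []
    for unit in units:
        if unit.is_friendly and left <= unit.x <= right and top <= unit.y <= bottom:
            selected.append(unit)
            if len(selected) == limit:   # Warcraft stopped at four; later games raised or dropped the cap
                break
    return selected

# Example: drag a box over a small squad of five, but only four make the cut.
squad = [Unit(10, 10), Unit(12, 11), Unit(13, 9), Unit(11, 12), Unit(14, 14), Unit(30, 40)]
print(len(units_in_drag_box(squad, 5, 5, 20, 20)))   # -> 4
```

Trivial as the sketch looks, that `limit` value is precisely the lever the two studios would keep pulling: Command & Conquer removed the cap entirely, and Warcraft II raised it to nine.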

The orc on the right who’s stroking his “sword” looks so very, very wrong — and this screenshot doesn’t even show the animation…

And, as the cherry on top, there was multiplayer support. Patrick Wyatt finished his first, experimental implementation of it in June of 1994, then rounded up a colleague in the next cubicle over so that they could become the first two people ever to play a full-fledged real-time-strategy game online. “As we started the game, I felt a greater sense of excitement than I’d ever known playing any other game,” he says.

It was just this magic moment, because it was so invigorating to play against a human and know that it wasn’t some stupid AI. It was a player who was smart and doing his absolute best to crush you. I knew we were making a game that would be fun, but at that moment I knew the game would absolutely kick ass.

While work continued on Warcraft, the company behind it was going through a whirlwind of changes. Recognizing at long last that “Silicon & Synapse” was actually a pretty terrible name, Adham and Morhaime changed it to Chaos Studios, which admittedly wasn’t all that much better, in December of 1993. Two months later, they got an offer they couldn’t refuse: Davidson & Associates, a well-capitalized publisher of educational software that was looking to break into the gaming market, offered to buy the freshly christened Chaos for the princely sum of $6.75 million. It was a massive over-payment for what was in all truth a middling studio at best, such that Adham and Morhaime felt they had no choice but to accept, especially after Davidson vowed to give them complete creative freedom. Three months after the acquisition, the founders decided they simply had to find a decent name for their studio before releasing Warcraft, their hoped-for ticket to the big leagues. Adham picked up a dictionary and started leafing through it. He hit pay dirt when his eyes flitted over the word “blizzard.” “It’s a cool name! Get it?” he asked excitedly. And that was that.

So, Warcraft hit stores in time for the Christmas of 1994, with the name of “Blizzard Entertainment” on the box as both its developer and its publisher — the wheels of the latter role being greased by the distributional muscle of Davidson & Associates. It was not immediately heralded as a game that would change everything, any more than Dune II had been; real-time strategy continued to be more of a slowly growing snowball than the ton of bricks to the side of the head that the first-person shooter had been. Computer Gaming World magazine gave Warcraft a cautious four stars out of five, saying that “if you enjoy frantic real-time games and if you don’t mind a linear structure in your strategic challenges, Warcraft is a good buy.” At the same time, the extent of the game’s debt to Dune II was hardly lost on the reviewer: “It’s a good thing for Blizzard that there’s no precedent for ‘look and feel’ lawsuits in computer entertainment.”[1]

Warcraft would eventually sell 400,000 units, bettering Dune II‘s numbers by a factor of four or more. As soon as it became clear that it was doing reasonably well, Blizzard started on a sequel.


Out of everyone who looked at Warcraft, no one did so with more interest — or with more consternation at its close kinship with Dune II — than the folks at Westwood. “When I played Warcraft, the similarities between it and Dune II were pretty… blatant, so I didn’t know what to think,” says the Westwood designer Adam Isgreen. Patrick Wyatt of Blizzard got the impression that his counterparts “weren’t exactly happy” at the slavish copying when they met up at trade shows, though he “reckoned they should have been pleased that we’d taken their game as a base for ours.” Only gradually did it become clear why Warcraft‘s existence was a matter of such concern for Westwood: because they themselves had finally decided to make another game in the style of Dune II.

The game that Westwood was making could easily have wound up looking even more like the one that Blizzard had just released. The original plan was to call it Command & Conquer: Fortress of Stone and to set it in a fantasy world. (Westwood had been calling their real-time-strategy engine “Command & Conquer” since the days of promoting Dune II.) “It was going to have goldmines and wood for building things. Sound familiar?” chuckles Westwood’s co-founder Louis Castle. “There were going to be two factions, humans and faerie folk… pretty fricking close to orcs versus humans.”

Some months into development, however, Westwood decided to change directions, to return to a science-fictional setting closer to that of Dune II. For they wanted their game to be a hit, and it seemed to them that fantasy wasn’t the best guarantee of such a thing: CRPGs were in the doldrums, and the most recent big strategy release with a fantasy theme, MicroProse’s cult-classic-to-be Master of Magic, hadn’t done all that well either. Foreboding near-future stories, however, were all the rage; witness the stellar sales of X-COM, another MicroProse strategy game of 1994. “We felt that if we were going to make something that was massive,” says Castle, “it had to be something that anybody and everybody could relate to. Everybody understands a tank; everybody understands a guy with a machine gun. I don’t have to explain to them what this spell is.” Westwood concluded that they had made the right decision as soon as they began making the switch in software: “Tanks and vehicles just felt better.” The game lost its subtitle to become simply Command & Conquer.

While the folks at Blizzard were plundering Warhammer for their units and buildings, those at Westwood were trolling the Jane’s catalogs of current military hardware and Soldier of Fortune magazine. “We assumed that anything that was talked about as possibly coming was already here,” says Castle, “and that was what inspired the units.” The analogue of Dune II‘s spice — the resource around which everything else revolved — became an awesomely powerful space-borne element come to earth known as tiberium.

Westwood included most of the shortcuts and conveniences that Blizzard had built into Warcraft, but went one or two steps further more often than not. For example, they also made it possible to select multiple units by dragging a box around them, but in their game there was no limit to the number of units that could be selected in this way. The keyboard shortcuts they added not only let you quickly issue commands to units and buildings, but also jump around the map instantly to custom viewpoints you could define. And up to four players rather than just two could now play together at once over a local network or the Internet, for some true mayhem. Then, too, scenario design was not only more varied than in Dune II but was even more so than in Warcraft, with a number of “guerilla” missions in the campaigns that involved no resource gathering or construction. It’s difficult to say to what extent these were cases of parallel innovation and to what extent they were deliberate attempts to one-up what Warcraft had done. It was probably a bit of both, given that Warcraft was released a good nine months before Command & Conquer, giving Westwood plenty of time to study it.

But other innovations in Command & Conquer were without any precedent. The onscreen menus could now be toggled on and off, for instance, a brilliant stroke that gave you a better view of the battlefield when you really needed it. Likewise, Westwood differentiated the factions in the game in a way that had never been done before. Whereas the different houses in Dune II and the orcs and humans in Warcraft corresponded almost unit for unit, the factions in Command & Conquer reflected sharply opposing military philosophies, demanding markedly different styles of play: the establishment Global Defense Initiative had slow, strong, and expensive units, encouraging a methodical approach to building up and husbanding your forces, while the terroristic Brotherhood of Nod had weaker but faster and cheaper minions better suited to madcap kamikaze rushes than carefully orchestrated combined-arms operations.

Yet the most immediately obvious difference between Command & Conquer and Warcraft was all the stuff around the game. Warcraft had been made on a relatively small budget with floppy disks in mind. It sported only a brief opening cinematic, after which scenario briefings consisted of nothing but scrolling text and a single voice over a static image. Command & Conquer, by contrast, was made for CD-ROM from the outset, by a studio with deeper pockets that had invested a great deal of time and energy into both 3D animation and full-motion video, that trendy art of incorporating real-world actors and imagery into games. The much more developed story line of Command & Conquer is forwarded by little between-mission movies that, if not likely to make Steven Spielberg nervous, are quite well-done for what they are, featuring as they do mostly professional performers — such as a local Las Vegas weatherman playing a television-news anchorman — who were shot by a real film crew in Westwood’s custom-built blue-screen studio. Westwood’s secret weapon here was Joseph Kucan, a veteran theater director and actor who oversaw the film shoots and personally played the charismatic Nod leader Kane so well that he became the very face of Command & Conquer in the eyes of most gamers, arguably the most memorable actual character ever associated with a genre better known for its hordes of generic little automatons. Louis Castle reckons that at least half of Command & Conquer‘s considerable budget went into the cut scenes.

The game was released with high hopes in August of 1995. Computer Gaming World gave it a pretty good review, four stars out of five: “The entertainment factor is high enough and the action fast enough to please all but the most jaded wargamers.”

The gaming public would take to it even more than that review might imply. But in the meantime…


As I noted in an earlier article, numbered sequels weren’t really commonplace for strategy games prior to the mid-1990s. Blizzard had originally imagined Warcraft as a strategy franchise of a different stripe: each game bearing the name would take the same real-time approach into a completely different milieu, as SSI was doing at the time with their “5-Star General” series of turn-based strategy games that had begun with Panzer General and continued with the likes of Fantasy General and Star General. But Blizzard soon decided to make their sequel a straight continuation of the first game, an approach to which real-time strategy lent itself much more naturally than more traditional styles of strategy game; the set-piece story of a campaign could, after all, always be continued using all the ways that Hollywood had long since discovered for keeping a good thing going. The only snafu was that either the orcs or the humans could presumably have won the war in the first game, depending on which side the player chose. No matter: Blizzard decided the sequel would be more interesting if the orcs had been the victors and ran with that.

Which isn’t to say that building upon its predecessor’s deathless fiction was ever the real point of Warcraft II: Tides of Darkness. Blizzard knew now that they had a competitor in Westwood, and were in any case eager to add to the sequel all of the features and ideas that time had not allowed them to include in the first game. There would be waterways and boats to sail on them, along with oil, a third resource, one that could only be mined at sea. Both sides would get new units to play with, while elves, dwarves, trolls, ogres, and goblins would join the fray as allies of one of the two main racial factions. The interface would be tweaked with another welcome shortcut: selecting a unit and right-clicking somewhere would cause it to carry out the most logical action there without having to waste time choosing from a menu. (After all, if you selected a worker unit and sent him to a goldmine, you almost certainly wanted him to start collecting gold. Why should you have to tell the game the obvious in some more convoluted fashion?)
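(In practice, “the most logical action” amounts to a small decision table keyed on whatever was under the cursor when you right-clicked. A hypothetical Python sketch of the idea follows; the types and attribute names are my own inventions for illustration, not anything from Blizzard’s code.)

```python
from dataclasses import dataclass

# Hypothetical sketch of Warcraft II's right-click "do the obvious thing" shortcut.

@dataclass
class Unit:
    is_worker: bool

@dataclass
class Target:
    kind: str   # e.g. "goldmine", "forest", "enemy", "ground"

def default_right_click_order(unit, target):
    """Pick the most sensible order for this unit at whatever was right-clicked."""
    if unit.is_worker and target.kind == "goldmine":
        return "harvest"
    if unit.is_worker and target.kind == "forest":
        return "chop"
    if target.kind == "enemy":
        return "attack"
    return "move"   # fall back to a plain move order

print(default_right_click_order(Unit(is_worker=True), Target(kind="goldmine")))   # -> harvest
```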

But perhaps the most vital improvement was in the fog of war. The simplistic implementations of same seen in the first Warcraft and Command & Conquer were inherited from Dune II: areas of the map that had been seen once by any of your units were revealed permanently, even if said units went away or were destroyed. Blizzard now made it so that you would see only a back-dated snapshot of areas currently out of your units’ line of sight, reflecting what was there the last time one of your units had eyes on them. This innovation, no mean feat of programming on the part of Patrick Wyatt, brought a whole new strategic layer to the game. Reconnaissance suddenly became something you had to think about all the time, not just once.
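(To make the distinction concrete: the older games only ever needed to remember which tiles had been explored, while Warcraft II also has to cache a snapshot of what each tile looked like the last time it was visible. Here’s a rough Python sketch of the two models under that assumption of a tile-based map; every name in it is hypothetical, not drawn from Blizzard’s or Westwood’s code.)

```python
# Hypothetical sketch contrasting permanent-reveal fog of war with
# Warcraft II's "last-seen snapshot" model, on a simple tile-based map.

class FogOfWar:
    def __init__(self):
        self.explored = set()    # tiles ever seen: the Dune II / first-Warcraft model reveals these forever
        self.last_seen = {}      # tile -> snapshot of contents: the Warcraft II model shows stale data instead

    def update(self, visible_tiles, world):
        """Call once per tick with the set of tiles the player's units can currently see."""
        for tile in visible_tiles:
            self.explored.add(tile)
            self.last_seen[tile] = world.get(tile, "empty")   # refresh the snapshot while we have eyes on it

    def view(self, tile, visible_tiles, world):
        if tile in visible_tiles:
            return world.get(tile, "empty")    # live view of the tile
        if tile in self.last_seen:
            return self.last_seen[tile]        # back-dated snapshot; anything that moved since is hidden
        return "black"                         # never explored: draw solid blackness

# Example: an enemy tank is spotted once, then slips away while unobserved.
world = {(5, 5): "enemy tank"}
fog = FogOfWar()
fog.update({(5, 5)}, world)             # our scout spots the tank
world[(5, 5)] = "empty"                 # the tank later drives off unseen
print(fog.view((5, 5), set(), world))   # -> "enemy tank" (a stale snapshot, as in Warcraft II)
```

The strategic payoff described above falls straight out of that `last_seen` cache: what the map shows you may be minutes out of date, so you have to keep scouting.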

Other improvements were not so conceptually groundbreaking, but no less essential for keeping ahead of the Joneses (or rather the Westwoods). For example, Blizzard raised the screen-resolution stakes, from 320 X 200 to 640 X 480, even as they raised the number of people who could play together online from Command & Conquer‘s four to eight. And, while there was still a limit on the number of units you could select at one time using Blizzard’s engine, that limit at least got raised from the first Warcraft‘s four to nine.

The story and its presentation, however, didn’t get much more elaborate than last time out. While Westwood was hedging its bets by keeping one foot in the “interactive movie” space of games like Wing Commander III, Blizzard was happy to “just” make Warcraft a game. The two series were coming to evince very distinct personalities and philosophies, just as gamers were sorting themselves into opposing groups of fans — with a large overlap of less partisan souls in between them, of course.

Released in December of 1995, Warcraft II managed to shake Computer Gaming World free of some of its last reservations about the burgeoning genre of real-time strategy, garnering four and a half stars out of five: “If you enjoy fantasy gaming, then this is a sure bet for you.” It joined Command & Conquer near the top of the bestseller lists, becoming the game that well and truly made Blizzard a name to be reckoned with, a peer in every sense with Westwood.

Meanwhile, and despite the sometimes bitter rivalry between the two studios and their fans, Command & Conquer and Warcraft II together made real-time strategy into a commercial juggernaut. Both games became sensations, with no need to shirk from comparison to even DOOM in terms of their sales and impact on the culture of gaming. Each eventually sold more than 3 million copies, numbers that even the established Westwood, much less the upstart Blizzard, had never dreamed of reaching before, enough to enshrine both games among the dozen or so most popular computer games of the entire 1990s. More than three years after real-time strategy’s first trial run in Dune II, the genre had arrived for good and all. Both Westwood and Blizzard rushed to get expansion packs of additional scenarios for their latest entries in the genre to market, even as dozens of other developers dropped whatever else they were doing in order to make real-time-strategy games of their own. Within a couple of years, store shelves would be positively buckling under the weight of their creations — some good, some bad, some more imaginative, some less so, but all rendered just a bit anonymous by the sheer scale of the deluge. And yet even the most also-ran of the also-rans sold surprisingly well, which explained why they just kept right on coming. Not until well into the new millennium would the tide begin to slacken.


With Command & Conquer and Warcraft II, Westwood and Blizzard had arrived at an implementation of real-time strategy that even the modern player can probably get on with. Yet there is one more game that I just have to mention here because it’s so loaded with a quality that the genre is known for even less than its characters: that of humor. Command & Conquer: Red Alert is as hilarious as it is unexpected, the only game of this style that’s ever made me laugh out loud.

Red Alert was first envisioned as a scenario pack that would move the action of its parent game to World War II. But two things happened as work progressed on it: Westwood decided it was different enough from the first game that it really ought to stand alone, and, as designer Adam Isgreen says, “we found straight-up history really boring for a game.” What they gave us instead of straight-up history is bat-guano insane, even by the standards of videogame fictions.

We’re in World War II, but in a parallel timeline, because Albert Einstein — why him? I have no idea! — chose to travel back in time on the day of the Trinity test of the atomic bomb and kill Adolf Hitler. Unfortunately, all that’s accomplished is to make world conquest easier for Joseph Stalin. Now Einstein is trying to save the democratic world order by building ever more powerful gadgets for its military. Meanwhile the Soviet Union is experimenting with the more fantastical ideas of Nikola Tesla, which in this timeline actually work. So, the battles just keep getting crazier and crazier as the game wears on, with teleporters sending units jumping instantly from one end of the map to the other, Tesla coils zapping them with lightning, and a fetching commando named Tanya taking out entire cities all by herself when she isn’t chewing the scenery in the cut scenes. Those actually display even better production values than the ones in the first game, but the script has become pure, unadulterated camp worthy of Mel Brooks, complete with a Stalin who ought to be up there singing and dancing alongside Der Führer in Springtime for Hitler. Even our old friend Kane shows up for a cameo. It’s one of the most excessive spectacles of stupidity I’ve ever seen in a game… and one of the funniest.

Joseph Stalin gets rough with an underling. When you don’t have the Darth Vader force grip, you have to do things the old-fashioned way…

Up there at the top is the killer commando Tanya, who struts across the battlefield with no regard for proportion.

Released in the dying days of 1996, Red Alert didn’t add that much that was new to the real-time-strategy template, technically speaking; in some areas such as fog of war, it still lagged behind the year-old Warcraft II. Nonetheless, it exudes so much joy that it’s by far my favorite of the games I’ve written about today. If you ask me, it would have been a better gaming world had the makers of at least a few of the po-faced real-time-strategy games that followed looked here for inspiration. Why not? Red Alert too sold in the multiple millions.






(Sources: the book Stay Awhile and Listen, Book I by David L. Craddock; Computer Gaming World of January 1995, March 1995, December 1995, March 1996, June 1996, September 1996, December 1996, March 1997, June 1997, and July 1997; Retro Gamer 48, 111, 128, and 148; The One of January 1993; the short film included with the Command & Conquer: The First Decade game collection. Online sources include Patrick Wyatt’s recollections at his blog Code of Honor, Dan Griliopoulos’s collection of interviews with Westwood alumni at Funambulism, Soren Johnson’s interview with Louis Castle for his Designer’s Notes podcast, and Richard Moss’s real-time-strategy retrospective for Ars Technica.

Warcraft: Orcs & Humans and Warcraft II: Tides of Darkness are available as digital purchases at GOG.com. The first Command & Conquer and Red Alert are available in remastered versions as a bundle from Steam.)

Footnotes
1 This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.
 
