
Starcraft (A History in Two Acts)

Act 1: Starcraft the Game



Great success brings with it great expectations. And sometimes it brings an identity crisis as well.

After Blizzard Entertainment’s Warcraft: Orcs and Humans became a hit in 1995, the company started down a very conventional path for a new publisher feeling its oats, initiating a diverse array of projects from internal and external development teams. In addition to the inevitable Warcraft sequel, there were a streamlined CRPG known as Diablo, a turn-based tactical-battle game known as Shattered Nations, a 4X grand-strategy game known as Pax Imperia II, even an adventure game taking place in the Warcraft universe — to be called, naturally enough, Warcraft Adventures. Then, too, even before Warcraft II: Tides of Darkness was finished, Blizzard had already started on a different sort of spinoff than Warcraft Adventures, one which jettisoned the fantasy universe but stayed within the same gameplay genre of real-time strategy. It was to be called Starcraft, and was to replace fantasy with science fiction. Blizzard thought that one team could crank it out fairly quickly using the existing Warcraft II engine, while another one retooled their core RTS technology for Warcraft III.

In May of 1996, with Warcraft II now six months old and a massive hit, Blizzard brought early demos of Starcraft and most of their other works in progress to E3, the games industry’s new annual showcase. One could make a strong argument that the next few days on the E3 show floor were the defining moment for the Blizzard brand as we still know it today.

The version of Starcraft that Blizzard brought to the 1996 E3 show. Journalists made fun of its fluorescent purple color palette among other things. Was this game being designed by Prince?

The gaming press was not particularly kind to the hodgepodge of products that Blizzard showed them at E3. They were especially cruel to Starcraft, which they roundly mocked for being exactly what it was, a thinly reskinned version of Warcraft II — or, as some journalists took to calling it, Orcs in Space. Everyone from Blizzard came home badly shaken by the treatment. So, after a period of soul searching and much fraught internal discussion, Blizzard’s co-founders Allen Adham and Mike Morhaime decided not to be quite so conventional in the way they ran their business. They took a machete to their jungle of projects which seemed to have spontaneously sprouted out of nowhere as soon as the money started to roll in. When all was said and done, they allowed only two of them to live on: Diablo, which was being developed at the newly established Blizzard North, of San Mateo, California; and Starcraft, down at Blizzard South in Irvine, California. But the latter was no longer to be just a spinoff. “We realized, this product’s just going to suck,” says Blizzard programmer Pat Wyatt of the state of the game at that time. “We need to have all our effort put into it. And everything about it was rebooted: the team that was working on it, the leadership, the design, the artwork — everything was changed.”

Blizzard’s new modus operandi would be to publish relatively few games, but to make sure that each and every one of them was awesome, no matter what it took. In pursuit of that goal, they would do almost everything in-house, and they would release no game before its time. The time of Starcraft, that erstwhile quickie Warcraft spinoff, wouldn’t come until March of 1998, while Warcraft III wouldn’t drop until 2002. In defiance of all of the industry’s conventional wisdom, the long gaps between releases wouldn’t prove ruinous; quite the opposite, in fact. Make the games awesome, Blizzard would learn, and the gamers will be there waiting with money in hand when they finally make their appearance.

Adham and Morhaime fostered as non-hierarchical a structure as possible at Blizzard, such that everyone, regardless of their ostensible role — from programmers to artists, testers to marketers — felt empowered to make design suggestions, knowing that they would be acted upon if they were judged worthy by their peers. Thus, although James Phinney and Chris Metzen were credited as “lead designers” on Starcraft, the more telling credit is the one that attributes the design as a whole simply to “Blizzard Entertainment.” The founders preferred to promote from within, retaining the entry-level employees who had grown up with the Blizzard Way rather than trying to acclimatize outsiders who were used to less freewheeling approaches. Phinney and Metzen were typical examples: the former had started at Blizzard as a humble tester, the latter as a manual writer and line artist.

For all that Blizzard’s ambitions for Starcraft increased dramatically over the course of its development, it was never intended to be a radical formal departure from what had come before. From start to finish, it was nothing more nor less than another sprite-based 2D RTS like Warcraft II.  It was just to be a better iteration on that concept — so much better that it verged on becoming a sort of Platonic ideal for this style of game. Blizzard would keep on improving it until they started to run out of ideas for making it better still. Only then would they think about shipping it.

The finished Starcraft in action, looking much more chic than it did during its Orcs in Space days.

The exceptions to this rule of iteration rather than blue-sky invention all centered on the factions that you could either control or play against. There were three of them rather than the standard two, for one thing. But far more importantly, each of the factions was truly unique, in marked contrast to those of Warcraft and Warcraft II. In those games, the two factions’ units largely mirrored one another in a tit-for-tat fashion, merely substituting different names and sprites for the same sets of core functions. Yet Starcraft had what Blizzard liked to call an “asymmetric” design; each of the three factions played dramatically differently, with none of the neat one-to-one correspondences that had been the norm within the RTS genre prior to this point.

In fact, the factions could hardly have been more different from one another. There were the Terrans, Marines in space who talked like the drill sergeant in Full Metal Jacket and fought with rifles and tanks made out of good old reliable steel; the Zerg, an insectoid alien race in thrall to a central hive mind, all crunchy carapaces and savage slime; and the Protoss, aloof, enigmatic giants who could employ psionic powers as devastatingly as they could their ultra-high-tech weaponry.

The single-player campaign in which you got to take the factions for a spin was innovative in its way as well. Instead of asking you to choose a side to control at the outset, the campaign expected you to play all three of them in succession, working your way through a sprawling story of interstellar conflict, as told in no fewer than 30 individual scenarios. It cleverly began by placing you in control of the Terrans, the most immediately relatable faction, then moved on to the movie-monster-like Zerg and finally the profoundly alien Protoss once you’d gotten your sea legs.

Although it seems safe to say that the campaign was never the most exciting part of Starcraft for the hyper-competitive young men at Blizzard, they didn’t stint on the effort they put into it. They recognized that the story and cinematics of Westwood Studios’ Command & Conquer — all that stuff around the game proper — were the one area where that arch-rival RTS franchise had comprehensively outdone them to date. Determined to rectify this, they hired Harley D. Huggins II, a fellow who had done some CGI production on the recent film Starship Troopers — a movie whose overall aesthetic had more than a little in common with Starcraft — as the leader of their first dedicated cinematics team. The story can be a bit hard to follow, what with its sometimes baffling tangle of groups who are forever allying with and then betraying one another, the better to set up every possible permutation of battle. (As Blizzard wrote on their back-of-the-box copy, “The only allies are enemies!”) Still, no one can deny that the campaign is presented really, really well, from the cut-scenes that come along every few scenarios to the voice acting during the mission briefings, which turn into little audio dramas in themselves. That said, a surprising amount of the story is actually conveyed during the missions, when your objectives can unexpectedly change on a dime; this was new to the RTS genre.

One of the cut-scenes which pop up every few scenarios during the campaign. Blizzard’s guiding ethic was to make them striking but short, such that no one would be tempted to skip them. Their core player demographic was not known for its patience with long-winded exposition…

Nonetheless, any hardcore Starcraft player will tell you that multiplayer is where it’s really at. When Blizzard released Diablo in the dying days of 1996, they debuted alongside it Battle.net, a social space and matchmaking service for multiplayer sessions over the Internet. Its contribution to Diablo’s enormous success is incalculable. Starcraft was to be the second game supported by the service, and Blizzard had no reason to doubt that it would prove just as important if not more so to their latest RTS.

If all of Starcraft was to be awesome, multiplayer Starcraft had to be the most awesome part of all. This meant that the factions had to be balanced; it wouldn’t do to have the outcome of matches decided before they even began, based simply upon who was playing as whom. After the basic framework of the game was in place, Blizzard brought in a rare outsider, a tireless analytical mind by the name of Rob Pardo, to be a sort of balance specialist, looking endlessly for ways to break the game. He not only played it to exhaustion himself but watched match after match, including hundreds played over Battle.net by fans who were lucky enough to be allowed to join a special beta program, a forerunner of today’s Steam Early Access and its ilk. Rather than merely erasing the affordances that led to balance problems — affordances which were often among the funnest parts of the game — Pardo preferred to tweak the numbers involved and/or to implement possible countermeasures for the other factions, then throw the game out for yet another round of testing. This process added months to the development cycle, but no one seemed to mind. “We will release it when it’s ready,” remained Blizzard’s credo, in defiance of holidays, fiscal quarters, and all of the other everyday business logic of their industry. Luckily, the ongoing strong sales of Warcraft II and Diablo gave them that luxury.

Indeed, Blizzard veterans like to joke today that Starcraft was just two months away from release for a good fourteen months. They crunched and crunched and crunched, living lifestyles that were the opposite of healthy. “Relationships were destroyed,” admits Pat Wyatt. “People got sick.” At last, on March 27, 1998, the exhausted team pronounced the game done and sent it off to be pressed onto hundreds of thousands of CDs. The first boxed copies reached store shelves four days later.

Starcraft was a superb game by any standard, the most tactically intricate, balanced, and polished RTS to date, arguably for years still to come. It was familiar enough not to intimidate, yet fresh enough to make the purchase amply justifiable. Thanks to all of these qualities, it sold more than 1.5 million copies in the first nine months, becoming the biggest new computer game of the year. By the end of 1998, Battle.net was hosting more than 100,000 concurrent users during peak hours. Blizzard was now the hottest name in computer gaming; they had left even id Software — not to mention Westwood of Command & Conquer fame — in their dust.

There was always a snowball effect when it came to online games in particular; everyone wanted the game their friends were already playing, so that they too could get in on the communal fun. Thus Starcraft continued to sell well for years and years, flirting with 10 million units worldwide before all was said and done, by which time it had become almost synonymous with the RTS in general for many gamers. Although your conclusions can vary depending on where you move the goalposts — Myst sold more units during the 1990s — Starcraft has at the very least a reasonable claim to the title of most successful single computer game of its decade. Everyone who played games during its pre- and post-millennial heyday, everyone who had a friend that did so, everyone who even had a friend of a friend that did so remembers Starcraft today. It became that inescapable. And yet the Starcraft mania in the West was nothing compared to the fanaticism it engendered in one mid-sized Asian country.

If you had told the folks at Blizzard on the day they shipped Starcraft that their game would soon be played for significant money by professional teams of young people who trained as hard or harder than traditional athletes, they would have been shocked. If you had told them that these digital gladiators would still be playing it fifteen years later, they wouldn’t have believed you. And if you had told them that all of this would be happening in, of all places, South Korea, they would have decided you were as crazy as a bedbug. But all of these things would come to pass.


Act 2: Starcraft and the Rise of Gaming as a Spectator Sport



Why Starcraft? And why South Korea?

We’ve gone a long way toward answering the first question already. More than any RTS that came before it and the vast majority of those that came after it, Starcraft lent itself to esports competition by being so incredibly well-balanced. Terran, Zerg, or Protoss… you could win (and lose) with any of them. The game was subtle and complex enough that viable new strategies would still be appearing a decade after its release. At the same time, though, it was immediately comprehensible in the broad strokes and fast-paced enough to be a viable spectator sport, with most matches between experienced players wrapping up within half an hour. A typical Command & Conquer or Age of Empires match lasted about twice as long, with far more downtime when little was happening in the way of onscreen excitement.

The question of why South Korea is more complicated to answer, but by no means impossible. In the three decades up to the mid-1990s, the country’s economy expanded like gangbusters. Its gross national product increased by an average of 8.2 percent annually, with average annual household income increasing from $80 to over $10,000 over that span. In 1997, however, all of that came to a crashing halt when an overenthusiastic and under-regulated banking sector collapsed like a house of cards, resulting in the worst recession in the country’s modern history. The International Monetary Fund had to step in to prevent a full-scale societal collapse, an intervention which South Koreans universally regarded as a profound national humiliation.

This might not seem like an environment overly conducive to a new fad in pop culture, but it proved to be exactly that. The economic crash left a lot of laid-off businessmen — in South Korea during this era, they were always men — looking for ways to make ends meet. With the banking system in free fall, there was no chance of securing much in the way of financing. So, instead of thinking on a national or global scale, as they had been used to doing, they thought about what they could do close to home. Some opened fried-chicken joints or bought themselves a taxicab. Others, however, turned to Internet cafés — or “PC bangs,” as they were called in the local lingo.

Prior to the economic crisis, the South Korean government hadn’t been completely inept by any means. It had seen the Internet revolution coming, and had spent a lot of money building up the country’s telecommunications infrastructure. But in South Korea as in all places, the so-called “last mile” of Internet connectivity was the most difficult to bring to an acceptable fruition. Even in North America and Western Europe, most homes could only access the Internet at this time through slow and fragile dial-up connections. South Korean PC bangs, however, jacked directly into the Internet from city centers, justifying the expense of doing so with economies of scale: 20 to 100 computers, each with a paying customer behind the screen, were a very different proposition from a single computer in the home.

The final ingredient in the cultural stew was another byproduct of the recession. An entire generation of young South Korean men found themselves unemployed or underemployed. (Again, I write about men alone here because South Korea was a rigidly patriarchal society at that time, although this is slowly — painfully slowly — changing now.) They congregated in the PC bangs, which gave them unfettered access to the Internet for about $2 per hour. It was hard to imagine a cheaper form of entertainment. The PC bangs became social scenes unto themselves, packed at all hours of the day and night with chattering, laughing youths who were eager to forget the travails of real life outside their four walls. They drank bubble tea and slurped ramen noodles while, more and more, they played online games, both against one another and against the rest of the country. In a way, they actually had it much better than the gamers who were venturing online in the Western world: they didn’t have to deal with all of the problems of dial-up modems, could game on rock-solid connections running at speeds of which most Westerners could only dream.

A few months after it had made its American debut, Starcraft fell out of the clear blue South Korean sky to land smack dab in the middle of this fertile field. The owners of the PC bangs bought copies and installed them for their customers’ benefit, just as they had plenty of other games already. But something about Starcraft scratched an itch that no PC-bang patron had known he had. The game became a way of life for millions of South Koreans, who became addicted to the adrenaline rush it provided. Soon many of the PC bangs could be better described as Starcraft bangs. Primary-school children and teenagers hung out there as well as twenty-somethings, playing and, increasingly, just watching others play, something you could do for free. The very best players became celebrities in their local community. It was an intoxicating scene, where testosterone rather than alcohol served as the social lubricant. Small wonder that the PC bangs outlived the crisis that had spawned them, remaining a staple of South Korean youth culture even after the economy got back on track and started chugging along nicely once again. In 2001, long after the crisis had passed, there were 23,548 PC bangs in the country — roughly as many Internet cafés as there were 7-Elevens.

Of course, the PC bangs were all competing with one another to lure customers through their doors. The most reliable way to do so was to become known as the place where the very best Starcraft players hung out. To attract such players, some enterprising owners began hosting tournaments, with prizes that ranged from a few hours of free computer time to as much as $1000 in cash. This was South Korean esports in their most nascent form.

The impresario who turned Starcraft into a professional sport as big as any other in the country was named Hwang Hyung Jun. During the late 1990s, Hwang was a content producer at a television station called Tooniverse, whose usual fare was syndicated cartoons. He first started to experiment with videogame programming in the summer of 1998, when he commemorated that year’s World Cup of Football by broadcasting simulated versions of each match, played in Electronic Arts’s World Cup 98. That led to other experiments with simulated baseball. (Chan Ho Park, the first South Korean to play Major League Baseball in North America, was a superstar on two continents at that time.)

But it was only when Hwang tried organizing and broadcasting a Starcraft tournament in 1999 that he truly hit paydirt. Millions were instantly entranced. Among them was a young PC bang hanger-on and Starcraft fanatic named Baro Hyun, who would go on to write a book about esports in his home country.

Late one afternoon, I returned from school, unloaded my backpack, and turned on the television in the living room. Thanks to my parents, we had recently subscribed to a cable-TV network with dozens of channels. As a cable-TV newbie, I navigated my way through what felt like a nearly infinite number of channels. Movie channel; next. Sports channel; next. Professional Go channel; popular among fathers, but a definite next for me.

Suddenly I stopped clicking and stared open-mouthed at the television. I could not believe what I was seeing. A one-on-one game of Starcraft was on TV.

Initially, I thought I’d stumbled across some sort of localized commercial made by Blizzard. Soon, however, it became obvious that wasn’t the case. The camera angle shifted from the game screen to the players. They were oddly dressed, like budget characters in Mad Max. Each one wore a headset and sat in front of a dedicated PC. They appeared to be engaged in a serious Starcraft duel.

This was interesting enough, but when I listened carefully, I could hear commentators explaining what was happening in the game. One explained the facts and game decisions of the players, while another interpreted what those decisions might mean to the outcome of the game. After the match, the camera angle switched to the caster and the commentators, who briefed viewers on the result of the game and the overall story. The broadcast gave the unmistakable impression of a professional sports match.

Esports history is made, as two players face off in one of the first Starcraft matches ever to be broadcast on South Korean television, from a kitschy set that looks to have been constructed from the leavings of old Doctor Who episodes.

These first broadcasts corresponded with the release of Brood War, Starcraft’s first and only expansion pack. Its development had been led by the indefatigable Rob Pardo, who used it to iron out the last remaining balance issues in the base game. (“Starcraft [alone] was not a game that could have been an esport,” wrote a super-fan bluntly years later in an online “Brief History of Starcraft.” “It was [too] simple and imbalanced.”)

Now, the stage was set. Realizing he had stumbled upon something with almost unlimited potential, Hwang Hyung Jun put together a full-fledged national Starcraft league in almost no time at all. From the bottom rungs at the level of the local PC bangs, players could climb the ladder all the way to the ultimate showcase, the “Tooniverse Starleague” final, in which five matches were used to determine the best Starcraft player of them all. Surprisingly, when the final was held for the first time in 2000, that player turned out to be a Canadian, a fellow named Guillaume Patry who had arrived in South Korea just the year before.

No matter; the tournament put up ratings that dwarfed those of Tooniverse’s usual programming. Hwang promptly started his own television channel. Called OnGameNet, it was the first in the world to be dedicated solely to videogames and esports. The Starcraft players who were featured on the channel became national celebrities, as did the sportscasters and color commentators: Jung Il Hoon, who looked like a professor and spoke in the stentorian tones of a newscaster; Jeon Yong Jun, whom words sometimes failed when things got really exciting, yielding to wild water-buffalo bellowing; Jung Sorin, a rare woman on the scene, a kindly and nurturing “gamer mom.” Their various shticks may have been calculated, but they helped to make the matches come alive even for viewers who had never played Starcraft for themselves.

A watershed was reached in 2002, when 20,000 screaming fans packed into a Seoul arena to witness that year’s final. The contrast with just a few years before, when a pair of players had dueled on a cheap closed set for the sake of mid-afternoon programming on a third-tier television station, could hardly have been more pronounced. Before this match, a popular rock band known as Cherry Filter put on a concert. Then, accepting their unwonted opening-act status with good grace, the rock stars sat down to watch the showdown between Lim Yo Hwan and Park Jung Seok on the arena’s giant projection screens, just like everyone else in the place. Park, who was widely considered the underdog, wound up winning three matches to one. Even more remarkably, he did so while playing as the Protoss, the least successful of the three factions in professional competitions prior to this point.

Losing the 2002 final didn’t derail Lim Yo Hwan’s career. He went on to become arguably the most successful Starcraft player in history. He was definitely the most popular during the game’s golden age in South Korea. His 2005 memoir, advising those who wanted to follow in his footsteps to “practice relentlessly” and nodding repeatedly to his sponsors — he wrote of opening his first “Shinhan Bank account” as a home for his first winnings — became a bestseller.

Everything was in flux; new tactics and techniques were coming thick and fast, as South Korean players pushed themselves to superhuman heights, the likes of which even the best players at Blizzard could scarcely have imagined. By now, they were regularly performing 250 separate actions per minute in the game.

The scene was rapidly professionalizing in all respects. Big-name corporations rushed in to sponsor individual players and, increasingly, teams, who lived together in clubhouses, neglecting education and all of the usual pleasures of youth in favor of training together for hours on end. The very best Starcraft players were soon earning hundreds of thousands of dollars per year from prize money and their sponsorship deals.

Baseball had long been South Korea’s most popular professional sport. In 2004, 30,000 people attended the baseball final in Seoul. Simultaneously, 100,000 people were packing a stadium in Busan, the country’s second largest city, for the OnGameNet Starcraft final. Judged on metrics like this one, Starcraft had a legitimate claim to the title of most popular sport of all in South Korea. The matches themselves just kept getting more intense; some of the best players were now approaching 500 actions per minute. Maintaining a pace like that required extraordinary reflexes and mental and physical stamina — reflexes and stamina which, needless to say, are strictly the purview of the young. Indeed, the average professional Starcraft player was considered washed up even younger than the average soccer player. Women weren’t even allowed to compete, out of the assumption that they couldn’t possibly be up to the demands of the sport. (They were eventually given a league of their own, although it attracted barely a fraction of the interest of the male leagues — sadly, another thing that Starcraft has in common with most other professional sports.)

Ten years after Starcraft’s original release as just another boxed computer game, it was more popular than ever in South Korea. The PC bangs had by now fallen in numbers and importance, in reverse tandem with the rise in the number of South Korean households with computers and broadband connections of their own. Yet esports hadn’t missed a beat during this transition. Millions of boys and young men still practiced Starcraft obsessively in the hopes of going pro. They just did it from the privacy of their bedrooms instead of from an Internet café.

Starcraft fandom in South Korea grew up alongside the music movement known as K-pop, and shares many attributes with it. Just as K-pop impresarios absorbed lessons from Western boy bands, then repurposed them into something vibrantly and distinctly South Korean, the country’s Starcraft moguls made the game their own; relatively few international tournaments were held, simply because nobody had much chance of beating the top South Korean players. There was an almost manic quality to both K-pop and the professional Starcraft leagues, twin obsessions of a country to which the idea of a disposable income and the consumerism it enables were still fairly new. South Korea’s geographical and geopolitical positions were precarious, perched there on the doorstep of giant China alongside its own intransigent and bellicose mirror image, a totalitarian state hellbent on acquiring nuclear weapons. A mushroom cloud over Seoul suddenly ending the party remained — and remains — a very real prospect for everyone in the country, giving ample reason to live for today. Rather than the decadent hedonism that marked, say, Cold War Berlin, South Korea turned to a pop culture of giddy, madding innocence for relief.

A 2010 match in the Korean Air Headquarters Hangar in Gimpo.

Alas, though, it seems that all forms of sport must eventually pass through a loss of innocence. Starcraft’s equivalent of the 1919 Major League Baseball scandal started with Ma Jae-yoon, a former superstar who by 2010 was struggling to keep up with the ever more demanding standard of play. Investigating persistent rumors that Ma was taking money to throw some of his matches, the South Korean Supreme Prosecutors’ Office found that they were truer than anyone had dared to speculate. Ma stood at the head of a conspiracy with as many tendrils as a Zerg, involving the South Korean mafia and at least a dozen other players. The scandal was front-page news in the country for months. Ma ended up going to prison for a year and being banned for life from South Korean esports. (“Say it ain’t so, Ma!”) His crimes cast a long shadow over the Starcraft scene; a number of big-name sponsors pulled out completely.

The same year as the match-fixing scandal, Blizzard belatedly released Starcraft II: Wings of Liberty. Yet another massive worldwide hit for its parent company, the sequel proved a mixed blessing for South Korean esports. The original Starcraft had burrowed its way deep into the existing players’ consciousnesses; every tiny quirk in the code that Blizzard had written so many years earlier had been dissected, internalized, and exploited. Many found the prospect of starting over from scratch deeply unappealing; perhaps there is space in a lifetime to learn only one game as deeply as millions of South Korean players had learned the first Starcraft. Some put on a brave face and tried to jump over to the sequel, but it was never quite the same. Others swore that they would stop playing the original only when someone pried it out of their cold, dead hands — but that wasn’t the same either. A third, disconcertingly large group decided to move on to some other game entirely, or just to move on with life. By 2015, South Korean Starcraft was a ghost of its old self.

Which isn’t to say that esports as a whole faded away in the country. Rather than Starcraft II, a game called League of Legends became the original Starcraft’s most direct successor in South Korea, capable of filling stadiums with comparable numbers of screaming fans. (As a member of a newer breed known as “multiplayer online battle arena” (MOBA) games, League of Legends is similar to Starcraft in some ways, but very different in others; each player controls only a single unit instead of amassing armies of them.) Meanwhile esports, like K-pop, were radiating out from Asia to become a fixture of global youth culture. The 2017 international finals of League of Legends attracted 58 million viewers all over the world; the Major League Baseball playoffs that year managed just 38 million, the National Basketball Association finals only 32 million. Esports are big business. And with annual growth rates in the double digits in percentage terms, they show every sign of continuing to get bigger and bigger for years to come.

How we feel about all of this is, I fear, dictated to a large extent by the generation to which we happen to belong. (Hasn’t that always been the way with youth culture?) Being a middle-aged man who grew up with digital games but not with gaming as a spectator sport, my own knee-jerk reaction vacillates between amusement and consternation. My first real exposure to esports came not that many years ago, via an under-sung little documentary film called State of Play, which chronicles the South Korean Starcraft scene, fly-on-the-wall style, just as its salad days are coming to an end. Having just re-watched the film before writing this piece, I still find much of it vaguely horrifying: the starry-eyed boys who play Starcraft ten to fourteen hours per day; the coterie of adult moguls and handlers who are clearly making a lot of money by… well, it’s hard for me not to use the words “exploiting them” here. At one point, a tousle-headed boy looks into the camera and says, “We don’t really play for fun anymore. Mostly I play for work. My work just happens to be a game.” That breaks my heart every time. Certainly this isn’t a road that I would particularly like to see any youngster I care about go down. A happy, satisfying life, I’ve long believed, is best built out of a diversity of experiences and interests. Gaming can be one of these, as rewarding as any of the rest, but there’s no reason it should fill more than a couple of hours of anyone’s typical day.

On the other hand, these same objections perchance apply equally to sports of the more conventionally athletic kind. Those sports’ saving grace may be that it’s physically impossible to train at most of them for ten to fourteen hours at a stretch. Or maybe it has something to do with their being intrinsically healthy activities when pursued in moderation, or with the spiritual frisson that can come from being out on the field with grass underfoot and sun overhead, with heart and lungs and limbs all pumping in tandem as they should. Just as likely, though, I’m merely another old man yelling at clouds. The fact is that a diversity of interests is usually not compatible with ultra-high achievement in any area of endeavor.

Anyway, setting the Wayback Machine to 1998 once again, I can at least say definitively that gaming stood on the verge of exploding in unanticipated, almost unimaginable directions at that date. Was Starcraft the instigator of some of that, or was it the happy beneficiary? Doubtless a little bit of both. Blizzard did have a way of always being where the action was…





Sources: The books Stay Awhile and Listen, Book II by David L. Craddock and Demystifying Esports by Baro Hyun; Computer Gaming World of May 1997, September 1997, and July 1998; Retro Gamer 170; International Journal of Communication 14; the documentary film State of Play.

Online sources include Soren Johnson’s interview of Rob Pardo for his Designer’s Notes podcast, “Behind the Scenes of Starcraft’s Earliest Days” by Kat Bailey at VG247, and “A Brief History of Starcraft” at TL.net.

Starcraft and the Brood War expansion are now available for free at Blizzard’s website.

 


The Rise of POMG, Part 1: It Takes a Village…

No one on their deathbed ever said, “I wish I had spent more time alone with my computer!”

— Dani Bunten Berry

If you ever want to feel old, just talk to the younger generation.

A few years ago now, I met the kids of a good friend of mine for the very first time: four boys between the ages of four and twelve, all more or less crazy about videogames. As someone who spends a lot of his time and earns a lot of his income writing about games, I arrived at their house with high expectations attached.

Alas, I’m afraid I proved a bit of a disappointment to them. The distance between the musty old games that I knew and the shiny modern ones that they played was just too far to bridge; shared frames of reference were tough to come up with. This was more or less what I had anticipated, given how painfully limited I already knew my knowledge of modern gaming to be. But one thing did genuinely surprise me: it was tough for these youngsters to wrap their heads around the very notion of a game that you played to completion by yourself and then put on the shelf, much as you might a book. The games they knew, from Roblox to Fortnite, were all social affairs that you played online with friends or strangers, that ended only when you got sick of them or your peer group moved on to something else. Games that you played alone, without at the very least leader boards and achievements on-hand to measure yourself against others, were utterly alien to them. It was quite a reality check for me.

So, I immediately started to wonder how we had gotten to this point — a point not necessarily better or worse than the sort of gaming that I knew growing up and am still most comfortable with, just very different. This series of articles should serve as the beginning of an answer to that complicated question. Its primary focus is not so much how computer games went multiplayer, nor even how they first went online; those things are in some ways the easy, obvious parts of the equation. It’s rather how games did those things persistently — i.e., permanently, so that each session became part of a larger meta-game, if you will, embedded in a virtual community. Or perhaps the virtual community is embedded in the game. It all depends on how you look at it, and which precise game you happen to be talking about. Whichever way, it has left folks like me, whose natural tendency is still to read games like books with distinct beginnings, middles, and ends, anachronistic iconoclasts in the eyes of the youthful mainstream.

Which, I hasten to add, is perfectly okay; I’ve always found the ditch more fun than the middle of the road anyway. Still, sometimes it’s good to know how the other 90 percent lives, especially if you claim to be a gaming historian…



“Persistent online multiplayer gaming” (POMG, shall we say?) is a mouthful to be sure, but it will have to do for lack of a better descriptor of the phenomenon that has created such a divide between myself and my friend’s children.  It’s actually older than you might expect, having first come to be in the 1970s on PLATO, a non-profit computer network run out of the University of Illinois but encompassing several other American educational institutions as well. Much has been written about this pioneering network, which uncannily presaged in so many of its particulars what the Internet would become for the world writ large two decades later. (I recommend Brian Dear’s The Friendly Orange Glow for a book-length treatment.) It should suffice for our purposes today to say that PLATO became host to, among other online communities of interest, an extraordinarily vibrant gaming culture. Thanks to the fact that PLATO games lived on a multi-user network rather than standalone single-user personal computers, they could do stuff that most gamers who were not lucky enough to be affiliated with a PLATO-connected university would have to wait many more years to experience.

The first recognizable single-player CRPGs were born on PLATO in the mid-1970s, inspired by the revolutionary new tabletop game known as Dungeons & Dragons. They were followed by the first multiplayer ones in amazingly short order. Already in 1975’s Moria (a completely different game from the 1983 single-player roguelike that bore the same name), players met up with their peers online to chat, brag, and sell or trade loot to one another. When they were ready to venture forth to kill monsters, they could do so in groups of up to ten, pooling their resources and sharing the rewards. A slightly later PLATO game called Oubliette implemented the same basic concept in an even more sophisticated way. The degree of persistence of these games was limited by a lack of storage capacity — the only data that was saved between sessions were the statistics and inventory of each player’s character, with the rest of the environment being generated randomly each time out — but they were miles ahead of anything available for the early personal computers that were beginning to appear at the same time. Indeed, Wizardry, the game that cemented the CRPG’s status as a staple genre on personal computers in 1981, was in many ways simply a scaled-down version of Oubliette, with the multiplayer party replaced by a party of characters that were all controlled by the same player.

Chester Bolingbroke, better known online as The CRPG Addict, plays Moria. Note the “Group Members” field at bottom right. Chester is alone here, but he could be adventuring with up to nine others.

A more comprehensive sort of persistence arrived with the first Multi-User Dungeon (MUD), developed by Roy Trubshaw and Richard Bartle, two students at the University of Essex in Britain, and first deployed there in a nascent form in late 1978 or 1979. A MUD borrowed the text-only interface and presentation of Will Crowther and Don Woods’s seminal game of Adventure, but the world it presented was a shared, fully persistent one between its periodic resets to a virgin state, chockablock with other real humans to interact with and perhaps fight. “The Land,” as Bartle dubbed his game’s environs, expanded to more than 600 rooms by the early 1980s, even as its ideas and a good portion of its code were used to set up other, similar environments at many more universities.

In the meanwhile, the first commercial online services were starting up in the United States. By 1984, you could, for the price of a substantial hourly fee, dial into the big mainframes of services like CompuServe using your home computer. Once logged in there, you could socialize, shop, bank, make travel reservations, read newspapers, and do much else that most people wouldn’t begin to do online until more than a decade later — including gaming. For example, CompuServe offered MegaWars, a persistent grand-strategy game of galactic conquest whose campaigns took groups of up to 100 players four to six weeks to complete. (Woe betide the ones who couldn’t log in for some reason of an evening in the midst of that marathon!) You could also find various MUDs, as well as Island of Kesmai, a multiplayer CRPG boasting most of the same features as PLATO’s Oubliette in a genuinely persistent world rather than a perpetually regenerated one. CompuServe’s competitor GEnie had Air Warrior, a multiplayer flight simulator with bitmapped 3D graphics and sound effects to rival any of the contemporaneous single-player simulators on personal computers. For the price of $11 per hour, you could participate in grand Air Warrior campaigns that lasted three weeks each and involved hundreds of other subscribers, organizing and flying bombing raids and defending against the enemy’s attacks on their own lines. In 1991, America Online put up Neverwinter Nights (not the same game as the 2002 BioWare CRPG of the same name), which did for the “Gold Box” line of licensed Dungeons & Dragons CRPGs what MUD had done for Adventure and Air Warrior had done for flight simulators, transporting the single-player game into a persistent multiplayer space.

All of this stuff was more or less incredible in the context of the times. At the same time, though, we mustn’t forget that it was strictly the purview of a privileged elite, made up of those with login credentials for institutional-computing networks or money in their pockets to pay fairly exorbitant hourly fees to feed their gaming habits. So, I’d like to back up now and tell a different story of POMG — one with more of a populist thrust, focusing on what was actually attainable by the majority of people out there, the ones who neither had access to a university’s mainframe nor could afford to spend hundreds of dollars per month on a hobby. Rest assured that the two narratives will meet before all is said and done.



POMG came to everyday digital gaming in the reverse order of the words that make up the acronym: first games were multiplayer, then they went online, and then these online games became persistent. Let’s try to unpack how that happened.

From the very start, many digital games were multiplayer, optionally if not unavoidably so. Spacewar!, the program generally considered the first fully developed graphical videogame, was exclusively multiplayer from its inception in the early 1960s. Ditto Pong, the game that launched Atari a decade later, and with it a slow-building popular craze for electronic games, first in public arcades and later in living rooms. Multiplayer here was down not so much to design intention as to technological affordances. Pong was an elaborate analog state machine rather than a full-blown digital computer, relying on discrete resistors and potentiometers and the like to do its “thinking.” It was more than hard enough just to get a couple of paddles and a ball moving around on the screen of a gadget like this; a computerized opponent was a bridge too far.

Very quickly, however, programmable microprocessors entered the field, changing everyone’s cost-benefit analyses. Building dual controls into an arcade cabinet was expensive, and the end result tended to take up a lot of space. The designers of arcade classics like Asteroids and Galaxian soon realized that they could replace the complications of a human opponent with hordes of computer-controlled enemies, flying in rudimentary, partially randomized patterns. Bulky multiplayer machines thus became rarer and rarer in arcades, replaced by slimmer, more standardized single-player cabinets. After all, if you wanted to compete with your friends in such games, there was still a way to do so: you could each play a round against the computerized enemies and compare your scores afterward.

While all of this was taking shape, the Trinity of 1977 — the Radio Shack TRS-80, Apple II, and Commodore PET — had ushered in the personal-computing era. The games these early microcomputers played were sometimes ports or clones of popular arcade hits, but just as often they were more cerebral, conceptually ambitious affairs where reflexes didn’t play as big — or any — role: flight simulations, adventure games, war and other strategy games. The last were often designed to be played optimally or even exclusively against another human, largely for the same reason Pong had been made that way: artificial intelligence was a hard thing to implement under any circumstances on an 8-bit computer with as little as 16 K of memory, and it only got harder when you were asking said artificial intelligence to formulate a strategy for Operation Barbarossa rather than to move a tennis racket around in front of a bouncing ball. Many strategy-game designers in these early days saw multiplayer options almost as a necessary evil, a stopgap until the computer could fully replace the human player, thus alleviating that eternal problem of the war-gaming hobby on the tabletop: the difficulty of finding other people in one’s neighborhood who were able and willing to play such weighty, complex games.

At least one designer, however, saw multiplayer as a positive advantage rather than a kludge — in fact, as the way the games of the future by all rights ought to be. “When I was a kid, the only times my family spent together that weren’t totally dysfunctional were when we were playing games,” remembered Dani Bunten Berry. From the beginning of her design career in 1979, when she made an auction game called Wheeler Dealers for the Apple II, multiplayer was her priority. (Wheeler Dealers and all of her other games that are mentioned in this article were credited to Dan Bunten, the name under which she lived until 1992.) In fact, she was willing to go to extreme lengths to make it possible; in addition to a cassette tape containing the software, Wheeler Dealers shipped with a custom-made hardware add-on, the only method she could come up with to let four players bid at once. Such experiments culminated in M.U.L.E., one of the first four games ever published by Electronic Arts, a deeply, determinedly social game of economics and, yes, auctions for Atari and Commodore personal computers that many people, myself included, still consider her unimpeachable masterpiece.

A M.U.L.E. auction in progress.

And yet it was Seven Cities of Gold, her second game for Electronic Arts, that became a big hit. Ironically, it was also the first she had ever made with no multiplayer option whatsoever. She was learning to her chagrin that games meant to be played together on a single personal computer were a hard sell; such machines were typically found in offices and bedrooms, places where people went to isolate themselves, not in living rooms or other spaces where they went to be together. She decided to try another tack, thereby injecting the “online” part of POMG into our discussion.

In 1988, Electronic Arts published Berry’s Modem Wars, a game that seems almost eerily prescient in retrospect, anticipating the ludic zeitgeist of more than a decade later with remarkable accuracy. It was a strategy game played in real time (although not quite a real-time strategy of the resource-gathering and army-building stripe that would later be invented by Dune II and popularized by Warcraft and Command & Conquer). And it was intended to be played online against another human sitting at another computer, connected to yours by the gossamer thread of a peer-to-peer modem hookup over an ordinary telephone line. Like most of Berry’s games, it didn’t sell all that well, being a little too far out in front of the state of her nation’s telecommunications infrastructure.

Nevertheless, she continued to push her agenda of computer games as ways of being entertained together rather than alone over the years that followed. She never did achieve the breakout hit she craved, but she inspired countless other designers with her passion. She died far too young in 1998, just as the world was on the cusp of embracing her vision on a scale that even she could scarcely have imagined. “It is no exaggeration to characterize her as the world’s foremost authority on multiplayer computer games,” said Brian Moriarty when he presented Dani Bunten Berry with the first ever Game Developers Conference Lifetime Achievement Award two months before her death. “Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes [representing her eleven published games] the prototypes of the defining art form of the 21st century.” Let this article and the ones that will follow it, written well into said century, serve as partial proof of the truth of his words.

Danielle Bunten Berry, 1949-1998.

For by the time Moriarty spoke them, other designers had been following the trails she had blazed for quite some time, often with much more commercial success. A good early example is Populous, Peter Molyneux’s strategy game in real time (although, again, not quite a real-time strategy) that was for most of its development cycle strictly a peer-to-peer online multiplayer game, its offline single-player mode being added only during the last few months. An even better, slightly later one is DOOM, John Carmack and John Romero’s game of first-person 3D mayhem, whose star attraction, even more so than its sadistic single-player levels, was the “deathmatch” over a local-area network. Granted, these testosterone-fueled, relentlessly zero-sum contests weren’t quite the same as what Berry was envisioning for gaming’s multiplayer future near the end of her life; she wished passionately for games with a “people orientation,” directed toward “the more mainstream, casual players who are currently coming into the PC market.” Still, as the saying goes, you have to start somewhere.

But there is once more a caveat to state here about access, or rather the lack thereof. Being built for local networks only — i.e., networks that lived entirely within a single building or at most a small complex of them — DOOM deathmatches were out of reach on a day-to-day basis for those who didn’t happen to be students or employees at institutions with well-developed data-processing departments and permissive or oblivious authority figures. Outside of those ivory towers, this was the era of the “LAN party,” when groups of gamers would all lug their computers over to someone’s house, wire them together, and go at it over the course of a day or a weekend. These occasions went on to become treasured memories for many of their participants, but they achieved that status precisely because they were so sporadic and therefore special.

And yet DOOM‘s rise corresponded with the transformation of the Internet from an esoteric tool for the technological elite to the most flexible medium of communication ever placed at the disposal of the great unwashed, thanks to a little invention out of Switzerland called the World Wide Web. What if there was a way to move DOOM and other games like it from a local network onto this one, the mother of all wide-area networks? Instead of deathmatching only with your buddy in the next cubicle, you would be able to play against somebody on another continent if you liked. Now wouldn’t that be cool?

The problem was that local-area networks ran over a protocol known as IPX, while the Internet ran on a completely different one called TCP/IP. Whoever could bridge that gap in a reasonably reliable, user-friendly way stood to become a hero to gamers all over the world.



Jay Cotton discovered DOOM in the same way as many another data-processing professional: when it brought down his network. He was employed at the University of Georgia at the time, and was assigned to figure out why the university’s network kept buckling under unprecedented amounts of spurious traffic. He tracked the cause down to DOOM, the game that half the students on campus seemed to be playing more than half the time. More specifically, the problem was caused by a bug, which was patched out of existence by John Carmack as soon as he was informed. Problem solved. But Cotton stuck around to play, the warden seduced by the inmates of the asylum.

He was soon so much better at the game than anyone else on campus that he was getting a bit bored. Looking for worthier opponents, he stumbled across a program called TCPSetup, written by one Jake Page, which was designed to translate IPX packets into TCP/IP ones and vice versa on the fly, “tricking” DOOM into communicating across the vast Internet. It was cumbersome to use and extremely unreliable, but on a good day it would let you play DOOM over the Internet for brief periods of time at least, an amazing feat by any standard. Cotton would meet other players on an Internet chat channel dedicated to the game, they’d exchange IP addresses, and then they’d have at it — or try to, depending on the whims of the Technology Gods that day.
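To make the trick a little more concrete, here is a minimal conceptual sketch of what an IPX-to-TCP/IP tunnel of this kind does, written in modern Python for clarity rather than the DOS-era code of the real tools. The `read_ipx_frame` and `write_ipx_frame` functions are hypothetical stand-ins for the low-level network-driver plumbing that TCPSetup actually provided; only the UDP socket work here is standard.

```python
import socket

# The other player's address. In TCPSetup's day, you exchanged IP
# addresses by hand in a chat channel before starting a match.
PEER = ("203.0.113.7", 2213)   # example values only
LOCAL_PORT = 2213

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", LOCAL_PORT))
sock.settimeout(0.01)          # poll without stalling the game

def read_ipx_frame():
    # Hypothetical capture hook, standing in for the driver shim that
    # grabbed DOOM's outgoing IPX frames. Returns raw bytes or None.
    return None

def write_ipx_frame(data):
    # Hypothetical injection hook: replay a received frame locally, so
    # the game believes it arrived over an ordinary LAN.
    pass

def pump_once():
    # Outbound: wrap any pending local IPX frame in a UDP datagram.
    frame = read_ipx_frame()
    if frame:
        sock.sendto(frame, PEER)
    # Inbound: unwrap a datagram from the peer and hand it to the game.
    try:
        data, _addr = sock.recvfrom(2048)
        write_ipx_frame(data)
    except socket.timeout:
        pass
```

The essential insight is that the game itself never has to know anything about the Internet; the tunnel simply masquerades as the local network, ferrying frames across the wide-area link fast enough — on a good day — that DOOM is none the wiser.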

On August 22, 1994, Cotton received an email from a fellow out of the University of Illinois — yes, PLATO’s old home — whom he’d met and played in this way (and beaten, he was always careful to add). His name was Scott Coleman. “I have some ideas for hacking TCPSetup to make it a little easier. Care to do some testing later?” Coleman wrote. “I’ve already emailed Jake [Page] on this, but he hasn’t responded (might be on vacation or something). If he approves, I’m hoping some of these ideas might make it into the next release of TCPSetup. In the meantime, I want to do some experimenting to see what’s feasible.”

Jake Page never did respond to their queries, so Cotton and Coleman just kept beavering away on their own, eventually rewriting TCPSetup entirely to create iDOOM, a more reliable and far less fiddly implementation of the same concept, with support for three- or four-player deathmatches instead of just one-on-one duels. It took off like a rocket; the pair were bombarded with feature requests, most notably to make iDOOM work with other IPX-only games as well. In January of 1995, they added support for Heretic, one of the most popular of the first wave of so-called “DOOM clones.” They changed their program’s name to “iFrag” to reflect the fact that it was now about more than just DOOM.

Having come this far, Cotton and Coleman soon made the conceptual leap that would transform their software from a useful tool to a way of life for a time for many, many thousands of gamers. Why not add support for more games, they asked themselves, not in a bespoke way as they had been doing to date, but in a more sustainable one, by turning their program into a general-purpose IPX-to-TCP/IP bridge, suitable for use with the dozens of other multiplayer games out there that supported only local-area networks out of the box. And why not make their tool into a community while they were at it, by adding an integrated chat service? In addition to its other functions, the program could offer a list of “servers” hosting games, which you could join at the click of a button; no more trolling for opponents elsewhere on the Internet, then laboriously exchanging IP addresses and meeting times and hoping the other guy followed through. This would be instant-gratification online gaming. It would also provide a foretaste at least of persistent online multiplayer gaming; as people won matches, they would become known commodities in the community, setting up a meta-game, a sporting culture of heroes and zeroes where folks kept track of win-loss records and where everybody clamored to hear the results when two big wheels faced off against one another.
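The architecture they were reaching for is easy to sketch in outline. Assume — purely for illustration, since this invents a toy wire format rather than reproducing Kali’s actual one — a tiny line-based lobby protocol: hosts announce their games to a central hub, and players ask for the current list before pointing their IPX tunnel at whichever address they pick.

```python
import socketserver

# In-memory registry of advertised games: name -> host IP address.
GAMES = {}

class LobbyHandler(socketserver.StreamRequestHandler):
    # One short exchange per connection, with two commands:
    #   HOST <game name>  -- advertise a game at the caller's address
    #   LIST              -- return every advertised game, one per line
    def handle(self):
        line = self.rfile.readline().decode("ascii", "replace").strip()
        if line.startswith("HOST "):
            GAMES[line[5:]] = self.client_address[0]
            self.wfile.write(b"OK\n")
        elif line == "LIST":
            for name, addr in GAMES.items():
                self.wfile.write(f"{name} {addr}\n".encode("ascii"))

if __name__ == "__main__":
    # A client would poll LIST, display the results as a clickable menu,
    # and launch the IPX bridge aimed at the chosen host -- the
    # instant-gratification online gaming described above. Port 2213
    # is an arbitrary choice for this sketch.
    socketserver.TCPServer(("", 2213), LobbyHandler).serve_forever()
```

Everything that made Kali a community — the chat, the win-loss records, the known commodities — could hang off a hub of roughly this shape.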

Cotton and Coleman renamed their software for the third time in less than nine months, calling it Kali, a name suggested by Coleman’s Indian-American girlfriend (later his wife). “The Kali avatar is usually depicted with swords in her hands and a necklace of skulls from those she has killed,” says Coleman, “which seemed appropriate for a deathmatch game.” Largely at the behest of Cotton, always the more commercially-minded of the pair, they decided to make Kali shareware, just like DOOM itself: multiplayer sessions would be limited to fifteen minutes at a time until you coughed up a $20 registration fee. Cotton went through the logistics of setting up and running a business in Georgia while Coleman did most of the coding in Illinois. (Rather astonishingly, Cotton and Coleman had still never met one another face to face in 2013, when gaming historian David L. Craddock conducted an interview with them that has been an invaluable source of quotes and information for this article.)

Kali certainly wasn’t the only solution in this space; a commercial service called DWANGO had existed since December of 1994, with the direct backing of John Carmack and John Romero, whose company id Software collected 20 percent of its revenue in return for the endorsement. But DWANGO ran over old-fashioned direct-dial-up connections rather than the Internet, meaning you had to pay long-distance charges to use it if you weren’t lucky enough to live close to one of its host computers. On top of that, it charged $9 for just five hours of access per month, with the fees escalating from there. Kali, by contrast, was available to you forever for as many hours per month as you liked after you plunked down your one-time fee of $20.

So, Kali was popular right from its first release on April 26, 1995. Yet it was still an awkward piece of software for the casual user despite the duo’s best efforts, being tied to MS-DOS, whose support for TCP/IP relied on a creaky edifice of third-party tools. The arrival of Windows 95 was a godsend for Kali, as it was for computer gaming in general, making the hobby accessible in a way it had never been before. The so-called “Kali95” was available by early 1996, and things exploded from there. Kali struck countless gamers with all the force of a revelation; who would have dreamed that it could be so easy to play against another human online? Loyd Case, for example, wrote in Computer Gaming World magazine that using Kali for the first time was “one of the most profound gaming experiences I’ve had in a long time.” Reminiscing seventeen years later, David L. Craddock described how “using Kali for the first time was like magic. Jumping into a game and playing with other people. It blew my fourteen-year-old mind.” In late 1996, the number of registered Kali users ticked past 50,000, even as quite possibly just as many or more were playing with cracked versions that bypassed the simplistic serial-number-registration process. First-person-shooter deathmatches abounded, but you could also play real-time strategies like Command & Conquer and Warcraft, or even the Links golf simulation. Computer Gaming World gave Kali a special year-end award for “Online-Enabling Technology.”

Kali for Windows 95.

Competitors were rushing in at a breakneck pace by this time, some of them far more conventionally “professional” than Kali, whose origin story was, as we’ve seen, as underground and organic as that of DOOM itself. The most prominent of the venture-capital-funded startups were MPlayer (co-founded by Brian Moriarty of Infocom and LucasArts fame, and employing Dani Bunten Berry as a consultant during the last months of her life) and the Total Entertainment Network, better known as simply TEN. In contrast to Kali’s one-time fee, they, like DWANGO before them, relied on subscription billing: $20 per month for MPlayer, $15 per month for TEN. Despite slick advertising and countless other advantages that Kali lacked, neither would ever come close to overtaking its scruffy older rival, which had price as well as oodles of grass-roots goodwill on its side. Jay Cotton:

It was always my belief that Kali would continue to be successful as long as I never got greedy. I wanted everyone to be so happy with their purchase that they would never hesitate to recommend it to a friend. [I would] never charge more than someone would be readily willing to pay. It also became a selling point that Kali only charged a one-time fee, with free upgrades forever. People really liked this, and it prevented newcomers (TEN, Heat [a service launched in 1997 by Sega of America], MPlayer, etc.) from being able to charge enough to pay for their expensive overheads.

Kali was able to compete with TEN, MPlayer, and Heat because it already had a large established user base (more users equals more fun) and because it was much, much cheaper. These new services wanted to charge a subscription fee, but didn’t provide enough added benefit to justify the added expense.

It was a heady rush indeed, although it would also prove a short-lived one; Kali’s competitors would all be out of business within a year or so of the turn of the millennium. Kali itself stuck around after that, but as a shadow of what it had been, strictly a place for old-timers to reminisce and play the old hits. “I keep it running just out of habit,” said Jay Cotton in 2013. “I make just enough money on website ads to pay for the server.” It still exists today, presumably as a result of the same force of habit.

One half of what Kali and its peers offered was all too obviously ephemeral from the start: as the Internet went mainstream, developers inevitably began building TCP/IP support right into their games, eliminating the need for an external IPX-to-TCP/IP bridge. (For example, Quake, id Software’s much-anticipated follow-up to DOOM, did just this when it finally arrived in 1996.) But the other half of what they offered was community, which may have seemed a more durable sort of benefit. As it happened, though, one clever studio did an end-run around them here as well.



The folks at Blizzard Entertainment, the small studio and publisher that was fast coming to rival id Software for the title of the hottest name in gaming, were enthusiastic supporters of Kali in the beginning, to the point of hand-tweaking Warcraft II, their mega-hit real-time strategy, to run optimally over the service. They were rewarded by seeing it surpass even DOOM to become the most popular game there of all. But as they were polishing their new action-CRPG Diablo for release in 1996, Mike O’Brien, a Blizzard programmer, suggested that they launch their own service that would do everything Kali did in terms of community, albeit for Blizzard’s games alone. And then he additionally suggested that they make it free, gambling that knowledge of its existence would sell enough games for them at retail to offset its maintenance costs. Blizzard’s unofficial motto had long been “Let’s be awesome,” reflecting their determination to sell exactly the games that real hardcore gamers were craving, honed to a perfect finish, and to always give them that little bit extra. What better way to be awesome than by letting their customers effortlessly play and socialize online, and to do so for free?

The idea was given an extra dollop of urgency by the fact that Westwood Studios, the maker of Warcraft’s chief competitor Command & Conquer, had introduced a service called Westwood Chat that could launch people directly into a licensed version of Monopoly. (Shades of Dani Bunten Berry’s cherished childhood memories…) At the moment it supported only Monopoly, a title that appealed to a very different demographic from the hardcore crowd who favored Blizzard’s games, but who knew how long that would last?[4]

So, when Diablo shipped in the last week of 1996, it included something called Battle.net, a one-click chat, matchmaking, and game-hosting service. Battle.net made everything easier than it had ever been before. It would even automatically patch your copy of the game to the latest version when you logged on, pioneering the “software as a service” model in gaming that has become everyday life in our current age of Steam. “It was so natural,” says Blizzard executive Max Schaefer. “You didn’t think about the fact that you were playing with a dude in Korea and a guy in Israel. It’s really a remarkable thing when you think about it. How often are people casually matched up in different parts of the world?” The answer to that question, of course, was “not very often” in the context of 1997. Today, it’s as normal as computers themselves, thanks to groundbreaking initiatives like this one. Blizzard programmer Jeff Strain:

We believed that in order for it [Battle.net] to really be embraced and adopted, that accessibility had to be there. The real catch for Battle.net was that it was inside-out rather than outside-in. You jumped right into the game. You connected players from within the game experience. You did not alt-tab off into a Web browser to set up your games and have the Web browser try to pass off information or something like that. It was a service designed from Day One to be built into actual games.
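The auto-patching mentioned above, checking the client’s version at log-on and updating it before play, is easy to sketch in outline. What follows is an illustration only, in Python with invented URLs; Battle.net’s real protocol was proprietary and binary, nothing like the HTTP used here.

```python
import urllib.request

CLIENT_VERSION = "1.00"
# Both URLs are hypothetical stand-ins for a patch server.
VERSION_URL = "https://example.com/diablo/latest-version.txt"
PATCH_URL = "https://example.com/diablo/patch-{v}.bin"

def log_on():
    latest = urllib.request.urlopen(VERSION_URL).read().decode().strip()
    if latest != CLIENT_VERSION:
        # Fetch the patch before the player is allowed into chat, so every
        # client in a given game is guaranteed to be on the same version.
        patch = urllib.request.urlopen(PATCH_URL.format(v=latest)).read()
        with open(f"patch-{latest}.bin", "wb") as f:
            f.write(patch)
        return f"update {CLIENT_VERSION} -> {latest} downloaded"
    return "up to date; entering chat"

# log_on() would run once per session, before the lobby ever appears.
```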

The combination of Diablo and Battle.net brought a new, more palpable sort of persistence to online gaming. Players of DOOM or Warcraft II might become known as hotshots on services like Kali, but their reputation conferred no tangible benefit once they entered a game session. A DOOM deathmatch or a Warcraft II battle was a one-and-done event, which everyone started on an equal footing, which everyone would exit again within an hour or so, with nothing but memories and perhaps bragging rights to show for what had transpired.

Diablo, however, was different. Although less narratively and systemically ambitious than many of its recent brethren, it was nevertheless a CRPG, a genre all about building up a character over many gaming sessions. Multiplayer Diablo retained this aspect: the first time you went online, you had to pick one of the three pre-made first-level characters to play, but after that you could keep bringing the same character back to session after session, with all of the skills and loot she had already collected. Suddenly the link between the real people in the chat rooms and their avatars that lived in the game proper was much more concrete. Many found it incredibly compelling. People started to assume the roles of their characters even when they were just hanging out in the chat rooms, started in some very real sense to live the game.

But it wasn’t all sunshine and roses. Battle.net became a breeding ground of the toxic behaviors that have continued to dog online gaming to this day, a social laboratory demonstrating what happens when you take a bunch of hyper-competitive, rambunctious young men and give them carte blanche to have at it any way they wish with virtual swords and spells. The service was soon awash with “griefers,” players who would join others on their adventures, ostensibly as their allies in the dungeon, then literally stab them in the back when they least expected it, killing their characters and running off with all of their hard-won loot. The experience could be downright traumatizing for the victims, who had thought they were joining up with friendly strangers simply to have fun together in a cool new game. “Going online and getting killed was so scarring,” acknowledges David Brevik, Diablo’s original creator. “Those players are still feeling a little bit apprehensive.”

To make matters worse, many of the griefers were also cheaters. Diablo had been born and bred a single-player game; multiplayer had been a very late addition. This had major ramifications. Diablo stored all the information about the character you played online on your local hard drive rather than the Battle.net server. Learn how to modify this file, and you could create a veritable god for yourself in about ten minutes, instead of the dozens of hours it would take playing the honest way. “Trainers” — programs that could automatically do the necessary hacking for you — spread like wildfire across the Internet. Other folks learned to hack the game’s executable files themselves. Most infamously, they figured out ways to attack other players while they were still in the game’s above-ground town, supposedly a safe space reserved for shopping and healing. Battle.net as a whole took on a siege mentality, as people who wanted to play honorably and honestly learned to lock the masses out with passwords that they exchanged only with trusted friends. This worked after a fashion, but it was also a betrayal of the core premise and advantage of Battle.net, the ability to find a quick pick-up game anytime you wanted one. Yet there was nothing Blizzard could do about it without rewriting the whole game from the ground up. They would eventually do this — but they would call the end result Diablo II. In the meanwhile, it was a case of player beware.
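The architectural flaw is simple to illustrate in the abstract. When the authoritative copy of a character lives in a file on the player’s own disk, a “trainer” needs to do nothing more exotic than rewrite a few bytes. The sketch below uses an invented record layout, not Diablo’s real save format.

```python
import struct

# An invented character record: name, level, strength, gold. Diablo's
# actual on-disk format differed, but the principle is the same.
FMT = "<16sBHI"

def write_character(path, name, level, strength, gold):
    with open(path, "wb") as f:
        f.write(struct.pack(FMT, name.encode().ljust(16, b"\0"),
                            level, strength, gold))

def god_mode(path):
    # What a "trainer" does: read the record, max out the fields that
    # no server ever verifies, and write it straight back.
    with open(path, "rb") as f:
        name, level, strength, gold = struct.unpack(FMT, f.read())
    write_character(path, name.rstrip(b"\0").decode(), 50, 65535, 4_000_000)

write_character("hero.sav", "Newbie", 1, 10, 100)
god_mode("hero.sav")  # ten seconds' work versus dozens of honest hours
```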

It’s important to understand that, however much it resembled later online worlds from a sociological perspective, multiplayer Diablo was still no more persistent than Moria and Oubliette had been on the old PLATO network: each player’s character was retained from session to session, but nothing about the state of the world. Each world, or instance of the game, could contain a maximum of four human players, and disappeared as soon as the last player left it, leaving as its legacy only the experience points and items its inhabitants had collected from it while it existed. Players could and did kill the demon Diablo, the sole goal of the single-player game, one that usually required ten hours or more of questing to achieve, over and over again in the online version. In this sense, multiplayer Diablo was a completely different game from single-player Diablo, replacing the simple quest narrative of the latter with a social meta-game of character-building and player-versus-player combat.

For lots and lots of people, this was lots and lots of fun; Diablo was hugely popular despite all of the exploits it permitted — indeed, for some players perchance, because of them. It became one of the biggest computer games of the 1990s, bringing online gaming to the masses in a way that even Kali had never managed. Yet there was still a ways to go to reach total persistence, to bring a permanent virtual world to life. Next time, then, we’ll see how mainstream commercial games of the 1990s sought to achieve a degree of persistence that the first MUD could boast of already in 1979. These latest virtual worlds, however, would attempt to do so with all the bells and whistles and audiovisual niceties that a new generation of gamers raised on multimedia and 3D graphics demanded. An old dog in the CRPG space was about to learn a new trick, creating in the process a new gaming acronym that’s even more of a mouthful than POMG.





Sources: the books Stay Awhile and Listen Book I and Book II by David L. Craddock, Masters of Doom by David Kushner, and The Friendly Orange Glow by Brian Dear; Retro Gamer 43, 90, and 103; Computer Gaming World of September 1996 and May 1997; Next Generation of March 1997. Online sources include “The Story of Battle.net” by Wes Fenlon at PC Gamer, Dan Griliopoulos’s collection of interviews about Command & Conquer, Brian Moriarty’s speech honoring Dani Bunten Berry from the 1998 Game Developers Conference, and Jay Cotton’s history of Kali on the DOOM II fan site. Plus some posts on The CRPG Addict, to which I’ve linked in the article proper.

Footnotes
1 The PLATO Moria was a completely different game from the 1983 single-player roguelike that bore the same name.
2 Not the same game as the 2002 Bioware CRPG of the same name.
3 Wheeler Dealers and all of her other games that are mentioned in this article were credited to Dan Bunten, the name under which she lived until 1992.
4 Westwood Chat would indeed evolve eventually into Westwood Online, with full support for Command & Conquer, but that would happen only after Blizzard had rolled out their own service.

Diablo

All of us had become disappointed with computer RPGs because they were going in the opposite direction of where we thought they should be going. They were becoming story- and stat-laden, really appealing to a super-small niche of super RPG geeks — which we were in a way, but that wasn’t really our style.

So, when [David] Brevik mentioned these roguelike games, it was kind of a natural. “Yeah, let’s take that cool, addictive structure and modernize it. Let’s strip away the stuff that’s turning off a lot of game fans from RPGs.”

— Max Schaefer of Blizzard North

A palpable sense of ennui dogged the Consumer Electronics Shows of 1994. The venerable semiannual expo where such landmark gaming hardware as the Atari VCS, the Commodore 64 and Amiga, the Nintendo Entertainment System and Super Nintendo Entertainment System, and the Sega Genesis had been seen for the first time seemed somehow past its sell-by date now. Attendance at the Summer CES in particular was down in a big way, so much so that the organizers would move the event out of its long-standing home in Chicago’s McCormick Place the following year and turn it into a traveling exhibition in the hope of drumming up some much-needed excitement. In the meantime, the makers of gaming software had an especially underwhelming time of it in Chicago that year: as usual, they were treated as second-class citizens by the organizers, relegated to the hall’s basement so that the choicer spaces could be kept free for cutting-edge toasters, refrigerators, and microwave ovens.

Among the games people having the worst time of all were the folks behind a tiny San Mateo, California, studio called Condor, Incorporated. David Brevik and his co-founders, the brothers Max and Erich Schaefer, were ostensibly at the show to demonstrate their very first finished original game, a Genesis title called Justice League: Task Force. But they knew the game was no great shakes. They had made exactly what their publisher, the financially troubled Japanese giant Sunsoft, had ordered them to make in rather pedantic detail: a blatant clone of yesteryear’s massive hit Street Fighter II, with DC Comics superheroes inserted in place of its inspiration’s pugilists. They felt it was competently executed, but knew as well as anyone that it was no more than a quickie placeholder product for a five-year-old console that was soon due to be superseded by the next-generation Sega Saturn.

Their ulterior motive for being at CES was something else entirely. Brevik had an idea for a computer game called Diablo, which he had been slowly expanding upon ever since he had lived with his family at the foot of the California mountain of that name back in the mid-1980s. Now, he felt its time had come; he desperately wanted to interest a publisher in it. But every executive he talked to at the show started shaking his head as soon as he saw the first line of the pitch document, stating that it was “a proposal for a role-playing game.” For CRPGs were dead and buried according to the industry’s conventional wisdom, having nothing to offer in an era when multimedia flash and 3D mayhem reigned supreme. They were quaint at best, deadly boring at worst, as their recent sales figures reflected.

Thoroughly disheartened by his proposal’s reception, Brevik duly turned up with the Schaefer brothers at the appointed time to show Justice League to the assembled press. And here they all got a shock. They learned only minutes before taking the stage that Sunsoft had actually arranged to make a second version of the game for the Super Nintendo, sending the same design brief to another little studio, Blizzard Entertainment of Costa Mesa, California. Both development teams could immediately see that the other had done a pretty solid, professional job with a less than inspiring project. Indeed, they were struck by how similar the two end results were to one another.

They soon learned that they had much more in common. Blizzard too had been founded on a shoestring by three games-obsessed kids just out of university, in this case by the names of Allen Adham, Mike Morhaime, and Frank Pearce. And they too had become all too familiar with workaday projects like Justice League, which they too saw as a way for their new, unproven studio to pay its dues on the way to bigger, better things to come. The big difference was that Blizzard was a few years older, and thus that much further along the road to becoming a marquee studio. They had recently been acquired by the educational-software giant Davidson & Associates, whose distributional pipeline they would be able to use to publish their own games under their own imprint. Now, they were hard at work finishing up the project that they hoped would change everything for them: a game for computers only called Warcraft. They took the Condor boys into a cramped back room and showed it to them. “I had no idea at that point that Warcraft would become an historically important game,” says Max Schaefer. “It just looked cool.” A relationship was forged. The Blizzard folks said they were just too busy to think about anything else just then, but they promised to listen to Condor’s pitch for Diablo once Warcraft was out the door.

They were true to their word. In January of 1995, with Warcraft on store shelves and selling well, everyone came together again in Blizzard’s conference room to talk about Diablo. No one in that room was unaware of the concerns that had caused publisher after publisher to walk away from the proposal; in fact, in many ways they shared them. CRPGs had glutted the market just a few years earlier, a bewildering procession of elves and dwarves and dragons. For the hardcore aficionados, all of the different games and series were (and still are) possessed of their own distinctive personalities and intricate subtleties, but it was hard for everybody else to keep Dungeons & Dragons separate from Dungeon Master, Might and Magic separate from The Magic Candle. I have a friend who likes to say that there are only two blues songs: “the fast one and the slow one.” Likewise, one might go so far as to say that for most gamers there were only two CRPGs, the first-person Wizardry style and the overhead Ultima style. As computers had gotten more capable, games of the former type had gotten ever more complex in terms of rules, while those of the latter type had threatened to collapse under the sheer weight of their lore and verbiage, which minuscule computer memories no longer restricted. Those sorts of things were not what the Condor guys were into at all. Sure, they had all played tabletop Dungeons & Dragons as kids, but world-building and storytelling hadn’t been their primary interest. “It was all about killing monsters and finding good stuff,” says Max Schaefer.

And so that was what Diablo was to be about as well. “As games today substitute gameplay with multimedia extravaganzas and strive toward needless scale and complexity,” read the pitch document, “we seek to reinvigorate the hack-and-slash, feel-good gaming audience. Emphasis will be on exploration, conflict, and character development.”

Diablo’s most direct influence by far was the roguelike games, which David Brevik had played for hundreds upon hundreds of hours while a student at university. From roguelikes it inherited its minimalist narrative — amounting to little more than “make it to the last level and kill the boss of bosses Diablo” — as well as randomized dungeons that would be new with every playthrough, along with the randomized “good stuff” they contained. Brevik’s favorite roguelike of all was Angband, which distinguished itself from the likes of the original Rogue and its spiritual successor NetHack by having a town to serve as the player’s base of operations for her expeditions into the nearby dungeon, resulting in a slightly more relaxed pacing and introducing an economic element. Diablo was to duplicate this structure exactly: “Forays into the dungeon will be broken up by trips to the town located above. In the town, a general store will provide standard equipment and repairs, and will also purchase extra equipment from the player. A temple will provide healing for injured and sick characters. Training and other facilities may also be available.”
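What randomized dungeons mean in practice is easy to illustrate. The toy generator below carves a different level out of solid rock on every run using a simple “drunkard’s walk”; it bears no resemblance to Blizzard’s actual algorithms, but the principle of a fresh layout on every playthrough is the same.

```python
import random

WIDTH, HEIGHT, FLOOR_TILES = 40, 20, 300

def generate_level(seed=None):
    # Carve floor ('.') out of solid rock ('#') with a random walk;
    # every seed yields a different layout, so no two descents are alike.
    rng = random.Random(seed)
    grid = [["#"] * WIDTH for _ in range(HEIGHT)]
    x, y = WIDTH // 2, HEIGHT // 2
    carved = 0
    while carved < FLOOR_TILES:
        if grid[y][x] == "#":
            grid[y][x] = "."
            carved += 1
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 1), WIDTH - 2)
        y = min(max(y + dy, 1), HEIGHT - 2)
    return "\n".join("".join(row) for row in grid)

print(generate_level(seed=16))  # a fresh dungeon for every playthrough
```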

In Brevik’s initial vision, Diablo was even to have roguelike perma-death: if the player’s character was killed, “that character will be erased completely from the hard drive, and the player must start over from scratch.” Combat would be turn-based like in a roguelike, but heavily influenced by the game’s secondary inspiration, Julian Gollop’s 1994 strategy classic X-COM; Diablo would use a similar interface and action-points system. If it strikes you as strange that a game that would later be so commonly dismissed as nothing more than a mindless, frantic click-fest could have two such cerebral inspirations as these… well, such are the paradoxes of game development.

At any rate, Blizzard was suitably impressed, and agreed to fund and publish the game described in the pitch document. But several of the Blizzard folks who were present at the meeting have since claimed that they were already thinking about a major change: to make Diablo run in real time. Not long after work began on the game in earnest down in San Mateo, Blizzard began slowly but relentlessly to apply pressure to Condor — more specifically, to David Brevik — to make the switch.

Brevik was appalled. There was a certain kind of moment, familiar to every roguelike player, that he considered essential to recreate in Diablo. It’s that moment when you’re down to your last few hit points and are staring down the maw of a mind flayer or a wyvern, knowing that it’s about to hit you and kill you on its next turn unless you do something really clever and/or get really lucky on your own last turn before it can do so. Do you pull out that potion that you have no idea what it does and drink it down, hoping against hope that it’s a Potion of Protection? Or do you take one last swipe at the monster with your sword, hoping it’s as close to death as you are? Or do you try to get away by running down that nearby staircase, hoping against hope that it misses with its last lunge against your vulnerable backside? Most of the time, of course, you choose wrong and/or don’t get lucky, and another character goes to the graveyard. But every once in a while, it works out, your character lives to fight another day, and you shout and dance around the room and rush to tell your friends about it. That dopamine release is what keeps people coming back to roguelikes again and again. Brevik was understandably loath to lose it.

But the slow drip, drip, drip from Blizzard continued, seeping even into Condor’s own ranks. Knowing this, Allen Adham made a suggestion to Brevik in or around May of 1995: Why not ask your own people? Why not take a vote on whether just to try real time? If it doesn’t work, you can always go back to turn-based.

It was too reasonable a suggestion to refuse. Brevik asked for a show of hands among his own people of those interested in exploring real time, and was dismayed to see almost every hand in the room go up. Acceding to the will of the majority, he retreated into his office to have a good-faith go at something he was sure would never fit with the game he wanted to make. The quicker it was demonstrated to everyone that real time wasn’t a practical possibility, he thought, the quicker they could all get back to more productive endeavors. What followed instead was the project’s kairos moment.

I can remember the moment like it was yesterday. I was sitting and I was coding the game, and I had a warrior with a sword, and there was a skeleton on the other side of the screen. I’d been working on this code to make characters move smoothly, doing a whole bunch of testing, and we’d talked about how the controls would work.

We wanted it to be visceral. Click and swing, click and swing. We wanted it to automatically happen: if you clicked on the monster, your character would go over there and swing.

I remember very vividly: I clicked on the monster, the guy walked over, and he smashed this skeleton, and it fell apart onto the ground.

The light from heaven shone through the office down onto the keyboard. I said, “Oh, my God, this is so amazing!” I knew it was not only the right decision, but that Diablo was just going to be massive. It was really the most defining moment of my career, as well as for that genre of gaming.

A new genre was born in that moment, and it was really quite incredible to be the person coding it and creating it. I was just there by myself coding it up. It was pretty incredible.

Diablo may have lost that suspended instant of supreme tension that Brevik had always seen as essential, but it had gained something else, something that would make it a different sort of game entirely. Kelly Johnson, an artist who worked on the game:

In a turn-based game, when you win, you say, “Cool, my plan worked. I took time, I deliberated, I made a plan, and it worked out.” But in a real-time [game], it’s, “Wow! I won!” It’s visceral. You’re in the moment.

Everyone at Condor, including Brevik, was soon marveling that they had ever imagined Diablo being anything other than a real-time game. Millions of players would eventually feel the same way, as the game’s real-time nature became the core of its very identity.

The Diablo team with Diablo himself. We must hope that the keytar is intended ironically.

But before that could happen, Diablo had to be finished. In their excitement over not being rejected yet again, Condor had secured less than half a million dollars in funding from Blizzard, to support a team that numbered a dozen or more. By the beginning of 1996, that money was running out. The founders dipped deep into their personal bank accounts just to cover payroll, and their employees started racing one another to the bank on payday, knowing that the last checks deposited had a tendency to bounce. Meanwhile Blizzard was soaring. That Christmas, they had released Warcraft II, a refinement of its predecessor that blew up massively; it would sell 3 million copies before all was said and done.

The Schaefer brothers and David Brevik were stunned when their publisher came to them and asked whether they would be interested in being acquired; Blizzard was suddenly flush with cash, and the brain trust there was very, very excited about Diablo‘s prospects, such that they wanted to have it all for themselves. For the people making Diablo, the unexpected offer was a lifeline materializing out of thin air in front of a drowning man. In March of 1996, Condor became Blizzard North.

It was Blizzard that had pushed the erstwhile Condor to make Diablo run in real time. Now, it would be Blizzard South that drove another core feature into being. The initial pitch document had included “two-player and multiplayer game sessions via modem or network.” Since actual work had begun on the game, however, that aspiration had been all but forgotten. Yet Blizzard South knew how important multiplayer could be for a game in this new era of widespread network connectivity. They knew that multiplayer deathmatches had made DOOM what it was, and they knew that, long after players had finished Warcraft II’s single-player campaign, it was multiplayer that kept them going there as well, turning the game into a veritable institution. They wanted all that for Diablo, so much so that they made their only significant technical intervention into its development, sending programmers up to San Mateo to apply their Warcraft II expertise to Diablo’s multiplayer mode.

For Blizzard had huge plans for multiplayer games in general. Everyone could sense that a large percentage of future gaming would take place between real people on the Internet, that the “LAN parties” of the current age were just a temporary stopgap. Yet gaming over long distances was still technically challenging for the user, even as sessions had to be pre-planned with buddies who had bought the same game you had; spontaneous, pick-up-and-play matches were impossible. Various third-party companies were experimenting with ways to change both of these things, but everything was in a nascent, febrile state. Having money to spend as they did, Blizzard decided to introduce a game hosting and matchmaking service for their customers, under the name (and the Internet URL) of Battle.net. And they decided to offer it to buyers of their games for the low, low price of free, on the logic that the boxed-game sales it would generate would easily pay for its upkeep. It was a revolutionary idea, one that would prove as important to Blizzard’s rise into gaming’s stratosphere as any of their individual titles, iconic as they were. Thanks to Battle.net, you would always be able to find someone to play with, then be in a game with them within seconds. Patches would download automatically when you logged onto the service, a first step toward the always-online mentality that has taken over since. And Diablo was the very first Battle.net-enabled game. If it had achieved nothing else, it would be historically notable for this fact alone.

With Diablo being refined into an ever more effortless, frictionless experience, it was inevitable that another legacy of the roguelikes would fall away. The Southerners told the Northerners that perma-death just wouldn’t fly in the modern commercial market. David Brevik kvetched, but there was no way he was going to win this argument. Even if it hadn’t started out that way, Diablo was evolving into a lean-back rather than a lean-forward sort of game, designed to be more fun than it was demanding. Mistakes would happen in a game like that, and nobody wanted to lose a character he had spent eight hours building because he got distracted by the pizza guy ringing the doorbell. By way of compromise, the Southerners did agree to allow only one save slot, which fit in nicely with the game’s ethic of simplicity anyway. And of course, if anyone really wanted to play Diablo like a roguelike, nothing prevented it but the temptation of that single extant save file.

Warcraft II had made Blizzard one of the biggest names in mainstream gaming, on a level with id Software of DOOM and Quake fame and Westwood Studios, the makers of Command & Conquer, Blizzard’s great rival in the real-time-strategy space. Everything Blizzard did was now of interest to obsessive gamers. Diablo was to be their first game that ran under Windows 95 rather than MS-DOS; like Battle.net, this was another outcome of the company’s guiding principle of frictionless ease in all things. In the summer of 1996, Blizzard arranged to have a two-level demo of Diablo included on a Microsoft DirectX sampler disc. Interest in the game exploded. It became easily the most anticipated title of the 1996 holiday season.

That fact makes the next bit that much more remarkable. When the last possible instant to send the game out to be burned onto hundreds of thousands of CDs and shipped to stores all over the country in time for the Christmas buying season arrived, Blizzard took a long, hard look at its current state. It wasn’t in terrible shape, but it still had its fair share of minor niggles here and there. The vast majority of publishers would have said it was good enough and shipped it at this point — after all, they could always patch it later, right? (Wasn’t that one of the points of Battle.net?) But Blizzard decided to wait, resigning themselves to letting Christmas slip by without a major new release from them. It was better, they judged, to make sure Diablo was just exactly perfect when it did ship. More than anything else, it would be this thoroughgoing focus on quality — quality at almost any cost — that would make Blizzard one of the most extraordinary success stories in the entire history of gaming. From the beginning, their tender-aged founders understood something that eluded a bizarre number of their more grizzled peers: that one’s reputation is one’s most precious business asset of all, being laborious to build up and disconcertingly easy to lose. In an industry fueled by short-term hype, they took the long view. “If you truly put the game first,” says Allen Adham, “then decisions like holding a product an extra couple of months, even if it means missing Christmas, become fairly clear.” Gamers came to know that Blizzard would never let them down, and this knowledge fueled the company’s rise. The sacrificing of tens of thousands of sales the following month led to millions and millions of sales over the following decade.

So, Diablo missed the Christmas deadline, but not by much: the first copies wended their way onto store shelves between Christmas and New Year’s, when lots of younger gamers had gift checks from uncles and aunts and grandparents burning holes in their pockets. Others trotted down to their local software store and traded some less desirable Christmas present for Diablo. Retailers fended off the return-season blues by turning Diablo’s release into an event, plastering posters all over their walls and filling their display windows with mannequins of the devil from its cover art. All told, it’s questionable whether the belated release really hurt Diablo very much at all, even in the shortest of terms. By spring, it was clear both from the sales reports and from the level of activity on Battle.net that Diablo was the hottest computer game in the world. It was blowing up huge, even by comparison with Warcraft II. Diablo’s sales surpassed 1 million units within months.



Diablo’s eventual impact on the culture and practices of computer gaming was arguably more pronounced than that of any individual title since DOOM. It introduced phrases like “loot drop” into the gamer lexicon; it was the pioneer of a new era of easy online multiplayer gaming, between friends and strangers alike; it single-handedly dragged the entire genre of the CRPG back into public favor. This long shadow can make it oddly difficult to discuss as just a game. When I went back to play it recently for the first time in a quarter of a century — boy, I’m getting old! — I was impressed if not blown away by the experience. And yet, despite my best efforts, I couldn’t quite avoid allowing my opinions to be colored by some of what Diablo has wrought. We’ll get to that in due course. But first, Diablo the game…[1]

When you start a new adventure in the world of Diablo, you first choose your character from three fantasy archetypes: the warrior, who is best at bashing things with his big old sword; the rogue, who fights a little more surgically, preferring the bow and arrow; or the mage, who unlike his counterparts is pretty good with spells from the outset. But you don’t spend any time fussing about with statistics. You’re dropped into the hardscrabble village of Tristram, which has had the misfortune to be built over a demon’s not-so-final resting place, as soon as you’ve given your character a name. In Tristram, you can buy and sell in a few different shops and talk to a handful of villagers, but it’s all kept very short and sweet. Before you know it, you’ll be in the first dungeon, which is found beneath the graveyard of the local church.

You’ll have to fight your way through sixteen dungeon levels in all, divided into four sets of four that open up one after another, presenting ever more powerful monsters for your ever more powerful character to battle. In keeping with the game’s roguelike heritage, each level is procedurally generated. There is a modicum of story, even a cut scene here and there, but nothing you ever need to think too much about. (Although a fairly elaborate backstory does appear in the manual, it too is nothing you need to concern yourself with if you don’t want to. It was tacked on very late in development by Blizzard South, who realized that some gamers at least still liked to see such things.) There are also some pre-scripted quests to carry out, selected randomly from a pool of possibilities each time you start a new game. Most of these are given to you by the townspeople when you talk to them — but, again, all are extremely basic, coming down to “kill this monster” or “collect this object” (which, come to think of it, always involves killing the monster guarding it).

In practice, playing Diablo is a very simple loop. You go into the depths and make as much progress as you can against the hordes of enemies that await you there. Then you return topside to sell off the stuff you’ve collected that you don’t need, heal up and buy any potions or other equipment you think you’re going to need, and go downstairs again. Rinse and repeat, until you meet and hopefully kill Diablo himself. Unlike the typical epic CRPG, Diablo is intended to be a game you play over and over again. Thus the average playthrough takes only ten hours or so, as opposed to the hundred or more of its weightier brethren.

Blizzard North’s stated goal was to make Diablo “so easy your mom could play it.” Setting aside the condescension of their choice of words, they certainly achieved their goal in spirit. Fighting monsters is simply a matter of clicking on them, which causes your character to whack them with his melee weapon or fire off an arrow or spell at them. Tactics in the dungeons come down to common sense: whittling away at the edges of large groups of monsters instead of charging right into the middle of them, using doorways and narrow corridors to your advantage, keeping a healthy distance and using ranged attacks if you’re playing a rogue or a mage. That said, it does pay to learn the monsters’ strengths and weaknesses and tailor your attacks to them: skeletons, for example, are more vulnerable to attacks by blunt weapons such as maces than edged weapons such as swords.

The biggest source of tension is the question of when you should leave off in the dungeon and return to the town for succor. Usually when you die, it’s because you’ve pressed your luck just a bit too much. On the whole, though — and ironically given its line of descent through one of the most infamously unforgiving sub-genres in all of gaming — Diablo is one of the less intrinsically challenging games I’ve played in the course of writing these histories. If you do find yourself feeling under-powered and over-matched — perhaps because you made poor choices about where to allocate the ability points your character is awarded every time she levels up — you can always restart the game whilst retaining your existing character, complete with her current statistics and all of her current kit. Poor character-building choices or a general lack of skill can, in other words, always be compensated for with patient grinding.

Notice the auto-map overlaid onto the standard display…

In lieu of challenge, Diablo thrives on its polished addictiveness. Vanishingly few of its contemporaries can even begin to touch it in terms of intuitive playability. It’s clear that every last detail — every last window, every last hotkey, every last mouse click — was fussed over for hours and hours, until it was just what it ought to be. The auto-map is a thing of wonder that I have to call out for special praise. In CRPGs of the 1990s, such things are usually found in a separate window on the main display that is always too small for comfort and yet takes up too much precious screen real estate — or the auto-map can only be accessed on a separate screen, leaving you constantly flipping back and forth between the two views as you try to get somewhere. Diablo’s auto-map, on the other hand, appears as a transparent overlay right on top of the usual display, toggled on and off by pressing the TAB key. Like everything else here, it’s elegant and perfect, a brilliant stroke that could only have come about through dedicated, dogged iteration. You have to be in awe of the craftsmanship of this game. It knows precisely what it wants to be, and it achieves its best self in every respect.
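For the technically curious, the effect is easy to approximate today. Below is a minimal sketch, assuming the pygame library and invented map data; Diablo itself was of course written for Windows and DirectX, not in anything like this form.

```python
import pygame

pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()

# Hypothetical explored-tile data; in a real game this would come from
# the dungeon generator and the player's movement history.
explored = [(x, y) for x in range(20) for y in range(15) if (x + y) % 3]

show_map = False
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
        elif event.type == pygame.KEYDOWN and event.key == pygame.K_TAB:
            show_map = not show_map  # toggle the overlay, as Diablo does

    screen.fill((40, 20, 20))  # stand-in for the normal game view

    if show_map:
        # SRCALPHA gives a per-pixel-alpha surface, so the game view
        # stays visible beneath the map lines instead of being replaced.
        overlay = pygame.Surface((640, 480), pygame.SRCALPHA)
        for x, y in explored:
            pygame.draw.rect(overlay, (200, 200, 160, 90),
                             (x * 32, y * 32, 30, 30), 1)
        screen.blit(overlay, (0, 0))

    pygame.display.flip()
    clock.tick(60)

pygame.quit()
```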

That awe extends equally to the game’s aesthetics, which are nothing short of masterful; whatever Diablo lacks in set-piece storytelling, it makes up for in atmosphere. If I had to describe that atmosphere in one word, it would be “Gothic.” Diablo captures the side of the Middle Ages that all of those Tolkienesque CRPGs cheerfully ignore in the midst of all their elves and halflings romping merrily through the forest: the all-encompassing religion of Christianity, the almost tangible reality of another life that awaits after this one, which is as much a source of fear as comfort in the minds of the people. Diablo taps into something deep and almost primal in the human psyche, having more in common with The Exorcist than The Lord of the Rings, more in common with Hieronymus Bosch than Boris Vallejo. The shocking ending, which I won’t spoil here, is likewise more horror than fantasy. Diablo is lucky it wasn’t released during the Satanic Panic of the 1980s, given that it sports much of what all those concerned parents were looking for in Dungeons & Dragons and not quite finding.

The lair of the Butcher, one of the gorier locations in Diablo. “Fresh meat!”

Matt Uelmen’s amazingly sophisticated soundtrack, recorded partially on real instruments at a time when many games were still relying entirely on tinny MIDI sound fonts, could easily have played behind a big-budget horror movie. The “Town” theme, featuring the best use of a twelve-string guitar since the heyday of the Byrds, is especially unforgettable; it took me back instantly when I heard it again after 25 years away.


All that said, I won’t go so far as to say that Diablo itself is scary. It seems to me that gameplay that revolves around killing hundreds of monsters is incompatible with true horror. Horror depends on a feeling of powerlessness, whereas Diablo is, like almost all CRPGs, a power fantasy at bottom. Nevertheless, it’s as audiovisually focused and accomplished as any game I’ve ever seen. I say this even as I freely acknowledge that its unrelentingly dark atmosphere tends to wear thin with me pretty quickly. (For me, a bit of light and joy brings out the shadows that much more effectively.)

And sadly, that statement pretty much sums up my response to Diablo as a whole, which is the same today as it was 25 years ago. It does what it does brilliantly. I just wish I liked what it does a bit more. Let me tell you how I got on with it when I played it for this article…

Given its titanic importance, my first plan was to play through it three times, once for each of the character classes. I first bashed my way to the finish line as a warrior. As I did so, I admired all of the qualities described above, but I also found the experience a little hollow; I didn’t dread sitting down with the game on the couch after dinner each evening for an hour or two, but neither did I look forward to it all that much — and nor did my wife have to tell me twice that it was time for bed, as she has to when I’m playing some games. I came to regard my Diablo sessions much as I might, say, an old episode of Law & Order: a low-effort something to pass the time, which I could do while chatting intermittently with my wife about completely different things. When I finished the game, I put it on the shelf for several months, intending always to get back to it but never feeling all that excited about doing so. Finally, knowing I had to write this article soon, I forced myself to start a new game as a rogue, hoping that character might be more interesting to play. But this time I found myself actively bored; “been there, done that” was the dominant note. Halfway through, I just couldn’t muster the will to continue. I could admire Diablo for its craftsmanship, but I couldn’t love it.

What am I to make of this? Obviously, I’m in the group of people who just aren’t really in the market for what Diablo is selling — a group who tend to be as vocal in their criticisms as the game’s fans are in their praise. But I’m not eager to join the chest-beating grognards who call Diablo dumbed down, or who shout that it’s not even a real CRPG at all. (Is there anything more tedious than a semantic debate between intractably biased parties?) It’s actually not Diablo’s simplicity that puts me off; I’m much more likely to scold a game for being too complicated than for being too simple. And then too, over the years I’ve been writing these histories, I’ve found many — perhaps most — games from the 1980s and 1990s to be more rather than less difficult than I really need them to be, so it’s not precisely the lack of challenge that bothers me about Diablo either. Too easy is far, far better in my book than too hard.

On the other hand, I do tend to prefer human-crafted to procedurally-generated content in general, and Diablo doesn’t do anything to disabuse me of that notion. Its randomized nature means that its dungeons can only be a collection of rooms, corridors, and monsters, without the guileful tricks and traps and drama of the best dungeon crawlers of yore. Beyond that, and beyond an aesthetic presentation that isn’t quite to my taste, I think my lack of receptivity to Diablo is to do with the passivity of the experience. I’ve seen it described as a good “hangover game,” what with how little it actually asks of you. Even more tellingly, I’ve seen it called the gaming equivalent of candy: you can eat an awful lot of it without thinking much about it, but it doesn’t leave you feeling all that great afterward.

One nice thing about getting older is that you learn what makes you feel good and bad. I’ve long since learned, for instance, that I’m happiest if I don’t play games for more than a couple of hours per day, even on those rare occasions when I have time for more. But I want those hours to have substance — to yield fun stories to tell, interesting decisions to remember, strategies or puzzle solutions to muse about while I’m cooking dinner or working out or taking a walk, accomplishments to feel good about. For me, Diablo is peculiarly flat; I went, I saw, I clicked on monsters. For me, it feels less like a time waster than a waste of time. I almost find myself wishing the game wasn’t so superbly polished in every particular, just to relieve the monotony.

More substantively, I do see one aspect of Diablo as vaguely ominous in the larger context of gaming history: the way it uses stuff to do the heavy lifting of player motivation. As I mentioned above, “loot drops” became a thing in gaming with this game. Although CRPGs had been tempting and teasing players with the prospect of a new magic sword or armor as long as they had existed, Diablo put that temptation front and center, making it the main driver of its gameplay loop. In doing so, David Brevik and company consciously tapped into something besides the allure of the Gothic that is primal in human psychology. They liked to use the analogy of a slot machine: you clicked endlessly on monsters in the hope that eventually something really good would drop out of one of them. When I hear these anecdotes, I can’t help but think of the glassy-eyed zombies to be found in casinos from Shreveport to Macau, pulling the handles of the one-armed bandits again and again for hours, likewise waiting for something good to drop into their laps. Pat Wyatt, Blizzard’s vice president of research and development at the time of Diablo’s creation, proffers an even more disturbing metaphor: “Positive reinforcement is one of the hardest types of conditioning to break, which is why pets beg at the table: rewards may not happen very often, but every once in a while you get a scrap, so they keep begging.” In the decades after Diablo, this Pavlovian loop would be exploited mercilessly by cynical game makers, trapping players in unsatisfying cycles of addiction that drained their time and their wallet, leaving them with nothing but a few virtual trinkets to their names in a virtual world that would be gone in a year or two anyway.
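The slot-machine analogy maps directly onto code. Here is a minimal sketch of a weighted loot table; the tiers and weights are invented for illustration and have nothing to do with Blizzard’s actual drop rates.

```python
import random
from collections import Counter

# Invented tiers and weights, purely for illustration.
LOOT_TABLE = [
    ("nothing",      55),
    ("gold pile",    25),
    ("magic item",   17),
    ("unique item",   3),   # the rare jackpot that keeps the handle pulled
]

def roll_drop(rng=random):
    items, weights = zip(*LOOT_TABLE)
    return rng.choices(items, weights=weights, k=1)[0]

# A night's worth of monster clicking: mostly nothing, occasionally a prize.
# This is a variable-ratio reinforcement schedule, just like a slot machine's.
print(Counter(roll_drop() for _ in range(1000)))
```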

In the late 1990s, the dangerous addictiveness of loot drops was most in evidence in multiplayer Diablo, as played on Battle.net, which in its early years was a fascinating if ofttimes toxic social laboratory in its own right. I do have more to say about it, but I think I’ll reserve it for a future article which will look at this formative period of online gaming in a more holistic way.

Instead, let me say in conclusion today what I often say when I end a review on a downer note: that no game is for everyone, and no way of having fun is wrong, as long as you aren’t hurting anyone else or yourself. If you love Diablo, you’re in good company. It’s a fine, fine game by any objective measure. Whatever cynicism it might have inspired is on the conscience of the folks who displayed it; this game was made for all the right reasons. It’s a triumph of care and dedication from which many another studio could learn, then and now. Just be sure to remember that there’s a beautiful world out there with plenty of cloudless blue skies to contrast with Diablo’s perpetually sooty ones, and you’ll be just fine. Click away, my friends, click away!






(Sources: As was the case with my last article, I’m hugely indebted to David L. Craddock for Stay Awhile and Listen Book I and Book II, which I plundered for quotes with all the enthusiasm of a Diablo loot hunter. By all means, check out these books if you’re interested in learning more about the Blizzard story.

Magazine sources include Computer Gaming World of August 1996, December 1996, March 1997, April 1997, and May 1997; Retro Gamer 43 and 103. Online sources include Lee Hutchinson’s interview with David Brevik for Ars Technica, the Dev Game Club interview with Brevik, and Brevik’s Diablo post-mortem at the 2016 Game Developers Conference.

Diablo and its controversial expansion Hellfire are available as a single digital purchase at GOG.com.)

Footnotes
1 The commentary in this article deals only with the original Diablo. An expansion pack to the game called Hellfire, created out-of-house by the Sierra subsidiary Synergistic Software, was released in late 1997. The relationship between Blizzard North and Synergistic was plagued with discord from first to last, and David Brevik and many of his colleagues have since disowned many elements of Hellfire as fatal dilutions of their vision. So, we’ll honor Blizzard North’s original intentions here and stick to the base game.

A Dialog in Real Time (Strategy)

At the end of the 1990s, the two most popular genres in computer gaming were the first-person shooter and the real-time strategy game. They were so dominant that most of the industry’s executives seemed to want to publish little else. And yet at the beginning of the decade neither genre even existed.

The stories of how the two rose to such heady heights are a fascinating study in contrasts, of how influences in media can either go off like an explosion in a TNT factory or like the slow burn of a long fuse. Sometimes something appears and everyone knows instantly that it’s just changed everything; when the Beatles dropped Sgt. Pepper’s Lonely Hearts Club Band in 1967, there was no doubt that the proverbial goalposts in rock music had just been shifted. Other times, though, influence can take years to make itself felt, as was the case for another album of 1967, The Velvet Underground & Nico, about which Brian Eno would later famously say that it “only sold 10,000 copies, but everyone who bought it formed a band.”

Games are the same. Gaming’s Sgt. Pepper was DOOM, which came roaring up out of the shareware underground at the tail end of 1993 to sweep everything from its path, blowing away all of the industry’s extant conventional wisdom about what games would become and what role they would play in the broader culture. Gaming’s Velvet Underground, on the other hand, was the avatar of real-time strategy, which came to the world in the deceptive guise of a sequel in the fall of 1992. Dune II: The Building of a Dynasty sported its Roman numeral because its transnational publisher had gotten its transatlantic cables crossed and accidentally wound up with two separate games based on Frank Herbert’s epic 1965 science-fiction novel: one made in Paris, the other in Las Vegas. The former turned out to be a surprisingly evocative and playable fusion of adventure and strategy game, but it was the latter that would quietly — oh, so quietly in the beginning! — shift the tectonic plates of gaming.

For Dune II, which was developed by Westwood Studios and published by Virgin Games, really was the first recognizable implementation of the genre of real-time strategy as we have come to know it since. You chose one of three warring trading houses to play, then moved through a campaign made up of a series of set-piece scenarios, in which your first goal was always to make yourself an army by gathering resources and using them to build structures that could churn out soldiers, tanks, aircraft, and missiles, all of which you controlled by issuing them fairly high-level orders: “go here,” “harvest there,” “defend this building,” “attack that enemy unit.” Once you thought you were strong enough, you could launch your full-on assault on the enemy — or, if you weren’t quick enough, you might find yourself trying to fend off his attack. What made it so different from most of the strategy games of yore was right there in the name: in the fact that it all played out in real time, at a pace that ranged from the brisk to the frantic, making it a test of your rapid-fire mousemanship and your ability to think on your feet. Bits and pieces of all this had been seen before — perhaps most notably in Peter Molyneux and Bullfrog’s Populous and the Sega Genesis game Herzog Zwei — but Dune II was where it all came together to create a gaming paradigm for the ages.

That said, Dune II was very much a diamond in the rough, a game whose groundbreaking aspirations frequently ran up against the brick wall of its limitations. It’s likely to leave anyone who has ever played almost any other real-time-strategy game seething with frustration. It runs at a resolution of just 320 X 200, giving only the tiniest window into the battlefield; it only lets you select and control one unit at a time, making coordinated attacks and defenses hard to pull off; its scenarios are somewhat rote exercises, differing mainly in the number of enemy hordes they throw against you as you advance through the campaign rather than the nature of the terrain or your objectives. Even its fog of war is wonky: the whole battlefield is blank blackness until one of your units gets within visual range, after which you can see everything that goes on there forevermore, whether any of your units can still lay eyes on it or not. And it has no support whatsoever for the multiplayer free-for-alls that are for many or most players the biggest draw of the genre.

Certainly Virgin had no inkling that they had a nascent ludic revolution on their hands. They released Dune II with more of a disinterested shrug than a fulsome fanfare, having expended most of their promotional energies on the other Dune, which had come out just a few months earlier. It’s a testimony to the novelty of the gameplay experience that it did as well as it did. It didn’t become a massive hit, but it sold well enough to earn its budget back and then some on the strength of reasonably positive reviews — although, again, no reviewer had the slightest notion that he was witnessing the birth of what would be one of the two hottest genres in gaming six years in the future. Even Westwood seemed initially to regard Dune II as a one-and-done. They wouldn’t release another game in the genre they had just invented for almost three years.

But the gaming equivalent of all those budding bedroom musicians who listened to that Velvet Underground record was also out there in the case of Dune II. One hungry, up-and-coming studio in particular decided there was much more to be done with the approach it had pioneered. And then Westwood themselves belatedly jumped back into the fray. Thanks to the snowball that these two studios got rolling in earnest during the mid-1990s, the field of real-time strategy would be well and truly saturated by the end of the decade, the yin to DOOM’s yang. This, then, is the tale of those first few years of these two studios’ competitive dialog, over the course of which they turned the real-time strategy genre from a promising archetype into one of gaming’s two biggest, slickest crowd pleasers.


Blizzard is one of the most successful studios in the history of gaming, so much so that it now lends its name to the Activision Blizzard conglomerate, with annual revenues in the range of $7.5 billion. In 1993, however, it was Westwood, flying high off the hit dungeon crawlers Eye of the Beholder and Lands of Lore, that was by far the more recognizable name. In fact, Blizzard wasn’t even known yet as Blizzard.

The company had been founded in early 1991 by Allen Adham and Mike Morhaime, a couple of kids fresh out of university, on the back of a $15,000 loan from Morhaime’s grandmother. They called their venture Silicon & Synapse, setting it up in a hole-in-the-wall office in Costa Mesa, California. They kept the lights on initially by porting existing games from one platform to another for publishers like Interplay — the same way, as it happened, that Westwood had gotten off the ground almost a decade before. And just as had happened for Westwood, Silicon & Synapse gradually won opportunities to make their own games once they had proven themselves by porting those of others. First there was a little auto-racing game for the Super Nintendo called RPM Racing, then a pseudo-sequel to it called Rock ‘n’ Roll Racing, and then a puzzle platformer called The Lost Vikings, which appeared for the Sega Genesis, MS-DOS, and the Commodore Amiga in addition to the Super Nintendo. None of these titles took the world by storm, but they taught Silicon & Synapse what it took to create refined, playable, mass-market videogames from scratch. All three of those adjectives have continued to define the studio’s output for the past 30 years.

It was now mid-1993; Silicon & Synapse had been in business for about two and a half years already. Adham and Morhaime wanted to do something different — something bigger, something that would be suitable for computers only rather than the less capable consoles, a real event game that would get their studio’s name out there alongside the Westwoods of the world. And here there emerged another of their company’s future trademarks: rather than invent something new from whole or even partial cloth, they decided to start with something that already existed, but make it better than ever before, polishing it until it gleamed. The source material they chose was none other than Westwood’s Dune II, now relegated to the bargain bins of last year’s releases, but a perennial after-hours favorite at the Silicon & Synapse offices. They all agreed as to the feature they most missed in Dune II: a way to play it against other people, as you could its ancestor Populous. The bane of most multiplayer strategy games was their turn-based nature, which left you waiting around half the time while your buddy was playing. Real-time strategy wouldn’t have this problem of downtime.

That became the design brief for Warcraft: Orcs & Humans: remake Dune II but make it even better, and then add a multiplayer feature. And then, of course, actually try to sell the thing in all the ways Virgin had not really tried to sell its inspiration.

To say that Warcraft was heavily influenced by Dune II hardly captures the reality. Most of the units and buildings to hand have a direct correspondent in Westwood’s game. Even the menu of icons on the side of the screen is a virtual carbon copy — or at least a mirror image. “I defensively joked that, while Warcraft was certainly inspired by Dune II, [our] game was radically different,” laughs Patrick Wyatt, the lead programmer and producer on the project. “Our radar mini-map was in the upper left corner of the screen, whereas theirs was in the bottom right corner.”

In the same spirit of change, Silicon & Synapse replaced the desert planet of Arrakis with a fantasy milieu pitting, as the subtitle would suggest, orcs against humans. The setting and the overall look of Warcraft owe almost as much to the tabletop miniatures game Warhammer as the gameplay does to Dune II; a Warhammer license was seriously considered, but ultimately rejected as too costly and potentially too restrictive. Years later, Wyatt’s father would give him a set of Warhammer miniatures he’d noticed in a shop: “I found these cool toys and they reminded me a lot of your game. You might want to have your legal department contact them because I think they’re ripping you off.”

Suffice to say, then, that Warcraft was even more derivative than most computer games. The saving grace was the same that it would ever be for this studio: that they executed their mishmash of influences so well. The squishy, squint-eyed art is stylized like a cartoon, a wise choice given that the game is still limited to a resolution of just 320 X 200, so that photo-realism is simply not on the cards. The overall look of Warcraft has more in common with contemporary console games than the dark, gritty aesthetic that was becoming so popular on computers. The guttural exclamations of the orcs and the exaggerated Monty Python and the Holy Grail-esque accents of the humans, all courtesy of regular studio staffers rather than outside voice actors, become a chorus line as you order them hither and yon, making Dune II seem rather stodgy and dull by comparison. “We felt too many games took themselves too seriously,” says Patrick Wyatt. “We just wanted to entertain people.”

Slavishly indebted though it is to Dune II in all the broad strokes, Warcraft doesn’t neglect to improve on its inspiration in those nitty-gritty details that can make the difference between satisfaction and frustration for the player. It lets you select up to four units and give them orders at the same time by simply dragging a box around them, a quality-of-life addition whose importance is difficult to overstate, one so fundamental that no real-time-strategy game from this point forward would dare not to include it. Many more keyboard shortcuts are added, a less technically impressive addition but one no less vital to the cause of playability when the action starts to heat up. There are now two resources you need to harvest, lumber and gold, in place of Dune II’s all-purpose spice. Units are now a little more intelligent about interpreting your orders, such that they no longer blithely ignore targets of opportunity, or let themselves get mauled to death without counterattacking just because you haven’t explicitly told them to. Scenario design is another area of marked improvement: whereas every Dune II scenario is basically the same drill, just with ever more formidable enemies to defeat, Warcraft’s are more varied and arise more logically out of the story of the campaign, including a couple of special scenarios with no building or gathering at all, where you must return a runaway princess to the fold (as the orcs) or rescue a stranded explorer (as the humans).
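(For the technically curious, the drag-box idea reduces to a few lines of code. The sketch below is purely my own illustration in C++, not anything drawn from Blizzard’s source; the types and names are hypothetical.)

```cpp
#include <vector>

// A hypothetical unit with a pixel position on the battlefield.
struct Unit {
    float x = 0, y = 0;
    bool selected = false;
};

// The axis-aligned rectangle the player drags out with the mouse.
struct DragBox {
    float left, top, right, bottom;
    bool contains(const Unit& u) const {
        return u.x >= left && u.x <= right && u.y >= top && u.y <= bottom;
    }
};

// Select every unit inside the box, up to a cap: four in the first
// Warcraft, nine in Warcraft II, unlimited in Command & Conquer.
void selectUnits(std::vector<Unit>& units, const DragBox& box, int cap) {
    for (Unit& u : units) u.selected = false;  // clear the old selection
    int picked = 0;
    for (Unit& u : units) {
        if (picked == cap) break;              // honor the engine's limit
        if (box.contains(u)) {
            u.selected = true;
            ++picked;
        }
    }
}
```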

The orc on the right who’s stroking his “sword” looks so very, very wrong — and this screenshot doesn’t even show the animation…

And, as the cherry on top, there was multiplayer support. Patrick Wyatt finished his first, experimental implementation of it in June of 1994, then rounded up a colleague in the next cubicle over so that they could become the first two people ever to play a full-fledged real-time-strategy game online. “As we started the game, I felt a greater sense of excitement than I’d ever known playing any other game,” he says.

It was just this magic moment, because it was so invigorating to play against a human and know that it wasn’t some stupid AI. It was a player who was smart and doing his absolute best to crush you. I knew we were making a game that would be fun, but at that moment I knew the game would absolutely kick ass.

While work continued on Warcraft, the company behind it was going through a whirlwind of changes. Recognizing at long last that “Silicon & Synapse” was actually a pretty terrible name, Adham and Morhaime changed it to Chaos Studios, which admittedly wasn’t all that much better, in December of 1993. Two months later, they got an offer they couldn’t refuse: Davidson & Associates, a well-capitalized publisher of educational software that was looking to break into the gaming market, offered to buy the freshly christened Chaos for the princely sum of $6.75 million. It was a massive over-payment for what was in all truth a middling studio at best, such that Adham and Morhaime felt they had no choice but to accept, especially after Davidson vowed to give them complete creative freedom. Three months after the acquisition, the founders decided they simply had to find a decent name for their studio before releasing Warcraft, their hoped-for ticket to the big leagues. Adham picked up a dictionary and started leafing through it. He hit pay dirt when his eyes flitted over the word “blizzard.” “It’s a cool name! Get it?” he asked excitedly. And that was that.

So, Warcraft hit stores in time for the Christmas of 1994, with the name of “Blizzard Entertainment” on the box as both its developer and its publisher — the wheels of the latter role being greased by the distributional muscle of Davidson & Associates. It was not immediately heralded as a game that would change everything, any more than Dune II had been; real-time strategy continued to be more of a slowly growing snowball than the ton of bricks to the side of the head that the first-person shooter had been. Computer Gaming World magazine gave Warcraft a cautious four stars out of five, saying that “if you enjoy frantic real-time games and if you don’t mind a linear structure in your strategic challenges, Warcraft is a good buy.” At the same time, the extent of the game’s debt to Dune II was hardly lost on the reviewer: “It’s a good thing for Blizzard that there’s no precedent for ‘look and feel’ lawsuits in computer entertainment.”[1]

Warcraft would eventually sell 400,000 units, bettering Dune II’s numbers by a factor of four or more. As soon as it became clear that it was doing reasonably well, Blizzard started on a sequel.


Out of everyone who looked at Warcraft, no one did so with more interest — or with more consternation at its close kinship with Dune II — than the folks at Westwood. “When I played Warcraft, the similarities between it and Dune II were pretty… blatant, so I didn’t know what to think,” says the Westwood designer Adam Isgreen. Patrick Wyatt of Blizzard got the impression that his counterparts “weren’t exactly happy” at the slavish copying when they met up at trade shows, though he “reckoned they should have been pleased that we’d taken their game as a base for ours.” Only gradually did it become clear why Warcraft’s existence was a matter of such concern for Westwood: because they themselves had finally decided to make another game in the style of Dune II.

The game that Westwood was making could easily have wound up looking even more like the one that Blizzard had just released. The original plan was to call it Command & Conquer: Fortress of Stone and to set it in a fantasy world. (Westwood had been calling their real-time-strategy engine “Command & Conquer” since the days of promoting Dune II.) “It was going to have goldmines and wood for building things. Sound familiar?” chuckles Westwood’s co-founder Louis Castle. “There were going to be two factions, humans and faerie folk… pretty fricking close to orcs versus humans.”

Some months into development, however, Westwood decided to change directions, to return to a science-fictional setting closer to that of Dune II. For they wanted their game to be a hit, and it seemed to them that fantasy wasn’t the best guarantee of such a thing: CRPGs were in the doldrums, and the most recent big strategy release with a fantasy theme, MicroProse’s cult-classic-to-be Master of Magic, hadn’t done all that well either. Foreboding near-future stories, however, were all the rage; witness the stellar sales of X-COM, another MicroProse strategy game of 1994. “We felt that if we were going to make something that was massive,” says Castle, “it had to be something that anybody and everybody could relate to. Everybody understands a tank; everybody understands a guy with a machine gun. I don’t have to explain to them what this spell is.” Westwood concluded that they had made the right decision as soon as they began making the switch in software: “Tanks and vehicles just felt better.” The game lost its subtitle to become simply Command & Conquer.

While the folks at Blizzard were plundering Warhammer for their units and buildings, those at Westwood were trolling the Jane’s catalogs of current military hardware and Soldier of Fortune magazine. “We assumed that anything that was talked about as possibly coming was already here,” says Castle, “and that was what inspired the units.” The analogue of Dune II’s spice — the resource around which everything else revolved — became an awesomely powerful space-born element come to earth known as tiberium.

Westwood included most of the shortcuts and conveniences that Blizzard had built into Warcraft, but went one or two steps further more often than not. For example, they also made it possible to select multiple units by dragging a box around them, but in their game there was no limit to the number of units that could be selected in this way. The keyboard shortcuts they added not only let you quickly issue commands to units and buildings, but also jump around the map instantly to custom viewpoints you could define. And up to four players rather than just two could now play together at once over a local network or the Internet, for some true mayhem. Then, too, scenario design was not only more varied than in Dune II but was even more so than in Warcraft, with a number of “guerilla” missions in the campaigns that involved no resource gathering or construction. It’s difficult to say to what extent these were cases of parallel innovation and to what extent they were deliberate attempts to one-up what Warcraft had done. It was probably a bit of both, given that Warcraft was released a good nine months before Command & Conquer, giving Westwood plenty of time to study it.

But other innovations in Command & Conquer were without any precedent. The onscreen menus could now be toggled on and off, for instance, a brilliant stroke that gave you a better view of the battlefield when you really needed it. Likewise, Westwood differentiated the factions in the game in a way that had never been done before. Whereas the different houses in Dune II and the orcs and humans in Warcraft corresponded almost unit for unit, the factions in Command & Conquer reflected sharply opposing military philosophies, demanding markedly different styles of play: the establishment Global Defense Initiative had slow, strong, and expensive units, encouraging a methodical approach to building up and husbanding your forces, while the terroristic Brotherhood of Nod had weaker but faster and cheaper minions better suited to madcap kamikaze rushes than carefully orchestrated combined-arms operations.

Yet the most immediately obvious difference between Command & Conquer and Warcraft was all the stuff around the game. Warcraft had been made on a relatively small budget with floppy disks in mind. It sported only a brief opening cinematic, after which scenario briefings consisted of nothing but scrolling text and a single voice over a static image. Command & Conquer, by contrast, was made for CD-ROM from the outset, by a studio with deeper pockets that had invested a great deal of time and energy into both 3D animation and full-motion video, that trendy art of incorporating real-world actors and imagery into games. The much more developed story line of Command & Conquer is advanced by little between-mission movies that, if not likely to make Steven Spielberg nervous, are quite well-done for what they are, featuring as they do mostly professional performers — such as a local Las Vegas weatherman playing a television-news anchorman — who were shot by a real film crew in Westwood’s custom-built blue-screen studio. Westwood’s secret weapon here was Joseph Kucan, a veteran theater director and actor who oversaw the film shoots and personally played the charismatic Nod leader Kane so well that he became the very face of Command & Conquer in the eyes of most gamers, arguably the most memorable actual character ever associated with a genre better known for its hordes of generic little automatons. Louis Castle reckons that at least half of Command & Conquer’s considerable budget went into the cut scenes.

The game was released with high hopes in August of 1995. Computer Gaming World gave it a pretty good review, four stars out of five: “The entertainment factor is high enough and the action fast enough to please all but the most jaded wargamers.”

The gaming public would take to it even more than that review might imply. But in the meantime…


As I noted in an earlier article, numbered sequels weren’t really commonplace for strategy games prior to the mid-1990s. Blizzard had originally imagined Warcraft as a strategy franchise of a different stripe: each game bearing the name would take the same real-time approach into a completely different milieu, as SSI was doing at the time with their “5-Star General” series of turn-based strategy games that had begun with Panzer General and continued with the likes of Fantasy General and Star General. But Blizzard soon decided to make their sequel a straight continuation of the first game, an approach to which real-time strategy lent itself much more naturally than more traditional styles of strategy game; the set-piece story of a campaign could, after all, always be continued using all the ways that Hollywood had long since discovered for keeping a good thing going. The only snafu was that either the orcs or the humans could presumably have won the war in the first game, depending on which side the player chose. No matter: Blizzard decided the sequel would be more interesting if the orcs had been the victors and ran with that.

Which isn’t to say that building upon its predecessor’s deathless fiction was ever the real point of Warcraft II: Tides of Darkness. Blizzard knew now that they had a competitor in Westwood, and were in any case eager to add to the sequel all of the features and ideas that time had not allowed them to include in the first game. There would be waterways and boats to sail on them, along with oil, a third resource, one that could only be mined at sea. Both sides would get new units to play with, while elves, dwarves, trolls, ogres, and goblins would join the fray as allies of one of the two main racial factions. The interface would be tweaked with another welcome shortcut: selecting a unit and right-clicking somewhere would cause it to carry out the most logical action there without having to waste time choosing from a menu. (After all, if you selected a worker unit and sent him to a goldmine, you almost certainly wanted him to start collecting gold. Why should you have to tell the game the obvious in some more convoluted fashion?)
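(In code, such a context-sensitive right-click might look something like the little dispatcher below. Again, this is only my own hypothetical C++ illustration of the idea, not Blizzard’s actual logic.)

```cpp
// Hypothetical classification of whatever lies under the cursor.
enum class Target { OpenGround, GoldMine, Forest, EnemyUnit };

enum class Order { Move, Harvest, Chop, Attack };

// A right-click issues the most plausible order for the selected unit,
// so the player never has to spell out the obvious through menus.
Order rightClickOrder(Target target, bool unitIsWorker) {
    switch (target) {
        case Target::GoldMine:  return unitIsWorker ? Order::Harvest : Order::Move;
        case Target::Forest:    return unitIsWorker ? Order::Chop : Order::Move;
        case Target::EnemyUnit: return Order::Attack;
        default:                return Order::Move;
    }
}
```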

But perhaps the most vital improvement was in the fog of war. The simplistic implementations of same seen in the first Warcraft and Command & Conquer were inherited from Dune II: areas of the map that had been seen once by any of your units were revealed permanently, even if said units went away or were destroyed. Blizzard now made it so that you would see only a back-dated snapshot of areas currently out of your units’ line of sight, reflecting what was there the last time one of your units had eyes on them. This innovation, no mean feat of programming on the part of Patrick Wyatt, brought a whole new strategic layer to the game. Reconnaissance suddenly became something you had to think about all the time, not just once.
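(A sketch may help clarify the trick. The following is my own speculative reconstruction in C++, assuming a simple tile grid; Wyatt’s real implementation was surely more sophisticated, but the principle of the back-dated snapshot is the same.)

```cpp
#include <cstddef>
#include <vector>

// Per-tile visibility: never scouted, scouted but currently unseen
// (render a stale snapshot), or within some unit's line of sight now.
enum class Fog { Unexplored, Remembered, Visible };

struct Tile {
    int liveContents = 0;      // what actually occupies the tile right now
    int lastSeenContents = 0;  // what the player saw when last in sight
    Fog fog = Fog::Unexplored;
};

// Run once per game tick, after movement. 'inSight[i]' is true when any
// of the player's units can currently see tile i.
void updateFog(std::vector<Tile>& map, const std::vector<bool>& inSight) {
    for (std::size_t i = 0; i < map.size(); ++i) {
        if (inSight[i]) {
            map[i].fog = Fog::Visible;
            map[i].lastSeenContents = map[i].liveContents;  // refresh snapshot
        } else if (map[i].fog == Fog::Visible) {
            map[i].fog = Fog::Remembered;  // freeze the snapshot as it drops from view
        }
    }
}

// The renderer then draws liveContents for Visible tiles, the back-dated
// lastSeenContents for Remembered tiles, and blackness for Unexplored ones.
```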

Other improvements were not so conceptually groundbreaking, but no less essential for keeping ahead of the Joneses (or rather the Westwoods). For example, Blizzard raised the screen-resolution stakes, from 320 X 200 to 640 X 480, even as they raised the number of people who could play together online from Command & Conquer’s four to eight. And, while there was still a limit on the number of units you could select at one time using Blizzard’s engine, that limit at least got raised from the first Warcraft’s four to nine.

The story and its presentation, however, didn’t get much more elaborate than last time out. While Westwood was hedging its bets by keeping one foot in the “interactive movie” space of games like Wing Commander III, Blizzard was happy to “just” make Warcraft a game. The two series were coming to evince very distinct personalities and philosophies, just as gamers were sorting themselves into opposing groups of fans — with a large overlap of less partisan souls in between them, of course.

Released in December of 1995, Warcraft II managed to shake Computer Gaming World free of some of its last reservations about the burgeoning genre of real-time strategy, garnering four and a half stars out of five: “If you enjoy fantasy gaming, then this is a sure bet for you.” It joined Command & Conquer near the top of the bestseller lists, becoming the game that well and truly made Blizzard a name to be reckoned with, a peer in every sense with Westwood.

Meanwhile, and despite the sometimes bitter rivalry between the two studios and their fans, Command & Conquer and Warcraft II together made real-time strategy into a commercial juggernaut. Both games became sensations, with no need to shrink from comparison to even DOOM in terms of their sales and impact on the culture of gaming. Each eventually sold more than 3 million copies, numbers that even the established Westwood, much less the upstart Blizzard, had never dreamed of reaching before, enough to enshrine both games among the dozen or so most popular computer games of the entire 1990s. More than three years after real-time strategy’s first trial run in Dune II, the genre had arrived for good and all. Both Westwood and Blizzard rushed to get expansion packs of additional scenarios for their latest entries in the genre to market, even as dozens of other developers dropped whatever else they were doing in order to make real-time-strategy games of their own. Within a couple of years, store shelves would be positively buckling under the weight of their creations — some good, some bad, some more imaginative, some less so, but all rendered just a bit anonymous by the sheer scale of the deluge. And yet even the most also-ran of the also-rans sold surprisingly well, which explained why they just kept right on coming. Not until well into the new millennium would the tide begin to slacken.


With Command & Conquer and Warcraft II, Westwood and Blizzard had arrived at an implementation of real-time strategy that even the modern player can probably get on with. Yet there is one more game that I just have to mention here because it’s so loaded with a quality that the genre is known for even less than its characters: that of humor. Command & Conquer: Red Alert is as hilarious as it is unexpected, the only game of this style that’s ever made me laugh out loud.

Red Alert was first envisioned as a scenario pack that would move the action of its parent game to World War II. But two things happened as work progressed on it: Westwood decided it was different enough from the first game that it really ought to stand alone, and, as designer Adam Isgreen says, “we found straight-up history really boring for a game.” What they gave us instead of straight-up history is bat-guano insane, even by the standards of videogame fictions.

We’re in World War II, but in a parallel timeline, because Albert Einstein — why him? I have no idea! — chose to travel back in time on the day of the Trinity test of the atomic bomb and kill Adolf Hitler. Unfortunately, all that’s accomplished is to make world conquest easier for Joseph Stalin. Now Einstein is trying to save the democratic world order by building ever more powerful gadgets for its military. Meanwhile the Soviet Union is experimenting with the more fantastical ideas of Nikola Tesla, which in this timeline actually work. So, the battles just keep getting crazier and crazier as the game wears on, with teleporters sending units jumping instantly from one end of the map to the other, Tesla coils zapping them with lightning, and a fetching commando named Tanya taking out entire cities all by herself when she isn’t chewing the scenery in the cut scenes. Those actually display even better production values than the ones in the first game, but the script has become pure, unadulterated camp worthy of Mel Brooks, complete with a Stalin who ought to be up there singing and dancing alongside Der Führer in Springtime for Hitler. Even our old friend Kane shows up for a cameo. It’s one of the most excessive spectacles of stupidity I’ve ever seen in a game… and one of the funniest.

Joseph Stalin gets rough with an underling. When you don’t have the Darth Vader force grip, you have to do things the old-fashioned way…

Up there at the top is the killer commando Tanya, who struts across the battlefield with no regard for proportion.

Released in the dying days of 1996, Red Alert didn’t add that much that was new to the real-time-strategy template, technically speaking; in some areas such as fog of war, it still lagged behind the year-old Warcraft II. Nonetheless, it exudes so much joy that it’s by far my favorite of the games I’ve written about today. If you ask me, it would have been a better gaming world had the makers of at least a few of the po-faced real-time-strategy games that followed looked here for inspiration. Why not? Red Alert too sold in the multiple millions.






(Sources: the book Stay Awhile and Listen, Book I by David L. Craddock; Computer Gaming World of January 1995, March 1995, December 1995, March 1996, June 1996, September 1996, December 1996, March 1997, June 1997, and July 1997; Retro Gamer 48, 111, 128, and 148; The One of January 1993; the short film included with the Command & Conquer: The First Decade game collection. Online sources include Patrick Wyatt’s recollections at his blog Code of Honor, Dan Griliopoulos’s collection of interviews with Westwood alumni at Funambulism, Soren Johnson’s interview with Louis Castle for his Designer’s Notes podcast, and Richard Moss’s real-time-strategy retrospective for Ars Technica.

Warcraft: Orcs & Humans and Warcraft II: Tides of Darkness are available as digital purchases at GOG.com. The first Command & Conquer and Red Alert are available in remastered versions as a bundle from Steam.)

Footnotes
1 This statement was actually not correct; makers of standup arcade games of the classic era and the makers of Tetris had successfully cowed the cloning competition in the courts.
 
