Diablo

All of us had become disappointed with computer RPGs because they were going in the opposite direction of where we thought they should be going. They were becoming story- and stat-laden, really appealing to a super-small niche of super RPG geeks — which we were in a way, but that wasn’t really our style.

So, when [David] Brevik mentioned these roguelike games, it was kind of a natural. “Yeah, let’s take that cool, addictive structure and modernize it. Let’s strip away the stuff that’s turning off a lot of game fans from RPGs.”

— Max Schaefer of Blizzard North

A palpable sense of ennui dogged the Consumer Electronics Shows of 1994. The venerable semiannual expo where such landmark gaming hardware as the Atari VCS, the Commodore 64 and Amiga, the Nintendo Entertainment System and Super Nintendo Entertainment System, and the Sega Genesis had been seen for the first time seemed somehow past its sell-by date now. Attendance at the Summer CES in particular was down in a big way, so much so that the organizers would move the event out of its long-standing home in Chicago’s McCormick Place the following year and turn it into a traveling exhibition in the hope of drumming up some much-needed excitement. In the meantime, the makers of gaming software had an especially underwhelming time of it in Chicago that year: as usual, they were treated as second-class citizens by the organizers, relegated to the hall’s basement so that the choicer spaces were kept free for cutting-edge toasters, refrigerators, and microwave ovens.

Among the games people having the worst time of all were the folks behind a tiny San Mateo, California, studio called Condor, Incorporated. David Brevik and his co-founders, the brothers Max and Erich Schaefer, were ostensibly at the show to demonstrate their very first finished original game, a Genesis title called Justice League: Task Force. But they knew the game was no great shakes. They had made exactly what their publisher, the financially troubled Japanese giant Sunsoft, had ordered them to make in rather pedantic detail: a blatant clone of yesteryear’s massive hit Street Fighter II, with DC Comics superheroes inserted in place of its inspiration’s pugilists. They felt it was competently executed, but knew as well as anyone that it was no more than a quickie placeholder product for a five-year-old console that was soon due to be superseded by the next-generation Sega Saturn.

Their ulterior motive for being at CES was something else entirely. Brevik had an idea for a computer game called Diablo, which he had been slowly expanding upon ever since he had lived with his family at the foot of the California mountain of that name back in the mid-1980s. Now, he felt its time had come; he desperately wanted to interest a publisher in it. But every executive he talked to at the show started shaking his head as soon as he saw the first line of the pitch document, which stated that it was “a proposal for a role-playing game.” For CRPGs were dead and buried according to the industry’s conventional wisdom, having nothing to offer in an era when multimedia flash and 3D mayhem reigned supreme. They were quaint at best, deadly boring at worst, as their recent sales figures reflected.

Thoroughly disheartened by his proposal’s reception, Brevik duly turned up with the Schaefer brothers at the appointed time to show Justice League to the assembled press. And here they all got a shock. They learned only minutes before taking the stage that Sunsoft had actually arranged to make a second version of the game for the Super Nintendo, sending the same design brief to another little studio, Blizzard Entertainment of Costa Mesa, California. Both development teams could immediately see that the other had done a pretty solid, professional job with a less than inspiring project. Indeed, they were struck by how similar the two end results were to one another.

They soon learned that they had much more in common. Blizzard too had been founded on a shoestring by three games-obsessed kids just out of university, in this case by the names of Allen Adham, Mike Morhaime, and Frank Pearce. And they too had become all too familiar with workaday projects like Justice League, which they too saw as a way for their new, unproven studio to pay its dues on the way to bigger, better things to come. The big difference was that Blizzard was a few years older, and thus that much further along the road to becoming a marquee studio. They had recently been acquired by the educational-software giant Davidson & Associates, whose distribution pipeline they would be able to use to publish their own games under their own imprint. Now, they were hard at work finishing up the project that they hoped would change everything for them: a game for computers only called Warcraft. They took the Condor boys into a cramped back room and showed it to them. “I had no idea at that point that Warcraft would become an historically important game,” says Max Schaefer. “It just looked cool.” A relationship was forged. The Blizzard folks said they were too busy to think about anything else just then, but they promised to listen to Condor’s pitch for Diablo once Warcraft was out the door.

They were true to their word. In January of 1995, with Warcraft on store shelves and selling well, everyone came together again in Blizzard’s conference room to talk about Diablo. No one in that room was unaware of the concerns that had caused publisher after publisher to walk away from the proposal; in fact, in many ways they shared them. CRPGs had glutted the market just a few years earlier, a bewildering procession of elves and dwarves and dragons. For the hardcore aficionados, all of the different games and series were (and still are) possessed of their own distinctive personalities and intricate subtleties, but it was hard for everybody else to keep Dungeons & Dragons separate from Dungeon Master, Might and Magic separate from The Magic Candle. I have a friend who likes to say that there are only two blues songs: “the fast one and the slow one.” Likewise, one might go so far as to say that for most gamers there were only two CRPGs, the first-person Wizardry style and the overhead Ultima style. As computers had gotten more capable, games of the former type had gotten ever more complex in terms of rules, while those of the latter type had threatened to collapse under the sheer weight of their lore and verbiage, which minuscule computer memories no longer restricted. Those sorts of things were not what the Condor guys were into at all. Sure, they had all played tabletop Dungeons & Dragons as kids, but world-building and storytelling hadn’t been their primary interest. “It was all about killing monsters and finding good stuff,” says Max Schaefer.

And so that was what Diablo was to be about as well. “As games today substitute gameplay with multimedia extravaganzas and strive toward needless scale and complexity,” read the pitch document, “we seek to reinvigorate the hack-and-slash, feel-good gaming audience. Emphasis will be on exploration, conflict, and character development.”

Diablo’s most direct influences by far were the roguelike games, which David Brevik had played for hundreds upon hundreds of hours while a student at university. From roguelikes it inherited its minimalist narrative — amounting to little more than “make it to the last level and kill the boss of bosses Diablo” — as well as randomized dungeons that would be new with every playthrough, along with the randomized “good stuff” they contained. Brevik’s favorite roguelike of all was Angband, which distinguished itself from the likes of the original Rogue and its spiritual successor NetHack by having a town to serve as the player’s base of operations for her expeditions into the nearby dungeon, resulting in a slightly more relaxed pacing and introducing an economic element. Diablo was to duplicate this structure exactly: “Forays into the dungeon will be broken up by trips to the town located above. In the town, a general store will provide standard equipment and repairs, and will also purchase extra equipment from the player. A temple will provide healing for injured and sick characters. Training and other facilities may also be available.”
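To give a flavor of what “randomized dungeons that would be new with every playthrough” means in code, here is a minimal Python sketch in the roguelike spirit: scatter rectangular rooms across a grid of rock, then cut corridors between them. It illustrates the general technique only; Condor’s actual generator was never published, and all the numbers here are invented for illustration.

```python
import random

WIDTH, HEIGHT, ROOMS = 60, 20, 6

def generate_level(seed=None):
    rng = random.Random(seed)
    grid = [["#"] * WIDTH for _ in range(HEIGHT)]   # start as solid rock
    centers = []
    for _ in range(ROOMS):
        w, h = rng.randint(4, 10), rng.randint(3, 6)
        x, y = rng.randint(1, WIDTH - w - 2), rng.randint(1, HEIGHT - h - 2)
        for row in range(y, y + h):
            for col in range(x, x + w):
                grid[row][col] = "."                 # carve out the room floor
        centers.append((x + w // 2, y + h // 2))
    for (x1, y1), (x2, y2) in zip(centers, centers[1:]):
        for col in range(min(x1, x2), max(x1, x2) + 1):
            grid[y1][col] = "."                      # horizontal corridor
        for row in range(min(y1, y2), max(y1, y2) + 1):
            grid[row][x2] = "."                      # vertical corridor
    return "\n".join("".join(row) for row in grid)

print(generate_level())   # a different layout on every run
```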

In Brevik’s initial vision, Diablo was even to have roguelike perma-death: if the player’s character was killed, “that character will be erased completely from the hard drive, and the player must start over from scratch.” Combat would be turn-based like in a roguelike, but heavily influenced by the game’s secondary inspiration, Julian Gollop’s 1994 strategy classic X-COM; Diablo would use a similar interface and action-points system. If it strikes you as strange that a game that would later be so commonly dismissed as nothing more than a mindless, frantic click-fest could have two such cerebral inspirations as these… well, such are the paradoxes of game development.
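Since this turn-based incarnation of Diablo never shipped, code can only be conjectural, but the kernel of an action-points system is small enough to sketch. The class, verbs, and costs below are all my invention, in the general spirit of X-COM’s time units rather than any documented Diablo design.

```python
MOVE_COST, ATTACK_COST = 2, 5   # hypothetical costs, for illustration only

class Actor:
    def __init__(self, name, action_points):
        self.name = name
        self.max_ap = action_points

    def take_turn(self, commands):
        ap = self.max_ap                  # the budget refreshes every turn
        for verb, cost in commands:
            if cost > ap:                 # out of points: the turn is over
                print(f"{self.name}: not enough AP for {verb}")
                break
            ap -= cost
            print(f"{self.name}: {verb} ({ap} AP left)")

hero = Actor("warrior", action_points=8)
hero.take_turn([("move", MOVE_COST), ("move", MOVE_COST), ("attack", ATTACK_COST)])
```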

At any rate, Blizzard was suitably impressed, and agreed to fund and publish the game described in the pitch document. But several of the Blizzard folks who were present at the meeting have since claimed that they were already thinking about a major change: to make Diablo run in real time. Not long after work began on the game in earnest down in San Mateo, Blizzard began slowly but relentlessly to apply pressure to Condor — more specifically, to David Brevik — to make the switch.

Brevik was appalled. There was a certain kind of moment, familiar to every roguelike player, that he considered essential to recreate in Diablo. It’s that moment when you’re down to your last few hit points and are staring down the maw of a mind flayer or a wyvern, knowing that it’s about to hit you and kill you on its next turn unless you do something really clever and/or get really lucky on your own last turn before it can do so. Do you pull out that potion that you have no idea what it does and drink it down, hoping against hope that it’s a Potion of Protection? Or do you take one last swipe at the monster with your sword, hoping it’s as close to death as you are? Or do you try to get away by running down that nearby staircase, hoping against hope that it misses with its last lunge against your vulnerable backside? Most of the time, of course, you choose wrong and/or don’t get lucky, and another character goes to the graveyard. But every once in a while, it works out, your character lives to fight another day, and you shout and dance around the room and rush to tell your friends about it. That dopamine release is what keeps people coming back to roguelikes again and again. Brevik was understandably loath to lose it.

But the slow drip, drip, drip from Blizzard continued, seeping even into Condor’s own ranks. Knowing this, Allen Adham made a suggestion to Brevik in or around May of 1995: Why not ask your own people? Why not take a vote on whether just to try real time? If it doesn’t work, you can always go back to turn-based.

It was too reasonable a suggestion to refuse. Brevik asked for a show of hands among his own people of those interested in exploring real time, and was dismayed to see almost every hand in the room go up. Acceding to the will of the majority, he retreated into his office to have a good-faith go at something he was sure would never fit with the game he wanted to make. The quicker it was demonstrated to everyone that real time wasn’t a practical possibility, he thought, the quicker they could all get back to more productive endeavors. What followed instead was the project’s kairos moment.

I can remember the moment like it was yesterday. I was sitting and I was coding the game, and I had a warrior with a sword, and there was a skeleton on the other side of the screen. I’d been working on this code to make characters move smoothly, doing a whole bunch of testing, and we’d talked about how the controls would work.

We wanted it to be visceral. Click and swing, click and swing. We wanted it to automatically happen: if you clicked on the monster, your character would go over there and swing.

I remember very vividly: I clicked on the monster, the guy walked over, and he smashed this skeleton, and it fell apart onto the ground.

The light from heaven shone through the office down onto the keyboard. I said, “Oh, my God, this is so amazing!” I knew it was not only the right decision, but that Diablo was just going to be massive. It was really the most defining moment of my career, as well as for that genre of gaming.

A new genre was born in that moment, and it was really quite incredible to be the person coding it and creating it. I was just there by myself coding it up. It was pretty incredible.

Diablo may have lost that suspended instant of supreme tension that Brevik had always seen as essential, but it had gained something else, something that would make it a different sort of game entirely. Kelly Johnson, an artist who worked on the game:

In a turn-based game, when you win, you say, “Cool, my plan worked. I took time, I deliberated, I made a plan, and it worked out.” But in a real-time [game], it’s, “Wow! I won!” It’s visceral. You’re in the moment.

Everyone at Condor, including Brevik, was soon marveling that they had ever imagined Diablo being anything other than a real-time game. Millions of players would eventually feel the same way, as the game’s real-time nature became the core of its very identity.

The Diablo team with Diablo himself. We must hope that the keytar is intended ironically.

But before that could happen, Diablo had to be finished. In their excitement over not being rejected yet again, Condor had secured less than half a million dollars in funding from Blizzard, to support a team that numbered a dozen or more. By the beginning of 1996, that money was running out. The founders dipped deep into their personal bank accounts just to cover payroll, and their employees started racing one another to the bank on payday, knowing that the last checks deposited had a tendency to bounce. Meanwhile Blizzard was soaring. That Christmas, they had released Warcraft II, a refinement of its predecessor that blew up massively; it would sell 3 million copies before all was said and done.

The Schaefer brothers and David Brevik were stunned when their publisher came to them and asked whether they would be interested in being acquired; Blizzard was suddenly flush with cash, and the brain trust there was very, very excited about Diablo’s prospects, such that they wanted to have it all for themselves. For the people making Diablo, the unexpected offer was a lifeline materializing out of thin air in front of a drowning man. In March of 1996, Condor became Blizzard North.

It was Blizzard that had pushed the erstwhile Condor to make Diablo run in real time. Now, it would be Blizzard South that drove another core feature into being. The initial pitch document had included “two-player and multiplayer game sessions via modem or network.” Since actual work had begun on the game, however, that aspiration had been all but forgotten. Yet Blizzard South knew how important multiplayer could be for a game in this new era of widespread network connectivity. They knew that multiplayer deathmatches had made DOOM what it was, and they knew that, long after players had finished Warcraft II’s single-player campaign, it was multiplayer that kept them going there as well, turning the game into a veritable institution. They wanted all that for Diablo, so much so that they made their only significant technical intervention into its development, sending programmers up to San Mateo to apply their Warcraft II expertise to Diablo’s multiplayer mode.

For Blizzard had huge plans for multiplayer games in general. Everyone could sense that a large percentage of future gaming would take place between real people on the Internet, that the “LAN parties” of the current age were just a temporary stopgap. Yet gaming over long distances was still technically challenging for the user, even as sessions had to be pre-planned with buddies who had bought the same game you had; spontaneous, pick-up-and-play matches were impossible. Various third-party companies were experimenting with ways to change both of these things, but everything was in a nascent, febrile state. Having money to spend as they did, Blizzard decided to introduce a game hosting and matchmaking service for their customers, under the name (and the Internet URL) of Battle.net. And they decided to offer it to buyers of their games for the low, low price of free, on the logic that the boxed-game sales it would generate would easily pay for its upkeep. It was a revolutionary idea, one that would prove as important to Blizzard’s rise into gaming’s stratosphere as any of their individual titles, iconic as they were. Thanks to Battle.net, you would always be able to find someone to play with, then be in a game with them within seconds. Patches would download automatically when you logged onto the service, a first step toward the always-online mentality that has taken over since. And Diablo was the very first Battle.net-enabled game. If it had achieved nothing else, it would be historically notable for this fact alone.

With Diablo being refined into an ever more effortless, frictionless experience, it was inevitable that another legacy of the roguelikes would fall away. The Southerners told the Northerners that perma-death just wouldn’t fly in the modern commercial market. David Brevik kvetched, but there was no way he was going to win this argument. Even if it hadn’t started out that way, Diablo was evolving into a lean-back rather than a lean-forward sort of game, designed to be more fun than it was demanding. Mistakes would happen in a game like that, and nobody wanted to lose a character he had spent eight hours building because he got distracted by the pizza guy ringing the doorbell. By way of compromise, the Southerners did agree to allow only one save slot, which fit in nicely with the game’s ethic of simplicity anyway. And of course, if anyone really wanted to play Diablo like a roguelike, there was nothing but the temptation of that extant last save file preventing it.

Warcraft II had made Blizzard one of the biggest names in mainstream gaming, on a level with id Software of DOOM and Quake fame and Westwood Studios, the makers of Command & Conquer, Blizzard’s great rival in the real-time-strategy space. Everything Blizzard did was now of interest to obsessive gamers. Diablo was to be their first game that ran under Windows 95 rather than MS-DOS; like Battle.net, this was another outcome of the company’s guiding principle of frictionless ease in all things. In the summer of 1996, Blizzard arranged to have a two-level demo of Diablo included on a Microsoft DirectX sampler disc. Interest in the game exploded. It became easily the most anticipated title of the 1996 holiday season.

That fact makes the next bit that much more remarkable. When the last possible instant to send the game out to be burned onto hundreds of thousands of CDs and shipped to stores all over the country in time for the Christmas buying season arrived, Blizzard took a long, hard look at its current state. It wasn’t in terrible shape, but it still had its fair share of minor niggles here and there. The vast majority of publishers would have said it was good enough and shipped it at this point — after all, they could always patch it later, right? (Wasn’t that one of the points of Battle.net?) But Blizzard decided to wait, resigning themselves to letting Christmas slip by without a major new release from them. It was better, they judged, to make sure Diablo was just exactly perfect when it did ship. More than anything else, it would be this thoroughgoing focus on quality — quality at almost any cost — that would make Blizzard one of the most extraordinary success stories in the entire history of gaming. From the beginning, their tender-aged founders understood something that eluded a bizarre number of their more grizzled peers: that one’s reputation is one’s most precious business asset of all, being laborious to build up and disconcertingly easy to lose. In an industry fueled by short-term hype, they took the long view. “If you truly put the game first,” says Allen Adham, “then decisions like holding a product an extra couple of months, even if it means missing Christmas, become fairly clear.” Gamers came to know that Blizzard would never let them down, and this knowledge fueled the company’s rise. The sacrificing of tens of thousands of sales the following month led to millions and millions of sales over the following decade.

So, Diablo missed the Christmas deadline, but not by much: the first copies wended their way onto store shelves between Christmas and New Year’s, when lots of younger gamers had gift checks from uncles and aunts and grandparents burning holes in their pockets. Others trotted down to their local software store and traded some less desirable Christmas present for Diablo. Retailers fended off the return-season blues by turning Diablo’s release into an event, plastering posters all over their walls and filling their display windows with mannequins of the devil from the game’s cover. All told, it’s questionable whether the belated release really hurt Diablo very much at all, even in the shortest of terms. By spring, it was clear both from the sales reports and from the level of activity on Battle.net that Diablo was the hottest computer game in the world. It was blowing up huge, even by comparison with Warcraft II. Diablo’s sales surpassed 1 million units within months.



Diablo’s eventual impact on the culture and practices of computer gaming was arguably more pronounced than that of any individual title since DOOM. It introduced phrases like “loot drop” into the gamer lexicon; it was the pioneer of a new era of easy online multiplayer gaming, between friends and strangers alike; it single-handedly dragged the entire genre of the CRPG back into public favor. This long shadow can make it oddly difficult to discuss as just a game. When I went back to play it recently for the first time in a quarter of a century — boy, I’m getting old! — I was impressed if not blown away by the experience. And yet, despite my best efforts, I couldn’t quite avoid allowing my opinions to be colored by some of what Diablo has wrought. We’ll get to that in due course. But first, Diablo the game…[1]

When you start a new adventure in the world of Diablo, you first choose your character from three fantasy archetypes: the warrior, who is best at bashing things with his big old sword; the rogue, who fights a little more surgically, preferring the bow and arrow; or the mage, who unlike his counterparts is pretty good with spells from the outset. But you don’t spend any time fussing about with statistics. As soon as you’ve given your character a name, you’re dropped into the hardscrabble village of Tristram, which has had the misfortune to be built over a demon’s not-so-final resting place. In Tristram, you can buy and sell in a few different shops and talk to a handful of villagers, but it’s all kept very short and sweet. Before you know it, you’ll be in the first dungeon, which is found beneath the graveyard of the local church.

You’ll have to fight your way through sixteen dungeon levels in all, divided into four sets of four that open up one after another, presenting ever more powerful monsters for your ever more powerful character to battle. In keeping with the game’s roguelike heritage, each level is procedurally generated. There is a modicum of story, even a cut scene here and there, but nothing you ever need to think too much about. (Although a fairly elaborate backstory does appear in the manual, it too is nothing you need to concern yourself with if you don’t want to. It was tacked on very late in development by Blizzard South, who realized that some gamers at least still liked to see such things.) There are also some pre-scripted quests to carry out, selected randomly from a pool of possibilities each time you start a new game. Most of these are given to you by the townspeople when you talk to them — but, again, all are extremely basic, coming down to “kill this monster” or “collect this object” (which, come to think of it, always involves killing the monster guarding it).
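The random quest pool is simple to picture in code: each new game draws a handful of quests from a larger set, so no two playthroughs offer quite the same lineup. A minimal sketch follows, naming a few of Diablo’s quests from memory; the real game’s selection logic also groups certain quests as mutually exclusive, which this simplification ignores.

```python
import random

# A simplified sketch of per-game quest selection, not Blizzard's actual logic.
QUEST_POOL = [
    "The Butcher", "Poisoned Water Supply", "The Curse of King Leoric",
    "Ogden's Sign", "The Magic Rock", "Arkaine's Valor",
]

def quests_for_new_game(count=3, seed=None):
    rng = random.Random(seed)
    return rng.sample(QUEST_POOL, count)   # no repeats within a single game

print(quests_for_new_game())               # a different subset each new game
```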

In practice, playing Diablo is a very simple loop. You go into the depths and make as much progress as you can against the hordes of enemies that await you there. Then you return topside to sell off the stuff you’ve collected that you don’t need, heal up and buy any potions or other equipment you think you’re going to need, and go downstairs again. Rinse and repeat, until you meet and hopefully kill Diablo himself. Unlike the typical epic CRPG, Diablo is intended to be a game you play over and over again. Thus the average playthrough takes only ten hours or so, as opposed to the hundred or more of its weightier brethren.

Blizzard North’s stated goal was to make Diablo “so easy your mom could play it.” Setting aside the condescension of their choice of words, they certainly achieved their goal in spirit. Fighting monsters is simply a matter of clicking on them, which causes your character to whack them with his melee weapon or fire off an arrow or spell at them. Tactics in the dungeons come down to common sense: whittling away at the edges of large groups of monsters instead of charging right into the middle of them, using doorways and narrow corridors to your advantage, keeping a healthy distance and using ranged attacks if you’re playing a rogue or a mage. That said, it does pay to learn the monsters’ strengths and weaknesses and tailor your attacks to them: skeletons, for example, are more vulnerable to attacks by blunt weapons such as maces than edged weapons such as swords.
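Under the hood, a rule like “skeletons fear maces” usually amounts to nothing more than a small lookup table of damage multipliers. Here is a hedged sketch of that idea; the numbers are invented for illustration, since Diablo’s actual tables were never published in this form.

```python
# A plausible model of monster strengths and weaknesses: a damage multiplier
# keyed on (monster type, weapon class). Multipliers here are invented.
VULNERABILITY = {
    ("skeleton", "blunt"): 1.5,   # maces crush brittle bone
    ("skeleton", "edged"): 0.5,   # swords glance off it
}

def damage(base, monster, weapon_class):
    # Unlisted combinations take normal (1.0x) damage.
    return int(base * VULNERABILITY.get((monster, weapon_class), 1.0))

print(damage(10, "skeleton", "blunt"))   # 15
print(damage(10, "skeleton", "edged"))   # 5
```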

The biggest source of tension is the question of when you should leave off in the dungeon and return to the town for succor. Usually when you die, it’s because you’ve pressed your luck just a bit too much. On the whole, though — and ironically given its line of descent through one of the most infamously unforgiving sub-genres in all of gaming — Diablo is one of the less intrinsically challenging games I’ve played in the course of writing these histories. If you do find yourself feeling under-powered and over-matched — perhaps because you made poor choices about where to allocate the ability points your character is awarded every time she levels up — you can always restart the game whilst retaining your existing character, complete with her current statistics and all of her current kit. Poor character-building choices or a general lack of skill can, in other words, always be compensated for with patient grinding.

Notice the auto-map overlaid onto the standard display…

In lieu of challenge, Diablo thrives on its polished addictiveness. Vanishingly few of its contemporaries can even begin to touch it in terms of intuitive playability. It’s clear that every last detail — every last window, every last hotkey, every last mouse click — was fussed over for hours and hours, until it was just what it ought to be. The auto-map is a thing of wonder that I have to call out for special praise. In CRPGs of the 1990s, such things are usually found in a separate window on the main display that is always too small for comfort and yet takes up too much precious screen real estate — or the auto-map can only be accessed on a separate screen, leaving you constantly flipping back and forth between the two views as you try to get somewhere. Diablo’s auto-map, on the other hand, appears as a transparent overlay right on top of the usual display, toggled on and off by pressing the TAB key. Like everything else here, it’s elegant and perfect, a brilliant stroke that could only have come about through dedicated, dogged iteration. You have to be in awe of the craftsmanship of this game. It knows precisely what it wants to be, and it achieves its best self in every respect.

This statement applies equally to the game’s aesthetics, which are nothing short of masterful; whatever Diablo lacks in set-piece storytelling, it makes up for in atmosphere. If I had to describe that atmosphere in one word, it would be “Gothic.” Diablo captures the side of the Middle Ages that all of those Tolkienesque CRPGs cheerfully ignore in the midst of all their elves and halflings romping merrily through the forest: the all-encompassing religion of Christianity, the almost tangible reality of another life that awaits after this one, which is as much a source of fear as comfort in the minds of the people. Diablo taps into something deep and almost primal in the human psyche, having more in common with The Exorcist than The Lord of the Rings, more in common with Hieronymus Bosch than Boris Vallejo. The shocking ending, which I won’t spoil here, is likewise more horror than fantasy. Diablo is lucky it wasn’t released during the Satanic Panic of the 1980s, given that it sports much of what all those concerned parents were looking for in Dungeons & Dragons and not quite finding.

The lair of the Butcher, one of the gorier locations in Diablo. “Fresh meat!”

Matt Uelmen’s amazingly sophisticated soundtrack, recorded partially on real instruments at a time when many games were still relying entirely on tinny MIDI sound fonts, could easily have played behind a big-budget horror movie. The “Town” theme, featuring the best use of a twelve-string guitar since the heyday of the Byrds, is especially unforgettable; it took me back instantly when I heard it again after 25 years away.


All that said, I won’t go so far as to say that Diablo itself is scary. It seems to me that gameplay that revolves around killing hundreds of monsters is incompatible with true horror. Horror depends on a feeling of powerlessness, whereas Diablo is, like almost all CRPGs, a power fantasy at bottom. Nevertheless, it’s as audiovisually focused and accomplished as any game I’ve ever seen. I say this even as I freely acknowledge that its unrelentingly dark atmosphere tends to wear thin with me pretty quickly. (For me, a bit of light and joy brings out the shadows that much more effectively.)

And sadly, that statement pretty much sums up my response to Diablo as a whole, which is the same today as it was 25 years ago. It does what it does brilliantly. I just wish I liked what it does a bit more. Let me tell you how I got on with it when I played it for this article…

Given its titanic importance, my first plan was to play through it three times, once for each of the character classes. I first bashed my way to the finish line as a warrior. As I did so, I admired all of the qualities described above, but I also found the experience a little hollow; I didn’t dread sitting down with the game on the couch after dinner each evening for an hour or two, but neither did I look forward to it all that much — nor did my wife have to tell me twice that it was time for bed, as she has to when I’m playing some games. I came to regard my Diablo sessions much as I might, say, an old episode of Law & Order: a low-effort something to pass the time, which I could do while chatting intermittently with my wife about completely different things. When I finished the game, I put it on the shelf for several months, intending always to get back to it but never feeling all that excited about doing so. Finally, knowing I had to write this article soon, I forced myself to start a new game as a rogue, hoping that character might be more interesting to play. But this time I found myself actively bored; “been there, done that” was the dominant note. Halfway through, I just couldn’t muster the will to continue. I could admire Diablo for its craftsmanship, but I couldn’t love it.

What am I to make of this? Obviously, I’m in the group of people who just aren’t really in the market for what Diablo is selling — a group who tend to be as vocal in their criticisms as the game’s fans are in their praise. But I’m not eager to join the chest-beating grognards who call Diablo dumbed down, or who shout that it’s not even a real CRPG at all. (Is there anything more tedious than a semantic debate between intractably biased parties?) It’s actually not Diablo’s simplicity that puts me off; I’m much more likely to scold a game for being too complicated than for being too simple. And then too, over the years I’ve been writing these histories, I’ve found many — perhaps most — games from the 1980s and 1990s to be more rather than less difficult than I really need them to be, so it’s not precisely the lack of challenge that bothers me about Diablo either. Too easy is far, far better in my book than too hard.

On the other hand, I do tend to prefer human-crafted to procedurally-generated content in general, and Diablo doesn’t do anything to disabuse me of that notion. Its randomized nature means that its dungeons can only be a collection of rooms, corridors, and monsters, without the guileful tricks and traps and drama of the best dungeon crawlers of yore. Beyond that, and beyond an aesthetic presentation that isn’t quite to my taste, I think my lack of receptivity to Diablo is to do with the passivity of the experience. I’ve seen it described as a good “hangover game,” what with how little it actually asks of you. Even more tellingly, I’ve seen it called the gaming equivalent of candy: you can eat an awful lot of it without thinking much about it, but it doesn’t leave you feeling all that great afterward.

One nice thing about getting older is that you learn what makes you feel good and bad. I’ve long since learned, for instance, that I’m happiest if I don’t play games for more than a couple of hours per day, even on those rare occasions when I have time for more. But I want those hours to have substance — to yield fun stories to tell, interesting decisions to remember, strategies or puzzle solutions to muse about while I’m cooking dinner or working out or taking a walk, accomplishments to feel good about. For me, Diablo is peculiarly flat; I went, I saw, I clicked on monsters. For me, it feels less like a time waster than a waste of time. I almost find myself wishing the game wasn’t so superbly polished in every particular, just to relieve the monotony.

More substantively, I do see one aspect of Diablo as vaguely ominous in the larger context of gaming history: the way it uses stuff to do the heavy lifting of player motivation. As I mentioned above, “loot drops” became a thing in gaming with this game. Although CRPGs had been tempting and teasing players with the prospect of a new magic sword or armor as long as they had existed, Diablo put that temptation front and center, making it the main driver of its gameplay loop. In doing so, David Brevik and company consciously tapped into something besides the allure of the Gothic that is primal in human psychology. They liked to use the analogy of a slot machine: you clicked endlessly on monsters in the hope that eventually something really good would drop out of one of them. When I hear these anecdotes, I can’t help but think of the glassy-eyed zombies to be found in casinos from Shreveport to Macau, pulling the handles of the one-armed bandits again and again for hours, likewise waiting for something good to drop into their laps. Pat Wyatt, Blizzard’s vice president of research and development at the time of Diablo‘s creation, proffers an even more disturbing metaphor: “Positive reinforcement is one of the hardest types of conditioning to break, which is why pets beg at the table: rewards may not happen very often, but every once in a while you get a scrap, so they keep begging.” In the decades after Diablo, this Pavlovian loop would be exploited mercilessly by cynical game makers, trapping players in unsatisfying cycles of addiction that drained their time and their wallet, leaving them with nothing but a few virtual trinkets to their names in a virtual world that would be gone in a year or two anyway.
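The slot-machine analogy translates almost directly into code as a weighted rarity table: most “pulls” pay out little or nothing, and the rare jackpot is what keeps the player clicking. The tiers and weights below are invented for illustration, not Diablo’s actual drop rates.

```python
import random
from collections import Counter

# An invented loot table in the variable-ratio-reinforcement mold: junk is
# common, the jackpot vanishingly rare.
LOOT_TABLE = [
    ("nothing",      60),
    ("gold",         25),
    ("common item",  12),
    ("magic item",    2.7),
    ("unique item",   0.3),   # the rare reward that drives the loop
]

def roll_drop(rng=random):
    items, weights = zip(*LOOT_TABLE)
    return rng.choices(items, weights=weights, k=1)[0]

# Ten thousand monster kills, tallied: almost all chaff, a handful of jackpots.
print(Counter(roll_drop() for _ in range(10_000)))
```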

In the late 1990s, the dangerous addictiveness of loot drops was most in evidence in multiplayer Diablo, as played on Battle.net, which in its early years was a fascinating if ofttimes toxic social laboratory in its own right. I do have more to say about it, but I think I’ll reserve it for a future article which will look at this formative period of online gaming in a more holistic way.

Instead, let me say in conclusion today what I often say when I end a review on a downer note: that no game is for everyone, and no way of having fun is wrong, as long as you aren’t hurting anyone else or yourself. If you love Diablo, you’re in good company. It’s a fine, fine game by any objective measure. Whatever cynicism it might have inspired is on the conscience of the folks who displayed it; this game was made for all the right reasons. It’s a triumph of care and dedication from which many another studio could learn, then and now. Just be sure to remember that there’s a beautiful world out there with plenty of cloudless blue skies to contrast with Diablo’s perpetually sooty ones, and you’ll be just fine. Click away, my friends, click away!






(Sources: As was the case with my last article, I’m hugely indebted to David L. Craddock for Stay Awhile and Listen Book I and Book II, which I plundered for quotes with all the enthusiasm of a Diablo loot hunter. By all means, check out these books if you’re interested in learning more about the Blizzard story.

Magazine sources include Computer Gaming World of August 1996, December 1996, March 1997, April 1997, and May 1997; Retro Gamer 43 and 103. Online sources include Lee Hutchinson’s interview with David Brevik for Ars Technica, the Dev Game Club interview with Brevik, and Brevik’s Diablo post-mortem at the 2016 Game Developers Conference.

Diablo and its controversial expansion Hellfire are available as a single digital purchase at GOG.com.)

Footnotes

1 The commentary in this article deals only with the original Diablo. An expansion pack to the game called Hellfire, created out-of-house by the Sierra subsidiary Synergistic Software, was released in late 1997. The relationship between Blizzard North and Synergistic was plagued with discord from first to last, and David Brevik and many of his colleagues have since disowned many elements of Hellfire as fatal dilutions of their vision. So, we’ll honor Blizzard North’s original intentions here and stick to the base game.


Going Rogue

When a beleaguered Netscape announced in January of 1998 that it would release the source code to its browser for everyone to tinker with and improve upon, the news shook the worlds of technology and business to their foundations. This open-source “revolution,” as even many in the mainstream press took to calling it, had sprung up seemingly out of nowhere to challenge the conventional wisdom and perhaps the very livelihood of traditional tech giants like Microsoft. For the next several years, you couldn’t open a trade journal or a newspaper’s business section without seeing some mention of the open-source movement and its leading exemplar, the robust and yet totally free — in all senses of the word — operating system Linux. Linux and other software like it were, an eye-opening number of people said, destined to destroy Microsoft’s vaunted Windows monopoly any day now.

The movement’s Little Red Book came in the form of Eric S. Raymond’s 1997 essay “The Cathedral and the Bazaar.” Originally presented as a comparison of a top-down versus a bottom-up methodology in the context of open-source projects, the central metaphor quickly got blurred in the minds of the public into a broader comparison of closed source versus open source, with Raymond’s tacit acquiescence. In this telling, the cathedral was Microsoft’s software-development model, in which a closeted priesthood bestowed programs upon a grateful populace on its own terms and on its own schedule. The bazaar was the hacker way, in which the people came together in a spirit of delightfully chaotic egalitarianism to make software for themselves, sharing their source code in the name of the greater good. “No closed-source developer can match the pool of talent the Linux community can bring to bear on a problem,” wrote Raymond. “The closed-source world cannot win an evolutionary arms race with open-source communities that can put orders of magnitude more skilled time into a problem.” Thanks to Linux and the other open-source tools it enabled, he predicted elsewhere, Microsoft’s eagerly anticipated Windows 2000, the latest incarnation of its server-grade NT operating system, would “be either cancelled or dead on arrival. Either way, it will turn into a horrendous train wreck, the worst strategic disaster in Microsoft’s history.”

Alas, Raymond proved a less effective prophet than pundit. Not only was Windows 2000 not a failure upon its eventual release, but it evolved in 2001 into the consumer-grade Windows XP, by many standards the most successful single version of Windows in history.

Like that of all revolutions that have passed their heyday of strident ideology, the most extreme rhetoric of the late 1990s open-source movement can seem overheated if not downright silly today, the blinkered product of a tiny stratum of metaphorical inside cats who have concluded, rather conveniently for themselves, that the most important social-justice campaign of their age is one that can be waged from behind their keyboards and monitors, just the place where they happen to feel most comfortable. As for the ideas they introduced into the public discourse: they were real, valid, and in many ways incredibly valuable, but in the end they would be woven into the fabric of existing corporate-software production practices rather than burning down the old ways wholesale.

For rigid ideology seldom makes a good fit with the real world; pragmatically mixed national economies, for example, succeed vastly better than dogmatically capitalist or communist ones. Similarly, instead of continuing to sort itself into two opposing camps at eternal loggerheads, the modern software ecosystem has learned to take the best from both sides to wind up with a sort of mixed economy of its own. The cleverest actors have learned to combine the cathedral and the bazaar in ways that maximize the strengths of each: Google builds its proprietary Web browser Chrome atop an open-source engine known as Chromium; Apple constructed the OS X desktop on the solid foundation of an open-source operating system known as Darwin; Android mobile phones and tablets have Linux at their core. Even Microsoft now embeds an optional “Linux subsystem” into Windows, as the cats lie down with the dogs.

The reasons for open source’s failure to more comprehensively conquer the world aren’t that hard to divine; they’re actually front and center in some of the movement’s founding principles. The editors of the grandiosely titled 1999 anthology Open Sources: Voices from the Revolution — one of those books whose very name clues you into the window of time in which it was published — wrote that “most open-source projects began with frustration: looking for a tool to do a job and finding none, or finding one that was broken or poorly maintained. Eric Raymond began fetchmail this way; Larry Wall began Perl this way; Linus Torvalds began Linux this way.” The latter two of these projects at least have remained among the most essential of the workhorses that make the Internet function, strong arguments for the superiority of the open-source model for developing some types of software.

But it appears that the same is not true for all types of software. A model in which programmers create only the programs that they most want to have threatens to yield a universe of software which is interesting and attractive only to programmers. Even Eric Raymond had to acknowledge that the production of software with mass appeal is only partially a “technical problem.”

It’s [also] a problem in ergonomic design and interface psychology, and hackers have historically been poor at it. That is, while hackers can be very good at designing interfaces for other hackers, they tend to be poor at modeling the thought processes of the other 95 percent of the population well enough to write interfaces that J. Random End-User and his Aunt Tillie will pay to buy. Computers are tools for human beings. Ultimately, therefore, the challenges of designing hardware and software must come back to designing for human beings — all human beings.

Open source has never entirely made this leap. It’s for this reason that its biggest success stories have come in the realm of back-end software rather than user-facing applications. Witness the long, frustrating history of “Linux on the desktop,” which, in an echo of the old hacker joke about strong artificial intelligence, has been perpetually just a few years away from world domination ever since the late 1990s. There is no theoretical bar to visual designers and experts in ergonomic psychology joining open-source projects, and in some times and places this has even happened. And yet the broad field of open source is still dominated by programmers writing software for themselves and for one another.

Game development joins graphical user interfaces as another notable area where the bazaar model doesn’t quite seem to do the trick. The open-source methodology excels at solving purely technical problems, but the making of a great game is a technical problem only in part — usually, not even the most important part. Consider the case of one of the most critically lauded games of the late 1990s, Valve’s Half-Life. It was a triumph of design and aesthetics, not of technology; its engine was borrowed from id Software’s two-and-a-half-year-old Quake, a technological showstopper in its day which has aged far less gracefully. It would seem that the best way — or perhaps the only way — to create a great game from whole cloth is through a priesthood with a strong and distinctive design and aesthetic vision.

Those open-source games which have become relatively popular have tended to build upon previous game designers’ visions in much the same way that Chrome is built on Chromium: think FreeCiv or Open Transport Tycoon Deluxe, worthy projects that are nevertheless more interested in making workmanlike technical improvements to their inspirations than bold fundamental leaps in design. The open-source movement has had the most pronounced impact on gaming in the form of tools, both for making games and for playing them. I could never have embarked with you on this journey through history that we’ve been on for over a decade now without the likes of DOSBox, ScummVM, UAE, VICE, and many, many other open-source emulators and utilities of all descriptions. I am deeply grateful to the many talented programmers who have given their time to them in order to keep our digital past accessible. Still, they do remain purely technical projects, not creative ones in the sense of the games which they enable to run on modern hardware.

The one ghetto of gaming where open-source projects have been able to forge a strong design and aesthetic sensibility all their own — a sensibility with no obvious antecedents in commercial, closed-source games — turns out upon examination to be not quite the anomaly it might first appear. The “roguelike” sub-genre of the CRPG dates all the way back to 1980, well before the modern open-source movement came to be. But, like that movement, it was a product of an institutional-computing hacker culture that had been around since the 1950s, in which proprietary software was regarded as not so much immoral as simply unheard of. It stands today as a fine example of open source at its best — and equally of what it does less well. Call it the exception that proves the rule.



In Hackers: Heroes of the Computer Revolution, his classic chronicle of the first few decades of institutional hackerdom, Steven Levy writes about the appeal that Adventure, a game that would lend its name to an entire genre, held for the first people to play it on the big multi-user DEC computers of the late 1970s.

In a sense, Adventure was a metaphor for computer programming itself — the deep recesses you explored in the Adventure world were akin to the basic, most obscure levels of the machine that you’d be traveling in when you hacked assembly code. You could get dizzy trying to remember where you were in both activities. Indeed, Adventure proved as addicting as programming…

Rogue, a game which would lend its name to a sub-genre that had even more appeal to the programming mindset, was itself a direct outgrowth of Adventure, with a couple of key elements added to the mix.

Michael Toy and Glenn Wichman were undergraduates at the University of California, Santa Cruz when they first encountered Adventure. Like so many others, they were absolutely entranced. The only drawback was that, once they finally beat the game for the first time, there wasn’t much more to be done; the puzzles were always the same, meaning that beating it again became a rote exercise. And there weren’t yet any other games like it. So, the pair started to talk about creating a game of their own, one that would play a little bit differently. What if, rather than building their game around a collection of pre-crafted set-piece puzzles, they made one that would offer up a new world to the player every single time through the magic of random procedural generation? That way, you could keep playing it forever, even after beating it once or twice or a dozen times. Even Toy and Wichman themselves would be able to have fun with it, given that they too would never know what sort of world they would be entering next.

But what exactly might such a game look like in practice? It wasn’t at all clear; the problem of describing a procedurally generated world in English prose like that used by Adventure was effectively insoluble in the context of the time. Then Toy stumbled upon a new programming library for the Unix operating system (the predecessor to and inspiration of Linux). The brainchild of a University of California, Berkeley student named Ken Arnold, “curses” let you arrange text however you wanted on a terminal screen, changing the contents of any of the 1920 cells that made up a typical 80-character by 24-line display at any time; this made it possible to reserve different regions of the display for different sorts of information. Earlier games which hadn’t had access to curses, such as Adventure, had had to content themselves with teletype-like interactions: a continuous scrolling stream of text which, once fired at the screen, could only be forgotten. But curses changed all that at a stroke. You could use it to put up menus, maps, charts, and just about anything else you could write or draw using the ASCII character set, updating them all independently of one another.
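To make concrete what cell-addressed drawing buys you, here is a tiny sketch using Python’s curses module, a latter-day wrapper over the same lineage of library (part of Python’s standard library on Unix-like systems). The layout itself is my own invention; the point is only that different regions of the screen can be drawn and updated independently, teletype-free.

```python
import curses

def main(stdscr):
    curses.curs_set(0)                              # hide the cursor
    stdscr.addstr(0, 0, "Level: 1   HP: 12/12   Gold: 0")   # a status region
    stdscr.addstr(5, 10, "----------")              # a room, drawn anywhere
    stdscr.addstr(6, 10, "|........|")
    stdscr.addstr(7, 10, "|...@....|")              # the player's at-sign
    stdscr.addstr(8, 10, "----------")
    stdscr.refresh()                                # push the updates out
    stdscr.getkey()                                 # wait for a key, then exit

curses.wrapper(main)                                # handles setup/teardown
```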

It gave Toy and Wichman a viable path forward with their fondly imagined infinitely replayable game. For, while textual descriptions of a procedurally generated world were a nonstarter, showing a symbolic, visual representation of one using curses was another matter.

Avid players of tabletop Dungeons & Dragons, Toy and Wichman tried to recreate on the computer the dungeon-delving expeditions they enjoyed with their friends, exploring a network of rooms and tunnels filled with monsters to fight, traps and other hindrances to defuse, and treasures to collect. Whereas the main dish of Adventure had been set-piece puzzles, with only a side dish of dynamic logistical challenges — an expiring light source, an inventory limit, a pesky wandering thief with a sharp sword — the nature of their game meant that it would have to be all logistics. In making this switch, they half-accidentally invented not just the first roguelike but one of the first CRPGs, full stop. We cannot give them complete credit for that genre, mind you: other proto-CRPGs were being created at the same time on the PLATO system at the University of Illinois and on the earliest home microcomputers as well, as other Dungeons & Dragons fanatics also tried to bring the tabletop experience to the computer. Still, by all indications Toy and Wichman made the leap without knowing what anyone else was up to.

It was Wichman who came up with the name of Rogue:

I think the name just came to me. Names needed to be short because you invoked a program by typing its name in a command line. I liked the idea of a rogue. We were coming from a Dungeons & Dragons background, but we were creating a single-player game. You weren’t going down into the dungeon with a party. The idea was that this is a person going off on his or her own. It captured the theme very succinctly.

To depict their world, Toy and Wichman invented the iconography (textography?) that has remained the standard for roguelikes to this day. The walls of rooms were made from horizontal and vertical dashes (“-” and “|”), the tunnels between them from hash marks (“#”), doors from plus signs (“+”), treasure from dollar signs (“$”), monsters of various types from any and all letters and symbols that weren’t already being used for something else. The focus of it all was your titular rogue, depicted as a forlorn little at-sign (“@”) adrift in this sea of promise and danger. The textual austerity of it all could become weirdly atmospheric. “You’d see a letter ‘T’ on the screen and it would startle you, because you knew it was a troll,” says Wichman.

Rogue

The goal of the game was to find a MacGuffin called the Amulet of Yendor, hidden 25 dungeon levels or so deep, and return it to the surface. Doing so would require fighting ever more dangerous monsters, building up your character as you did so in classic RPG fashion, both through the experience points you gained from killing them and the equipment you collected. From the first, Rogue was intended to be hard — hard enough to challenge the very people who had made it. This is another quality that has remained a core value of the sub-genre which Rogue invented.

You didn’t know what the stuff you found actually did. Would that yellow potion restore your health, or would it kill you instantly? The safest way to know for sure was to use an “identification” scroll on your new finds, but such things were rare and precious, and ironically had to be themselves identified first. In a pinch, you might just have to try on that new ring or armor and see what happened, praying as you did so that it wasn’t cursed.
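The unidentified-item mechanic boils down to a shuffle performed once per game: appearances are bound to effects at random, and the binding holds until you learn it, one way or another. A minimal sketch follows, with generic stand-in names rather than Rogue’s actual item lists.

```python
import random

# Generic stand-ins, not Rogue's real item lists.
APPEARANCES = ["yellow", "smoky", "bubbly", "black", "viscous"]
EFFECTS = ["healing", "poison", "strength", "blindness", "levitation"]

def new_game(seed=None):
    rng = random.Random(seed)
    shuffled = EFFECTS[:]
    rng.shuffle(shuffled)                 # a fresh binding for every run
    return dict(zip(APPEARANCES, shuffled)), set()

def quaff(appearance, mapping, identified):
    effect = mapping[appearance]
    identified.add(appearance)            # now you know -- for this run only
    return effect

mapping, known = new_game()
print("You drink the yellow potion...", quaff("yellow", mapping, known))
```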

Food was the most essential resource of all; while you could eat the corpses of many monsters, some of them would make you sick and some of them would get their posthumous revenge by outright killing you. (Roguelikes are a bit like the old saw about the Australian Outback: everything in them seems to be able to kill you.) The only way to have a chance of winning was to play the game over and over again, slowly ferreting out its secrets and devising optimal strategies in the course of dying again and again and again. Even once you got really good, the difference between success and failure could still come down to sheer dumb luck, as “CRPG Addict” Chet Bolingbroke noted in his articles about the game: “Sometimes you might find a two-handed sword +1 on the first level; other times, you’ll find three poison potions and a cursed dagger.” Rogue’s own co-creator Glenn Wichman admits that he has never legitimately won it.

Rogue, in other words, flagrantly violated almost all of the modern rules of progressive game design: it was unfair in countless ways and about as unwelcoming to newcomers as a game can be. It was a comedian telling jokes at the poor player’s expense, its later levels stocked with rust monsters that instantly destroyed her hard-won magical armor (until she learned to take it off before fighting them) and rattlesnakes that poisoned her (until she learned that the only practical way to combat them was to chuck whatever junk was to hand at them from a distance). And death was an irrevocable state. Although you could save a game of Rogue and come back to it later, this was intended only for the purpose of resuming an interrupted session: the save file was deleted as soon as you restored it. There were no second chances in Rogue; a single ill-considered move, or a single errant key press, or just a simple stroke of random bad luck, could and usually did erase hours of careful, steady progress.
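That save-for-suspension-only scheme is easy to render in miniature. Here is a sketch of the semantics, not Rogue’s actual implementation; the file name and state format are invented for illustration.

```python
import os
import pickle

SAVE_PATH = "rogue.sav"   # hypothetical save-file path

def save_game(state):
    with open(SAVE_PATH, "wb") as f:
        pickle.dump(state, f)             # suspend the session to disk

def restore_game():
    with open(SAVE_PATH, "rb") as f:
        state = pickle.load(f)
    os.remove(SAVE_PATH)                  # one chance only: no save-scumming
    return state

save_game({"level": 7, "hp": 23})
print(restore_game())                     # works exactly once...
# calling restore_game() again raises FileNotFoundError -- the save is gone
```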

And yet people found it strangely compelling. This was doubtless partially down to the times; there weren’t a lot of games available to play, which meant that the amount of time and energy required to get good at this one could seem more like an advantage than a disadvantage. But there was also more to it than that, as is indicated by the survival of the roguelike sub-genre right down to the present day, with all of its legendary difficulty intact. Rogue seemed to scratch a different itch than most games, a rash from which hackers seemed particularly prone to suffer. Very few successfully retrieved the Amulet of Yendor, but that only made the prospect of doing so that much more tempting. In the hyper-competitive culture of hackerdom, beating Rogue became a badge of honor almost on a par with writing some super-useful, super-elegant program that made everyone else jealous.

None of this happened instantly. Like most games on the big institutional computers, Rogue was a work in progress for years after the first version of it went up at UC Santa Cruz, probably in 1980. In 1982, Michael Toy got kicked out of the university for spending too much time tinkering with Rogue and not enough keeping up with his classwork. He took a job in UC Berkeley’s computer lab instead, splintering the partnership that had taken Rogue this far. Wichman now dropped off the scene, to be replaced at Berkeley by, of all people, Ken Arnold, the very hacker whose curses library had inspired the initial creation of Rogue. Toy and Arnold continued to expand and refine the game until they left Berkeley in 1984.

It was during this period that Rogue got really popular, spreading far and wide with the Unix operating system on which it ran, by now the overwhelming hacker favorite. Rogue became a touchstone of hacker culture almost on the same order, being played obsessively everywhere from Bell Labs to the Nevada Test Site. The game’s creators were thrilled when they learned that both Ken Thompson and Dennis Ritchie — living gods among hackers, the creators of Unix itself — were major fans of the game; Ritchie jokingly called it the biggest single waster of CPU cycles in computing history. When Toy attempted to commercialize Rogue in 1984 by releasing an MS-DOS port through the publisher Epyx, he felt justified in advertising it as “the most popular game running on Unix” and “the most popular game on college campuses.”

By the time Rogue hit microcomputers, its partial inspiration Adventure had spawned its own thriving corner of the home-computer-games market, where companies like Infocom sold hundreds of thousands of slickly packaged parser-driven text adventures. But home users proved markedly less receptive to Rogue after its belated arrival. Even after Wichman came back on the scene to help Toy make prettier, semi-graphical versions of the game for the Apple Macintosh, Atari ST, and Commodore Amiga, Rogue didn’t sell all that many copies. Wichman could only conclude that the audience that had made it such a hit on the big computers “wasn’t the audience that was looking for games in software stores.” It was a fair assessment: roguelikes would remain staples of hacker culture, but would never make inroads into the flashier commercial-games market.

Epyx’s Rogue was one of the last artifacts of that company’s original, cerebral “Automated Simulations” identity, appearing the same year that Summer Games and Impossible Mission cemented its new image as a purveyor of slick, audiovisually polished, action-oriented titles. Small wonder that Rogue seemed to get lost in the marketing shuffle.

The Amiga Rogue was a graphical affair, but that didn’t do much for its sales.

In this as in so many other respects, Rogue laid down the template for all of the roguelikes to come as thoroughly as Adventure did for its progeny. But there was one important exception, albeit one external to the game itself: Toy, Wichman, and Arnold didn’t release their source code to the public, clinging to the role of the high priests of a cathedral rather than embracing the bazaar model of software development. “In retrospect, it would have been better to share,” admits Arnold. Yet it isn’t that surprising that they didn’t. Open source had yet to become an ideological movement, even among the hardcore hacker contingent to which Rogue’s fathers belonged. And they did, after all, have hopes of commercializing the game, even if those hopes never fully came to fruition.

As it was, the lack of source code meant that those who dreamed of building a better Rogue had no choice but to start from scratch. Among the first to do so was a group of boys who hung out together in the computer lab at Lincoln-Sudbury High School in Sudbury, Massachusetts, at the dawn of the 1980s. The school’s single modest DEC PDP-11 minicomputer wasn’t wired to the Internet, but the gang nevertheless encountered Rogue early in its history: in the summer of 1981, when their mentor, a young teacher named Brian Harvey, finagled an invitation for them to go out to Berkeley for a few weeks, to see what life was like in the big leagues of institutional computing. One of the kids who went was named Jay Fenlason. He fell in love with Rogue at first sight, managing to play it for about eight hours by his own estimate during the visit. He returned to Massachusetts determined to make a game just like it. He corralled his buddies into an unlikely game-development team, and over the course of the next year they made Hack, working strictly from their memories of the game they had seen at Berkeley.

That initial version of Hack has been lost, leaving behind only scattered anecdotes. However, all indications are that it was no remarkable advance over Rogue in itself. What made it important — indeed, what changed everything for the nascent roguelike sub-genre — was the decision Fenlason and his friends made to give away not just their executable but their source code as well.

To celebrate their graduation in 1982, the computer-lab gang packaged up the source code to all of the programs they had written, Hack among them, and sent it to an organization called USENIX, a computing-research nonprofit that maintained a file archive for its members. The source bore a simple notice at the top, saying that anyone who wished to was free to make improvements to the software and distribute them, as long as due credit was given to the original creators and the updated source was shared in turn. Having done that, the youngsters who had made Hack went their separate ways, having no idea what the game they had loosed upon the world would someday grow into.

At first, their lack of expectations seemed more than justified; while Rogue went everywhere in hackerdom, Hack went nowhere. Then, in early 1984, a thirty-something Dutch mathematician and programmer named Andries Brouwer, who worked at the Amsterdam research center Mathematisch Centrum, chanced to trawl through USENIX’s file archive, looking for interesting software. Just as Don Woods had rescued Will Crowther’s incomplete game of Adventure from oblivion back in 1977, Brouwer now stumbled across Hack and did it the same service. He tightened up the code and the gameplay, and then started adding new features, which he tested on his colleagues at Mathematisch Centrum, most of whom became certifiable Hack addicts. Beginning on December 17, 1984, he uploaded each new version to the Internet as well.

Brouwer added the concept of character classes to the game, introducing six of them; no rogue was to be found among them, but they did include the likes of a tourist and an archeologist, evidence of a quirky sense of humor that would continue to mark the game forevermore. He added shops in the dungeon for buying and selling equipment, and made the dungeon deeper; it now went down 40 levels, the last ten a special region called Hell that demanded magical protection from fire and a teleport spell to even enter. No longer did you find the Amulet of Yendor just lying around somewhere down there in the depths; now you had to defeat a Wizard of Yendor to get your mitts on it. To these big enhancements he added a wealth of smaller details that were likewise destined to remain indelible parts of the game, such as a dog or cat companion to accompany you on your expedition and the ability to write messages on the floor for various purposes.

For years, players of Rogue had been sharing their tips and travails on the Usenet group net.games.rogue. It was here that Brouwer now announced his new roguelike. The community there pounced upon Hack, which, if not clearly better than Rogue, did have the virtue of being subtly different from a game which most of them had already played to death. The volume of Hack-related traffic grew so extreme that, just one month after Brouwer had uploaded his game for the first time, the group net.games.hack came into being to accommodate it. “Please stop posting articles about Hack to net.games.rogue and use this new group instead,” wrote a Usenet administrator pointedly.

Brouwer kept his fire hose of additions and improvements flowing until July of 1985, when he pronounced himself satisfied with the game and moved on to other things. But, thanks to the fact that he had honored the wishes of Jay Fenlason and company and publicly released his source code, Hack could continue to morph and grow after his departure in a way that Rogue had not been able to after Michael Toy and Ken Arnold left Berkeley. Ports and modified versions were soon popping up everywhere. It was exciting in a way, but it became a bit too much like the babble of a bazaar. Three hackers, by the names of Mike Stephenson, Izchak Miller, and Janet Walz, decided that a little bureaucracy wouldn’t be amiss. They set out to create a curated version of the game, incorporating changes from anyone who wished to contribute to the project, as long as those changes were well-coded, worthwhile, and not game-breaking. Because their home base was net.games.hack, they named their version of the game NetHack. Its first official release came in July of 1987; its most recent one as of this writing came out in February of 2023. I suspect that there will be many, many more before NetHack’s full history can be written.

NetHack is an answer for every player of traditional adventure games who has ever asked why she can’t just bash a door open instead of searching hither and yon for the key.

The semi-anonymous wizards behind the NetHack curtain are known simply as the DevTeam. For 36 years, this rotating cast of characters has maintained and added to the game, making it one of the most systemically complex games ever created, if not the most complex of all, even as it retains in its canonical version an entirely textual display focused around a little wandering at-sign. Experienced players delight in ferreting out the emergent possibilities provided by the sheer depth of NetHack’s systems. “The DevTeam thinks of everything,” goes a saying among players.

To wit: use a pair of gloves to pick up a dead cockatrice, a creature which turns any living thing it touches to stone, then bash your enemies with it to turn them to stone. (This technique is known among the NetHack cognoscenti as “wielding the rubber chicken.”) Of course, you’ll need to use a pick-axe afterward to separate the statues of your enemies that are left behind from the loot they were carrying…

Or combine a Wand of Polymorph with a Ring of Polymorph Control to eliminate the middleman, as it were, turning yourself into a cockatrice. You can lay eggs in this form, which you can pick up and carry around once you revert to your natural form, throwing them at your enemies like grenades while you gleefully sing “Rainy Day Women #12 & 35.”

The possibilities are endless. NetHack even keeps track of the phases of the moon in the real world and uses them to influence your luck; this leads to devotees clearing their calendars once per month in order to maximize their chances when the moon is full.
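
For the curious, a check like that is easy to approximate. The sketch below is a back-of-the-envelope illustration, not the DevTeam’s actual code: it estimates the moon’s phase from the days elapsed since a known new moon and grants a one-point luck bonus near the full moon.

```c
#include <stdio.h>
#include <time.h>

/* A rough approximation written for illustration, not NetHack's actual
   code: estimate the moon's phase as days elapsed since a known new
   moon, modulo the ~29.53-day synodic month, and grant a small luck
   bonus near the full moon. */
static int moon_luck_bonus(time_t now) {
    const double synodic = 29.530588;      /* days per lunar cycle */
    const time_t new_moon = 947182440;     /* 2000-01-06 18:14 UTC, a new moon */
    double days = difftime(now, new_moon) / 86400.0;
    double phase = days - synodic * (long)(days / synodic);
    if (phase < 0)
        phase += synodic;
    /* The full moon falls near the middle of the cycle, ~14.77 days in. */
    return (phase > 13.77 && phase < 15.77) ? 1 : 0;
}

int main(void) {
    printf("tonight's luck bonus: %+d\n", moon_luck_bonus(time(NULL)));
    return 0;
}
```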

NetHack has become an institution of old-school hacker culture and an icon of the open-source movement. None other than Eric Raymond was the first to create an optional graphical skin for the game (a move that prompted considerable controversy). And well before he wrote The Cathedral and the Bazaar, he wrote the first manual for NetHack. Small wonder that it joined Rogue and Adventure as one of the very few games memorialized in 1996’s New Hacker’s Dictionary — edited by, you guessed it, Eric S. Raymond. DevTeam founding member Mike Stephenson has no doubts about NetHack’s importance, not only as a standalone game but as a model for software development: “We predated open source [as a movement], but I do think we helped to promote the idea of making software available for public use without cost. I think the other thing that really contributed to the concept of open source is that NetHack has, and still does, accept bug reports and feature ideas from anyone.”

NetHack became the standard bearer of the roguelike sub-genre almost from the moment of its first release, and has never had its status in this regard seriously challenged. That said, hundreds of other roguelikes were made after it, and some even before it. The most important among them are arguably Moria and Angband. The former reached a complete form as early as 1983, when it became the first game of this type to offer an above-ground town to serve as a base for your dungeon expeditions; this gave it a significantly different feel, more like an ongoing campaign than a single adventure module, to put things in the terms of Dungeons & Dragons. Moria directly inspired 1990’s Angband, a much more complex implementation of the same approach, which, like NetHack, is still in active development today. Some players prefer NetHack’s relentlessly escalating challenge, others Angband’s somewhat more relaxed pacing and more free-form structure — but make no mistake, Angband too will kill you in a heartbeat if you let your guard down. And in it as well, dead is dead, permanently.

This roguelike “family tree” shows how the most historically and currently popular games in the sub-genre relate to one another.

This brings us back around to a statement I made at the outset: that roguelikes are the exception that proves the rule of open-source game development — and just possibly of open-source software development in general. The cast of thousands who contribute to them do so in order to make exactly the games that they want to play, which in the abstract is the best of all possible reasons to make a game. The experience they end up with is, unsurprisingly, much like high-wire programming at its most advanced, presenting players with an immense, multi-faceted system to be explored and mastered. And there is absolutely nothing wrong with this.

Still, it does seem to me that roguelikes tend to bring out some of the worst as well as the best of the hacker ethic, what with their insistence that they’re only for the “hardcore” and their lack of empathy for the newcomer. Few things in this world are less attractive than a nerd beating his chest. Robert Koeneke, the creator of Moria, admits that while he was working on it, “if anyone managed to win, I immediately found out how, and ‘enhanced’ the game to make it harder.” Likewise, for every cool interaction to be discovered in NetHack, there’s a cheap, heartless death in store, like stumbling down a staircase whilst carrying a cockatrice and turning yourself to stone, or missing a stirrup whilst trying to mount a horse and breaking your neck, or incinerating yourself by firing off your Wand of Lightning too close to a wall, or getting killed by your own pet dog when you attempt to use your Ring of Conflict to get that nearby band of orcs fighting one another. NetHack is the sort of game that likes to give you a fake Amulet of Yendor, then laugh at you when you scurry all the way back to the surface with it and think you’re about to win.

As with so much in life, one’s relationship to roguelikes comes down to questions of priorities. As someone who likes to play a variety of games, I’ve never done more than dabble in them. For the time required to get even minimally competent at them is more than I’m willing to invest in any single game — or than I can invest, if I want to keep doing what I do on this site.

Meanwhile, the amount of time and effort required to get good at a game like NetHack is staggering, even if you’re far smarter and more diligent than I am. It took Chet Bolingbroke 262 hours of trying before he won NetHack for the first time — and that was playing in a fashion that many purists would consider illegitimate, by looking up spoilers on the game’s many interconnected components rather than learning strictly through experience, not to mention playing an old version that is much less complex than the current ones. Was it worth the time investment? He has his doubts. “Permadeath just sucks,” he concludes. Even Eric Raymond feels today that NetHack may have gone too far: “There was a natural tendency for the devs to see the game from the point of view of someone who played it constantly and obsessively. Thus, over time, their notion of not making it ‘too easy’ gradually ratcheted up the difficulty level to the point where you really couldn’t enjoy it casually anymore.” NetHack displays, in other words, open-source software’s usual Achilles heel: its developers’ inability to put themselves in the shoes of people who aren’t just like them.

Then again, it isn’t as if this represents some deep moral failing; there’s nothing wrong with being niche. Many or most lovers of NetHack and other roguelikes have never won them and quite probably never will, finding satisfaction merely in the trying, in hoping to get a little further than last time and walk away with some entertaining stories to share. Far be it from me to begrudge them their pleasures. Although I doubt that I will ever become a big fan of roguelikes, I do derive a quiet sort of satisfaction from knowing that things so implacably committed to being their own idiosyncratic selves exist in this world.

And if roguelikes will never go mainstream, that doesn’t mean they haven’t influenced the mainstream. Next time, we’ll learn how one of the most popular of all the slick commercial games of the late 1990s grew out of this odd little corner of hackerdom…





(Sources: I highly recommend David L. Craddock’s book Dungeon Hacks: How NetHack, Angband, and Other Roguelikes Changed the Course of Video Games, a treasure trove of information that I have only touched upon here. The CRPG Addict blog is full of stories about what it’s like to actually play Rogue, Hack, 1987-vintage NetHack, 1989-vintage NetHack, Moria, and Angband, among other roguelikes, along with some more historical notes. I’m immensely indebted to David for all of his original research and to Chet for spending the hundreds of hours on these games that I couldn’t spare.

Other print sources include the books Hackers: Heroes of the Computer Revolution by Steven Levy, The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary by Eric S. Raymond, and Open Sources: Voices from the Revolution edited by Chris DiBona, Sam Ockman, and Mark Stone; Byte of March 1984 and February 1987; Acorn User of February 1997; Computer Power User of March 2008. Other online sources include Glenn Wichman’s “Brief History of Rogue,” “The Best Game Ever” by Wagner James Au at Salon, “Playing the Open Source Game” by Shawn Hargreaves, “Freeing an Old Game” by Ben Asselstine at Free Software Magazine, and a retrospective on NetHack by Dave “Fargo” Kosak of GameSpy.

Much more information about all of the games mentioned in this article, and roguelikes in general, can be found at RogueBasin, as can download links for all of them.)

 
