
The Designer’s Designer

Dan Bunten delivers the keynote at the 1990 Game Developers Conference.

Dan Bunten and his little company Ozark Softscape could look back on a tremendous 1984 as that year came to an end. Seven Cities of Gold had been a huge success, Electronic Arts’s biggest game of the year, doing much to keep the struggling publisher out of bankruptcy court by selling well over 100,000 copies. Bunten himself had become one of the most sought-after interviewees in the industry. Everyone who got the chance to speak with him seemed to agree that Seven Cities of Gold was only the beginning, that he was destined for even greater success.

As it turned out, though, 1984 would be the high-water mark for Bunten, at least in terms of that grubbiest but most implacable metric of success in games: quantity of units shifted. The years that followed would be frustrating as often as they would be inspiring, as Bunten pursued a vision that seemed at odds with every trend in the industry, all the while trying to thread the needle between artistic fulfillment and commercial considerations.


In the wake of Seven Cities of Gold‘s success, EA badly wanted a follow-up with a similar theme, so much so that they offered Bunten a personal bonus of $5000 to make it Ozark’s next project. The result was Heart of Africa, a game which at first glance looks like precisely the sequel EA was asking for but that actually plays quite differently. Instead of exploring the Americas as Hernán Cortés during the 1500s, it has you exploring Africa as an intrepid Victorian adventurer (“Livingstone, I presume?”). In keeping with the changed time and location, your goal isn’t to conquer the land for your country — Africa had, for better or for worse, already been thoroughly partitioned among the European nations by 1890, the year in which the game takes place — but simply to discover and to map. In the best tradition of Victorian adventure novels like King Solomon’s Mines, your ultimate goal is to find the tomb of a mythical Egyptian pharaoh. Bunten later admitted that the differences from Heart of Africa‘s predecessor weren’t so much a product of original design intent as improvisation after he had bumbled into an historical context that just wouldn’t work as a more faithful sequel.

Indeed, Bunten in later years dismissed Heart of Africa, his most adventure-like game ever and his last ever that was single-player only, as nothing more than “a game done to please EA”: “I honestly didn’t want to do the project.” Its biggest problem hinges on the fact that its environment is randomly generated each time you start a new game, itself an attempt to remedy the most obvious failing of adventure games as a commercial proposition: their lack of replayability. Yet the random maps can never live up to what a hand-crafted map, designed for challenge and dramatic effect, might have been; the “story” in Heart of Africa is all too clearly just a bunch of shifting interchangeable parts. Bunten later acknowledged that “the attempt to make a replayable adventure game made for a shallow product (which seems true in every other case designers have tried it as well). I guess that if elements are such that they can be randomly shifted then they [aren’t] substantive enough to make for a compelling game. So, even though I don’t like linear games, they seem necessary to have the depth a good story needs.”

Heart of Africa did quite well for EA upon its release in 1985 — well enough, in fact, to become Bunten’s third most successful game of all time. Yet the whole experience left a bad taste in his mouth. He came away from the project determined to return to the guiding vision behind his first game for EA, the commercially unsuccessful but absolutely brilliant M.U.L.E.: a vision of computer games that people played together rather than alone. In the future, he would continue to compromise at times on the style and subject matter of his games in order to sell them to his publishers, but he would never again back away from his one great principle. All of his games henceforward would be multiplayer — first, foremost, and in one case exclusively. In fact, that one case would be his very next game.

The success of his previous two games having opened something of a window of opportunity with EA, Bunten charged ahead on what he would later describe as his single “most experimental game.” Robot Rascals is a multiplayer scavenger hunt in which two physical decks of cards are integral to the game. Each player controls a robot, and must use it to collect the four items shown on the cards in her hand and return with them to home base in order to win. The game lives on the razor’s edge of pure chaos, the product both of random events generated by the computer and of a second deck of cards — the “specials” — which among other things can force players to draw new item cards, trash their old cards, or trade cards among one another; thus everyone’s goals are shifting almost constantly. As always in a Dan Bunten game, there are lots of thoughtful features here, from ways to handicap the game for players of different ages or skill levels to three selectable levels of overall complexity. He designed it to be “a game that anyone could play” rather than one limited to “special-interest groups like role-playing people or history buffs.” It can be a lot of fun, even if it’s not quite on the level of M.U.L.E. (then again, what is, right?). But this latest bid to make computer games acceptable family entertainment wound up selling hardly at all upon its release in 1986, ending Bunten’s two-game commercial hot streak.

By this point in Bunten’s career, changes in his personal life were beginning to have a major impact on the games he made. In 1985, while still working on Heart of Africa, he had divorced his second wife and married his third, with all the painful complications such disruptions entail when one is leaving children behind with the former spouse. In 1986, he and his new wife moved from Little Rock, Arkansas, to Hattiesburg, Mississippi, so she could complete a PhD. This event marked the effective end of Ozark Softscape as anything but a euphemism for Dan Bunten himself and whatever programmers and artists he happened to contract work out to. The happy little communal house/office where Dan and Bill Bunten, Jim Rushing, and Alan Watson had created games, with a neighborhood full of eager testers constantly streaming through the living room, was no more; only Watson continued to work on Bunten’s games from Robot Rascals on, and then more as just another hired programmer than a valued design voice. Even after moving back to Little Rock in 1988, Bunten would never be able to recapture the communal alchemy of 1982 to 1985.

Coupled with these changes were other, still more ominous ones in Dan Bunten himself. Those who knew him during these years generally refer only vaguely to his “problems,” and this discretion of course does them credit; I too have no desire to psychoanalyze the man. What does seem clear, however, is that he was growing increasingly unhappy as time wore on. He became more demanding of his colleagues, difficult enough to work with that many of them decided it just wasn’t worth it, even as he became more erratic in his own habits, perhaps due to an alcohol intake that struck many as alarming.

Yet Bunten was nothing if not an enigmatic personality. At the same time that close friends were worrying about his moodiness and his drinking, he could show up someplace like the Computer Game Developers Conference and electrify the attendees with his energy and ideas. Certainly his eyes could still light up when he talked about the games he was making and wanted to make. The worrisome questions were how much longer he would be allowed to make those games in light of their often meager sales, and, even more pressingly, why his eyes didn’t seem to light up about much else in his life anymore.

But, to return to the firmer ground of the actual games he was continuing to make: Modem Wars, his next one, marked the beginning of a new chapter in his tireless quest to get people playing computer games together. “We’ve failed at gathering people around the computer,” Bunten said before starting work on it. “We’re going to have to connect them out of the back by connecting their computers to each other.” He would make, in other words, a game played by two people on two separate computers, connected via modem.

Modem Wars was known as Sport of War until just prior to its release by EA in 1988, and in many ways that was a better title. Its premise is a new version of Bunten’s favorite sport of football, played not by individual athletes but by infantry, artillery, and even aircraft, if you can imagine such a thing. One might call it a mashup between two of his early designs for SSI: the strategic football simulator Computer Quarterback and the proto-real-time-strategy game Cytron Masters.

It’s the latter aspect that makes Modem Wars especially visionary. The game was nothing less than an online real-time-strategy death match years before the world had heard of such a thing. While a rudimentary artificial intelligence was provided for single-player play, it was made clear by the game’s very title that this was strictly a tool for learning to play rather than the real point of the endeavor. Daniel Hockman’s review of Modem Wars for Computer Gaming World ironically describes the qualities of online real-time strategy as a potential “problem” and “marketing weakness” — the very same qualities which a later generation would take as the genre’s main attractions:

A sizable number of gamers are not used to thinking in real-time situations. They can spend hours ordering tens of thousands of men into mortal combat, but they wimp out when they have to think under fire. They want to play chess instead of speed chess. They want to analyze instead of act. As the enemy drones zero in on their comcen, they throw up their hands in frustration when it’s knocked out before they can extract themselves from the maelstrom of fire that has engulfed them.

Whether because gamers really were daunted by this need to think on their feet or, more likely, because of the relative dearth of fast modems and stable online connections in 1988, Modem Wars became another crushing commercial disappointment for Bunten. EA declared themselves “hesitant” to keep pursuing this direction in the wake of the game’s failure. Rather than causing Bunten to turn away from multiplayer gaming, this loss of faith caused him to turn away from EA.

In the summer of 1989, MicroProse Software announced that they had signed a five-year agreement with Bunten, giving them first rights to all of the games he made during that period. The great hidden driver behind the agreement was MicroProse’s own star designer Sid Meier, who had never hidden his enormous admiration for Bunten’s work. Bunten doubtless hoped that a new, more supportive publisher would mark the beginning of a new, more commercially successful era in his career. And in the beginning at least, such optimism would, for once, prove well-founded.

Known at first simply as War!, then as War Room, and finally as Command H.Q., Bunten’s first game for MicroProse was aptly described by its designer as being akin to an abstract, casual board game of military strategy, like Risk or Axis & Allies. The big wrinkle was that this beer-and-pretzels game was to be played in real time rather than in turns. But, perhaps in response to complaints about his previous game like those voiced by Daniel Hockman above, the pace is generally far less frenetic this time around. Not only can the player select an overall speed, but the program itself actually takes charge to speed up the action when not much is happening and slow it down when things heat up. Although a computer opponent is provided, the designer’s real focus was once more on modem-to-modem play.

But, whatever its designer’s preferences, MicroProse notably de-emphasized the multiplayer component in their advertising upon Command H.Q.‘s release in 1990, and this, combined with a more credible artificial intelligence for the computer opponent, gave it more appeal to the traditional wargame crowd than Modem Wars had enjoyed. Ditto a fair measure of evangelizing done by Computer Gaming World, with whom Bunten had always had a warm relationship, having even authored a regular column there for a few years in the mid-1980s. The magazine’s lengthy review concluded by saying, “This is the game we’ve all been waiting for”; they went on to publish two more lengthy articles on Command H.Q. strategy, and made it their “Wargame of the Year” for 1990. For all these reasons, Command H.Q. sold considerably better than had Bunten’s last couple of games; one report places its total sales at around 75,000 units, enough to make it his second most successful game ever.

With that to buoy his spirits, Bunten made big plans for his next game, Global Conquest. “Think of it as Command H.Q. meets Seven Cities of Gold meets M.U.L.E.,” he said. Drawing heavily from Command H.Q. in particular, as well as the old grand-strategy classic Empire, he aimed to make a globe-spanning strategy game where economics would be as important as military maneuvers. He put together a large and vocal group of play testers on CompuServe, and tried to incorporate as many of their suggestions as possible, via a huge options panel that allowed players to customize virtually every aspect of the game, from the rules themselves to the geography and topography of the planet they were fighting over, all the way down to the look of the icons representing the individual units. This time, up to four humans could play against one another in a variety of ways: they could all play together by taking turns on one computer, or they could each play on their own computer via a local-area network, or four players could share two computers that were connected via modem. The game was turn-based, but with an interesting twist designed to eliminate analysis paralysis: when the first player mashed the “next turn” button, everyone else had just twenty seconds to finish up their own turns before the execution phase began.

In later years, Dan Bunten himself had little good to say about what would turn out to be his last boxed game. In fact, he called it his absolute “worst game” of all the ones he had made. While play-testing in general is a wonderful thing, and every designer should do as much of it as possible, a designer also needs to keep his own vision for what kind of game he wants to make at the forefront. In the face of prominent-in-their-own-right, opinionated testers like Computer Gaming World‘s longtime wargame scribe Alan Emrich, Bunten failed to do this, and wound up creating not so much a single coherent strategy game as a sort of strategy-game construction set that baffled more than it delighted. “This game was a hodgepodge rather than an integration,” he admitted several years later. “It was just the opposite of the KISS doctrine. It was a kitchen-sink design. It had everything. Build your own game by struggling through several options menus.” He acknowledged as well that the mounting unhappiness in his personal life, which had now led to a divorce from his third wife, was making it harder and harder to do good work.

Released in 1992, Global Conquest under-performed commercially as well. In addition to the game’s intrinsic failings, it didn’t help matters that MicroProse had just five months prior released Sid Meier’s Civilization, another exercise in turn-based grand strategy on a global scale, also heavily influenced by Empire, that managed to be far more thematically and texturally ambitious while remaining more focused and playable as a game — albeit without the multiplayer element that was so important to Bunten.

But of course, there’s more to a game than whether it’s played by one person or more than one, and it strikes me as reasonable to question whether Bunten was beginning to lose his way as a designer in other respects even as he stuck so obstinately to his multiplayer guns. Setting aside their individual strengths and failings, the final three boxed games of Bunten’s career, with their focus on “wars” and “command” and “conquest,” can feel a little disheartening when compared to what came before. Games like M.U.L.E., Robot Rascals, and to some extent even Seven Cities of Gold and Heart of Africa had a different, friendlier, more welcoming personality. This last, more militaristic trio feels like a compromise, the product of a Dan Bunten who said that, if he couldn’t bring multiplayer gaming to the masses, he would settle for the grognard crowd, indulging their love for guns and tanks and bombs. So be it. Now, though, he was about to give that same crowd the shock of their lives.

In November of 1992, just months after completing the supremely masculine wargame Global Conquest, Dan Bunten had sexual-reassignment surgery, becoming the woman Danielle “Dani” Bunten Berry. (For continuity’s sake, I’ll generally continue to refer to her by the shorthand of “Bunten” rather than “Berry” for the remainder of this article.) It’s not for us to speculate about the personal trauma that must have accompanied such a momentous decision. What we can and should take note of, however, is that it was an unbelievably brave decision. For all that we still have a long way to go today when it comes to giving transsexuals the rights and respect they deserve, the early 1990s were a far less enlightened time than even our own on this issue. And it wasn’t as if Bunten could take comfort in the anything-goes anonymity of a New York City or San Francisco.  Dan Bunten had lived, and as Dani Bunten now continued to live, in the intensely conservative small-town atmosphere of Little Rock, Arkansas. Many of those closest to her disowned her, including her mother and her ex-wives, making it heartbreakingly difficult for her to maintain a relationship with her children. She had remained in Little Rock all these years, at no small cost to her career prospects, largely because of these ties of blood, which she had believed to be indissoluble. This rejection, then, must have felt like the bitterest of betrayals.

Dan Bunten with his beverage of choice.

The games industry as well, with its big-breasted damsels in distress and its machine-gun-toting male heroes, wasn’t exactly notable for its enlightened attitudes toward sex and gender. Many of Bunten’s old friends and colleagues would see her for the first time after her surgery and convalescence at the Game Developers Conference scheduled for April of 1993, and they looked forward to that event with almost as much trepidation as Bunten herself must have felt. It was all just so very unexpected. To whatever extent they had carried around a mental image of a man who would choose to become a woman, Dan Bunten didn’t fit the profile at all. He had been the games industry’s own Ozark Mountains boy, a true son of the South, always ready with his “folksy mountain humor” (read, “dirty jokes”). His rangy frame stood six feet two inches tall. He loved nothing more than a rough-and-tumble game of back-lot football, unless it be beer and poker afterward. As his three ex-wives and three children attested, he had certainly seemed to like women, but no one had ever imagined that he liked them enough to want to be one. What were they supposed to say to him — er, to her — now?

They needn’t have worried. Dani Bunten handled her coming-out party with the same low-key grace and humor she would display for the rest of her life as a woman. She said that she had made the switch to do her part to redress the gender imbalance inside the industry, and to help improve the aesthetics of game designers to match the improving aesthetics of their games. The tension dissipated, and soon everyone got into the spirit of the thing. A straw poll named Dani Bunten the game designer most likely to appear on the Oprah Winfrey Show. A designer named Gordon Walton had a typical experience: “I was put off when she made the change to become Dani, until the minute I spoke to her. It was clear to me she was much happier as Dani, and if anything an even more incredible person.” Another GDC regular remembered the “unhappy man” from the 1992 event, “sitting on the hallway floor drinking and smoking,” and contrasted him with the “happy woman” he now saw.

No one with any interest in the inner workings of those strangest of creatures, their fellow humans, could fail to be fascinated by Bunten’s dispatches from both sides of the gender divide. “Aren’t there things you’ve always wanted to know about women but were afraid to ask?” she said. “Well, now’s your chance!”

I had to learn a lot to actually “count” as a woman! I had to learn how to walk, speak, dress as a woman. Those little things which are necessary so that other people don’t [feel] alienated. There’s a little summary someone gave me to make clear what being a woman means: as a woman you have to sing when you speak, dance when you walk, and you have to open your heart… I know how stereotypical that sounds, but it is true! Speech for a man is something completely different: the melody of speech is fast, monotone, and decreases at the end of a sentence. Sometimes, this still happens to me, and people are always irritated. Female speech is a little bit like song – we have a lot more melody and different speech patterns. Walking is really a bit like dancing: slower and connected, with a lot of subtle movements. I enjoyed it at once.

She had few filters when talking about the nitty-gritty details:

One of the saddest changes I had to deal with after my operation was the fact that I couldn’t aim anymore when urinating. Boys — I have two little sons and a daughter — simply love to aim.

Bunten said that, in keeping with her new identity, she didn’t feel much desire to design any more wargames; this led to the end of her arrangement with MicroProse. By way of compensation, Electronic Arts that year released a nicely done “commemorative edition” of Seven Cities of Gold, complete with dramatically upgraded graphics and sound to suit the times. Bunten had little to nothing to do with the project, but it sold fairly well, and perhaps helped to remind her of her roots.

In the same spirit, Bunten’s first real project after her transformation became a new version of M.U.L.E. EA’s founder Trip Hawkins had always named that game as one of his all-time favorites, and had frequently stated how disappointed he was that it had never gotten the attention it deserved. Now, Hawkins had left his day-to-day management role at EA to run 3DO, a spin-off company peddling a multimedia set-top box for the living room. Hawkins thought M.U.L.E. would be perfect for the platform, and recruited Bunten to make it happen. It was a dream project; showing excellent taste, she still regarded M.U.L.E. as the best thing she had ever done. But the dream quickly began to sour.

3DO first requested that, instead of taking turns managing their properties on the map, players all be allowed to do so simultaneously. Bunten somewhat reluctantly agreed. And then:

As soon as I added the simultaneity, it instantly put into their heads, “Why can’t we shoot at each other?” And I said, “No guns.” And they said, “What about bombs? Can we drop a bomb in front of you? It won’t hurt you. It will be a cartoon thing, it will just slow you down.” And I said, “You don’t get it. It’s changing the whole notion of how this thing works!”

[3DO is] staking its future on the idea of a new generation of hardware and therefore, you’d assume, a new generation of software, but they said, “No, our market is still 18 to 35, male. We need something with action, something with intensity.” Chrome and sizzle. Ugh.

In the end, Bunten walked out, disappointed enough that she seriously considered getting out of games altogether, going so far as to apply for jobs as the industrial engineer Dan Bunten had once been before his first personal computer came along.

Instead she found a role with a new company called Mpath as a design and strategy consultant. The goal of that venture was to bring multiplayer gaming to the new frontier of the World Wide Web, and its founders included her fellow game designer Brian Moriarty, of Infocom and LucasArts fame. She also studied the elusive concept of “games for girls” in association with a think tank set up by Microsoft co-founder Paul Allen; some of her proposals would later come to market as the products of Purple Moon, Brenda Laurel’s brief-lived but important publisher of games for girls aged 8 to 14.

Offers to do conventional boxed games as sole designer, however, weren’t forthcoming; how much that was down to lingering personal prejudices against her for her changed sex and how much to the fact that the games she wanted to make just weren’t considered commercially viable must always be open for debate. Refusing as usual to be a victim, Bunten said that her “priorities had shifted” since her change anyway: “I don’t identify myself with the job as strongly as before.” Deciding that, for her, heaven was other people after a life spent programming computers, she devoured anthropology texts and riffed on Carl Jung’s theories of a collective unconscious. “Literature, anthropology, and even dance,” she noted, “have a good deal more to teach designers about human drives and abilities than the technologists of either end of California, who know silicon and celluloid but not much else.” So, she bided her time as a designer, waiting for a more inclusive ludic future to arrive. At the 1997 GDC, she described a prescient vision of “small creative shops” freed from the inherent conservatism of the “distribution trap” by the magic of the Internet.

That future would indeed come to pass — but, sadly, not in time for Dani Bunten Berry to see it. Shortly after delivering that speech, she went to see her doctor about a persistent cough, whereupon she was diagnosed with an advanced case of lung cancer. In one of those cruel ironies which always seem to dog the lives of us poor mortals, she had finally kicked a lifelong habit of heavy smoking just a few months before.

She appeared in public for the last time in May of 1998. The occasion was, once again, the Game Developers Conference, where she had always shone so. She struggled audibly for breath as she gave the last presentation of her life, entitled “Do Online Games Still Suck?,” but her passion carried her through. At the end of the conference, at a special ceremony held aboard the Queen Mary in Long Beach Harbor, she was presented with the first ever GDC Lifetime Achievement Award. The master of ceremonies for that evening was her friend and colleague Brian Moriarty, who knew, like everyone else in attendance, that the end was near. He closed his heartfelt tribute thus:

It is no exaggeration to characterize tonight’s honoree as the world’s foremost authority on multiplayer computer games. Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes the prototypes of the defining art form of the 21st century.

As one of those historians, I can only heartily concur with his assessment.

It would be nice to say that Dani Bunten passed peacefully to her rest. But, as anyone with any experience with lung cancer will recognize, that just isn’t how the disease works. Throughout her life, she had done nothing the easy way, and her death — ugly, painful, and slow — was no exception. On the brighter side, she did reconcile to some extent with her mother and other family members and friends who had rejected her. The end came on July 3, 1998. Rather incredibly in light of the prodigious, multifaceted life she had lived, she was just 49 years old.

It’s a life which resists pigeonholing or sloganeering. Bunten herself explicitly rejected the role of transgender advocate, inside or outside of the games industry. Near the end of her life, she expressed regret for her decision to change her physical sex, saying she could have found ways to live in a more gender-fluid way without taking such a drastic step. Whether this was a reasoned evaluation or a product of the pain and trauma of terminal illness must remain, like so much else about her, an enigma.

What is clear, however, is that Bunten, through the grace and humor with which she handled her transition and through her refusal to go away and hide thereafter as some might have wished, taught others in the games industry who were struggling with similar issues of identity that a new gender need not mean a decisive break with every aspect of one’s past — that a prior life in games could continue to be a life in games even with a different pronoun attached. She did this in a quieter way than the speechifying some might have wished for from her, but, nevertheless, do it she did. Jessica Mulligan, who transitioned from male to female a few years after her, remembers meeting Bunten shortly before her own sexual-reassignment surgery, hoping to hear some “profound words on The Transition”: “While I was looking for spiritual guidance, she was telling me where to shop for shoes. Talk about keeping someone honest! Every change in our personal lives is profound to us. You still have to pay attention to the nuts and bolts or the change is meaningless.”

Danielle Bunten Berry does her makeup.

For some, of course — even for some with generally good intentions — Danielle Bunten Berry’s transgenderism will always be the defining aspect of her life, her career in games a mere footnote to that other part of her story. But that’s not how she would have wanted it. She regarded her games as her greatest legacy after her children, and would doubtless want to be remembered as a game designer above all else.

Back in 1989, after Modem Wars had failed in the marketplace, Electronic Arts decided that the lack of “a network of people to play” was a big reason for its failure. The great what-if question pertaining to Bunten’s career is what she might have done in partnership with an online network like CompuServe, which could have provided stable connectivity along with an eager group of players and all the matchmaking and social intrigue anyone could ask for. She finally began to explore this direction late in her life, through her work with Mpath. But what might have happened if she had made the right connections — forgive the pun! — earlier? We can only speculate.

As it is, though, it’s true that, in terms of units shifted and profits generated, there have been far more impressive careers. She suffered the curse of any pioneer who gets too far out in front of the culture. All of her eleven games combined probably sold no more than 400,000 copies at the outside, a figure some prominent designers’ new games can easily better on their first week today. Certainly her commercial disappointments far outnumber her successes. But then, sales aren’t the only metric by which to measure success.

Dani Bunten, one might say, is the designer’s designer. Greg Costikyan once recounted what happened when he offered to introduce Warren Spector — one of those designers who can sell more games in a week than Bunten did in a lifetime — to her back in the day: “He regretfully refused; he had loved M.U.L.E. so much he was afraid he wouldn’t know what to say. He would sound like a blithering fanboy and be embarrassed.” Chris Crawford calls the same title simply “the best computer-game design of all time.” Brenda Laurel dedicated Purple Moon’s output to Bunten. Sid Meier was so taken with Seven Cities of Gold that Pirates!, Railroad Tycoon, and Civilization, his trilogy of masterpieces, can all be described as extensions in one way or another of what Bunten first wrought. And Seven Cities of Gold was only Meier’s second favorite Bunten game: he loved M.U.L.E. so much that he was afraid to even try to improve on it.

Ironically, the very multiplayer affordances that Bunten so steadfastly refused to give up on, much to the detriment of her income, continue to make it difficult for her games to be seen at their best today. M.U.L.E. can be played as its designer really intended it only on an Atari 8-bit computer — real or emulated — with four vintage joysticks plugged in and four players holding onto them in a single living room; that is, needless to say, not a trivial thing to arrange in this day and age. Likewise, the need to have the exceedingly rare physical cards to hand has made it impossible for most people to even try out Robot Rascals today. (It took me months to track down a pricey German edition on eBay.) And the boxed games of Bunten’s final run, reliant as they are on ancient modem hookups, are even more difficult to play with others today than they were in their own time.

Dani Bunten didn’t have an easy life, internally or externally. She remained always an enigma — the life of the party who goes home alone, the proverbial stranger among her best friends. One person who knew her after she became a woman claimed she still had a “shadowed, slightly haunted look, even when she was smiling.” Given the complicated emotions that are still stirred up in so many of us by transgenderism, that may have been projection. On the other hand, though, it may have been perception. Even Bunten’s childhood had been haunted by the specter of familial discord and possibly abuse, to such an extent that she refused to talk much about it. But she did once tell Greg Costikyan that she grew up loving games mainly because it was only when playing them that her family wasn’t “totally dysfunctional.”

I think that for Dani Bunten games were most of all a means of communication, a way of punching through that bubble of ego and identity that isolates all of us to one degree or another, and that perhaps isolated her more so than most. Thus her guiding vision became, as Sid Meier puts it, “the family gathered around the computer.” After all, it’s a small step to go from communicating to connecting, from connecting to loving. She openly stated that she had made Robot Rascals for her own family most of all: “They’ve never played my games. I think they found them too esoteric or complex. I wanted something that I could enjoy with them, that they’d all be able to relate to.” The tragedy for her — perhaps a key to the essential sadness many felt at Bunten’s core, whether she was living as a man or a woman — is that reality never quite lived up to that Norman Rockwell dream of the happy family gathered around a computer; her daughter, the duly appointed caretaker of her legacy, still calls M.U.L.E. “boring and tedious” today. But the dream remains, and her games have given those of us privileged to discover them great joy and comfort in the midst of lives that have admittedly — hopefully! — been far easier than that of their creator. And so I’ll close, in predictable but unavoidable fashion, with Danielle Bunten Berry’s most famous quote — a quote predictable precisely because it so perfectly sums up her career: “No one on their death bed ever said, ‘I wish I had spent more time alone with my computer!'” Words to live by, my fellow gamers. Words to live by.

Danielle Bunten Berry, 1949-1998.

(Sources: Compute! of March 1989, December 1989, April 1990, January 1992, and December 1993; Questbusters of May 1986; Commodore Power Play of June/July 1986; Commodore Magazine of July 1987, October 1988, and June 1989; Ahoy! of March 1987; Computer Gaming World of January/February 1987, May 1988, February 1989, February 1990, December 1990, February 1991, March 1991, May 1991, April 1992, June 1992, August 1992, June 1993, August 1993, July 1994, September 1995, and October 1998; Family Computing of January 1987; Compute!’s Gazette of August 1989; The One of April 1991; Game Players PC Entertainment of September 1992; Game Developer of February/March 1995, July 1998, September 1998, and October 1998; Electronic Arts’s newsletter Farther of Winter 1986; Power Play of January 1995; Arkansas Times of February 8 2012. Online sources include the archived contents of the old World of Mule site, the archived contents of a Danielle Bunten Berry tribute site, the Salon article “Get Behind the M.U.L.E.”, and Bunten’s interview at Halcyon Days.)

 

Posted by on November 16, 2018 in Digital Antiquaria, Interactive Fiction

 


The Lost Files of Sherlock Holmes

In 1989, Trip Hawkins reluctantly decided to shift Electronic Arts’s strategic focus from home computers to videogame consoles, thereby to “reach millions of customers.” That decision was reaching fruition by 1992. For the first time that year, EA’s console games outsold those they published for personal computers. The whole image of the company was changing, leaving behind the last vestiges of the high-toned “software artists” era of old in favor of something less intellectual and more visceral — something aimed at the mass market rather than a quirky elite.

Still, corporate cultures don’t change overnight, and the EA of 1992 continued to release some computer games which were more in keeping with their image of the 1980s than that of this new decade. One of the most interesting and rewarding of these aberrations — call them the product of corporate inertia — was a game called The Lost Files of Sherlock Holmes, whose origin story doesn’t exactly lead one to expect a work of brilliance but which is in fact one of the finest, most faithful interpretations of the legendary detective in the deerstalker cap ever to make its way onto a monitor screen.

The initial impetus for Lost Files was provided by an EA producer named Christopher Erhardt. After studying film and psychology at university, Erhardt joined the games industry in 1987, when he came to Infocom to become the in-house producer for their latter-day lineup of graphical games from outside developers, such as Quarterstaff, BattleTech: The Crescent Hawk’s Inception, and Arthur: The Quest for Excalibur. When Infocom was shuttered in 1989, he moved on to EA in the same role, helming a number of the early Sega Genesis games that did so much to establish the company’s new identity. His success on that front gave him a fair amount of pull, and so he pitched a pet idea of his: a sort of computerized board game that would star Sherlock Holmes along with a rotating cast of suspects, crimes, and motives, similar to the old 221B Baker Street board game as well as a classic computer game from Accolade called Killed Until Dead. It turned out that EA’s management weren’t yet totally closed to the idea of computer games that were, as Erhardt would later put it, “unusual and not aimed at the mass market” — as long, that is, as they could be done fairly inexpensively.

Mythos Software. On the top row are James Ferguson, Elinor Mavor, and Scott Mavor. On the bottom row are John Dunn and David Wood.

In order to meet the latter condition, Erhardt enlisted a tiny Tempe, Arizona, company known as Mythos Software — not to be confused with the contemporaneous British strategy-games developer Mythos Games. This Mythos was being run by one James Ferguson, its fresh-out-of-university founder, from the basement of his parents’ house. He was trying to break into the wider world of software development that lay outside the bounds of the strictly local contracts he had fulfilled so far; his inexperience and eagerness ensured that Mythos would work cheap. And in addition to cut-rate pricing, Ferguson had another secret weapon to deploy: an artist named Scott Mavor who had a very special way with pixel graphics, a technique that EA’s in-house employees would later come to refer to as “the Mavor glow.” The highly motivated Mythos, working to Erhardt’s specifications, created a demo in less than two weeks that was impressive enough to win the project a tentative green light.

Eric Lindstrom and R.J. Berg.

Another EA employee, a technical writer named Eric Lindstrom, saw the demo and suggested turning what had been planned as a computerized board game into a more narratively ambitious point-and-click adventure game. When Erhardt proved receptive to the suggestion, Lindstrom put together the outline of a story, “The Mystery of the Serrated Scalpel.” He told Erhardt that he knew the perfect person to bring the story to life: one of his colleagues among EA’s manual writers, a passionate Sherlock Holmes aficionado — he claimed to have read Arthur Conan Doyle’s complete canon of Holmes stories “two or three times” — named R.J. Berg.

The project’s footing inside EA was constantly uncertain. Christopher Erhardt says he “felt like I was playing The Princess Bride, and the Dread Pirate Roberts was coming. It was always, ‘Yep – we may cancel it.'” But in the end the team was allowed to complete their point-and-click mystery, despite it being so utterly out of step with EA’s current strategic focus, and it was quietly released in the fall of 1992.

I find the critical dialog that followed, both in the immediate wake of Lost Files‘s release and many years later in Internet circles, to be unusually interesting. In particular, I’d like to quote at some length from Computer Gaming World‘s original review, which was written by Charles Ardai, one of the boldest and most thoughtful — and most entertaining — game reviewers of the time; this I say even as I find myself disagreeing with his conclusions far more often than not. His review opens thus:

If there is any character who has appeared in more computer games than Nintendo’s plump little goldmine, Mario, it has to be Sherlock Holmes. There have been almost a dozen Holmes-inspired games over the years, one of the best being Sherlock Holmes Consulting Detective, which is currently available in two different CD-ROM editions from ICOM. Other valiant attempts have included Imagic’s Sherlock Holmes in Another Bow, in which Holmes took a sea voyage with Gertrude Stein, Picasso, Thomas Edison, and Houdini, among others; and Infocom’s deadly serious Sherlock: Riddle of the Crown Jewels.

The difference between Holmes and Mario games, however, is that new Mario games are always coming out because the old ones sold like gangbusters, while new Sherlock Holmes games come out in spite of the fact that their predecessors sold like space heaters in the Sahara. It is noteworthy that, until ICOM, no company had ever released more than one Sherlock Holmes game, while all the Mario games come from the same source. It is also worth noting that the Holmes curse is not limited to games: the last few Holmes movies, such as Without a Clue and Young Sherlock Holmes, were not exactly box-office blockbusters.

The paradox of Sherlock Holmes can be stated so: while not that many people actually like the original Sherlock Holmes stories, everyone seems to think that everyone else adores them. Like Tarzan and Hawkeye, Holmes is a literary icon, universally known and much-beloved as a character in the abstract — not, however, as part of any single work. Finding someone who has actually read and enjoyed the writing of Edgar Rice Burroughs, James Fenimore Cooper, or Arthur Conan Doyle requires the patience of Diogenes. Most people know the character from television and the movies, at best; at worst, from reviews of television shows and movies they never bothered to see.

So, why do new Holmes adaptations surface with such regularity? Because the character is already famous and the material is in the public domain (thereby mitigating the requisite licensing fees associated with famous characters of more recent vintage. Batman or Indiana Jones, for instance.) Another answer is that Sherlock Holmes is seen as bridging the gap between entertainment and literature. Game companies presumably hope to cash in on the recognition factor and have some of the character’s ponderous respectability rub off on their product. They also figure that they can’t go wrong basing their games on a body of work that has endured for almost a century.

Unfortunately for them, they are wrong. There are only so many copies of a game that one can sell to members of the Baker Street Irregulars (the world’s largest and best-known Sherlock Holmes fan club), and a vogue for Victoriana has never really caught on among the rest of the game-buying population. The result is that, while Holmes games have been good, bad, and indifferent, their success has been uniformly mediocre.

This delightfully cynical opening gambit is so elegantly put together that one almost hesitates to puncture its cogency with facts. Sadly, though, puncture we must. While there were certainly Sherlock Holmes games released prior to Lost Files that flopped, there’s no evidence to suggest that this was the fault of the famous detective with his name on the box, and plenty of evidence to the contrary: that his name could, under the right circumstances, deliver at least a modest sales boost. In addition to the Sherlock Holmes Consulting Detective CD-ROM productions, a counter-example to Ardai’s thesis that’s so huge even he has to acknowledge it — the first volume of that series sold over 1 million units — there’s also the Melbourne House text adventure Sherlock; that game, the hotly anticipated follow-up to The Hobbit, the bestselling text adventure of all time, likely sold well over 100,000 units in its own right in the much smaller market of the Europe of 1984. Even Infocom’s Riddle of the Crown Jewels, while by no means a smash hit, sold significantly better than usual for an Infocom game in the sunset of the company’s text-only era. (Nor would I describe that game as “deadly serious” — I could go with “respectful” at most — but that’s perhaps picking nits.)

Still, setting aside those inconvenient details, it’s worth considering this broader question of just why there have been so many Sherlock Holmes games over the years. Certainly the character doesn’t have the same immediate appeal with the traditional gaming demographic as heavyweight properties like Star Wars and Star Trek, Frodo Baggins and Indiana Jones — or, for that matter, the born-in-a-videogame Super Mario. The reason for Sherlock’s ubiquity in the face of his more limited appeal is, of course, crystal clear, as Ardai recognizes: he’s in the public domain, meaning anyone who wishes to can make a Sherlock Holmes game at any time without paying anyone.1

If you’re going to do Sherlock Holmes, you just have to get the fog right.

As such, Holmes occupies a nearly unique position in our culture. He’s one of the last great fictional icons, historically speaking, who’s so blessedly free of intellectual-property restrictions. Absolutely everyone, whether they’ve ever read a story or seen a film featuring him or not, knows him. The only characters with a remotely similar degree of recognizability who postdate him are Dracula, the Wizard of Oz, and Peter Pan — and neither of the latter two, at least, presents writers with quite the same temptation to tell new story after story after story.

As is noted in Lost Files‘s manual, Sherlock Holmes has become such an indelible part of our cultural memory that when we see him we experience a sort of vicarious nostalgia for a London none of us ever knew: “Gas lamps, the sound of horses’ hooves, steam locomotives, and romantic street cries. And then there is the atmosphere of that cozy room in Baker Street: Holmes in his armchair before a roaring coal fire, legs stretched out before him, listening with Dr. Watson to yet another bizarre story.” One might say that Sherlock Holmes gets the chronological balance just right, managing to feel both comfortably, nostalgically traditional and yet also relevant and relatable. In contrast to the Victorian scenery around him, his point of view as a character feels essentially modern, applicable to modern modes of storytelling. I’m not sure that any other fictional character combines this quality to quite the same extent with a freedom from copyright lawyers. These factors have fostered an entire creative subculture of Sherlockia which spans the landscape of modern media, dwarfing Arthur Conan Doyle’s canonical four novels and 56 short stories by multiple orders of magnitude.

The relative modernity of Sherlock Holmes is especially important in the context of interactive adaptations. The player of any narrative-driven game needs a frame of reference — needs to understand what’s expected of her in the role she’s expected to play. Thankfully, the divide between Sherlock Holmes and the likes of C.S.I. is a matter of technology rather than philosophy; Sherlock too solves crimes through rationality, combining physical evidence, eyewitness and suspect interviews, and logical deduction to reach a conclusion. Other legendary characters don’t share our modern mindset; it’s much more difficult for the player to step into the role of an ancient Greek hero who solves problems by sacrificing to the gods or an Arthurian knight who views every event as a crucible of personal honor. (Anyone doubtful of Sherlock Holmes’s efficacy in commercial computer games should have a look at the dire commercial history of Arthurian games.)

With so much material to make sense of, post-Doyle adventures of Sherlock Holmes get sorted on the basis of various criteria. One of these is revisionism versus faithfulness. While some adaptations go so far as to transport Sherlock and his cronies hook, line, and sinker into our own times, others make a virtue out of hewing steadfastly to the character and setting described by Arthur Conan Doyle. This spirit of Sherlockian fundamentalism, if you will, is just one more facet of our long cultural dialog around the detective, usually manifesting as a reactionary return to the roots when other recent interpretations are judged to have wandered too far afield.

No matter how much the Sherlockian fundamentalists kick and scream, however, the fact remains that the Sherlock Holmes of the popular imagination has long since become a pastiche of interpretations reflecting changing social mores and cultural priorities. That’s fair enough in itself — it’s much of the reason why Doyle’s timeless sleuth remains so timeless — but it does make it all too easy to lose sight of Holmes and Watson as originally conceived in the stories. Just to cite the most obvious example: Holmes’s famous deerstalker cap is never mentioned in the text of the tales, and appeared only once in the illustrations that originally accompanied them (that instance being a picture drawn by Sidney Paget for “The Boscombe Valley Mystery”). The deerstalker became such an iconic part of the character only after it was sported by the actor Basil Rathbone as an item of daily wear — an odd choice for the urban Holmes, given that it was, as the name would imply, a piece of hunting apparel normally worn by sporting gentlemen in the countryside — in a long series of films, beginning with The Hound of the Baskervilles in 1939.

Although Lost Files doesn’t go so far as to forgo the deerstalker — there are, after all, limits to these things — it does generally try to take its cue from the original stories rather than the patchwork of interpretations that followed them. Berg:

I definitely aimed for Holmesian authenticity. I’d like to think that, if he were alive, Doyle would like the game. After all, the characters of Holmes and Watson have been manipulated quite a bit by the various media they’ve appeared in, especially the films. For example, the Watson of Lost Files is definitely Doyle’s Watson, competent and intelligent, rather than the bumbling character portrayed in many of the movies. I also wanted to retain Holmes’s peculiar personality. He’s really not that likable a character; he’s arrogant, a misogynist, and extremely smug.

This spirit of authenticity extends to the game’s portrayal of Victorian London. There are, I’ve always thought, two tiers when it comes to realistic portrayals of real places in fiction. Authors on the second tier have done a whole lot of earnest research into their subject, and they’re very eager to show it all to you, filling your head with explicit descriptions of things which a person who actually lived in that setting would never think twice about, so ingrained are they in daily existence. Authors on the top tier, by contrast, have seemingly absorbed the setting through their pores, and write stories that effortlessly evoke it without beating you over the head with all the book research they did to reach this point of imaginative mastery.

Indeed, Sherlock. Leaving the cozy environs of 221B Baker Street.

Lost Files largely meets the latter bar as it sends you around to the jewelers and tobacconists, theaters and pubs, opulent mansions and squalid tenements of fin-de-siècle London. The details are there for when you need them or decide to go looking for them; just try mousing around the interior of 221B Baker Street. (“A typical sitting-room chair. The sheen of its wine-red velveteen covering shows that it is well-used. A dark purple silk dressing gown with a rolled collar is carelessly crumpled on the seat and the antimacassar requires changing.”) More impressive, though, is the way that the game just breathes its setting in that subtle way that can only be achieved by a writer with both a lighter touch and countless hours of immersion in the period at his command. For example, Berg spent time reading Charles Dickens as well as Arthur Conan Doyle in order to capture the subtle rhythms of Victorian English in his conversations. This version of Holmes’s London isn’t the frozen-in-amber museum exhibit it sometimes comes off as in other works of Sherlockia. “We wanted a dirty game,” says Eric Lindstrom. “We wanted people to feel that people were burning coal, that they could see who was walking in the streets. Just as it was in London at the time.”

There is, however, one important exception to the game’s rule of faithfulness to the original stories: Lost Files presents a mystery that the reader can actually solve. In light of the place Holmes holds in our cultural memory as the ultimate detective, one of the great ironies of Doyle’s stories is that they really aren’t very good mysteries at all by the standard of later mystery fiction — a standard which holds a good mystery to be an implicit contest between writer and reader, in which the reader is presented with all the clues and challenged to solve the case before the writer’s detective does so. Doyle’s stories cheat egregiously by this standard, hiding vital evidence from the reader, and often resting a case’s solution on a chain of conjecture that’s nowhere near as ironclad as the great detective presents it to be. Eric Lindstrom:

The [original] stories do not work the way we are used to today. They are not whodunnits; whodunnits only became popular later. Readers have virtually no way of finding out who the culprit is. Sometimes the offender does not even appear in the plot. These are adventure stories narrated from the perspective of Dr. Watson.

For obvious reasons, Lost Files can’t get away with being faithful to this aspect of the Sherlock Holmes tradition. And so the mystery it presents is straight out of Arthur Conan Doyle — except that it plays fair. Notably, you play as Holmes himself, not, as in the original stories, as Watson. Thus you know what Holmes knows, and the game can’t pull the dirty trick on you, even if it wanted to, of hiding information until the big reveal at the end. Many other works of Sherlockia — even the otherwise traditionalist ones — adopt the same approach, responding to our post-nineteenth-century perception of what a good mystery story should be.

And make no mistake: “The Case of the Serrated Scalpel” is a very good mystery indeed. I hesitate to spoil your pleasure in it by saying too much, and so will only state that what begins as the apparently random murder of an actress in an alley behind the Regency Theatre — perhaps by Jack the Ripper, leaving Whitechapel and trying his hand in the posher environs of Mayfair? — keeps expanding in scope, encompassing more deaths and more and more Powerful People with Secrets to Keep. As I played, I was excited every time I made a breakthrough. Even better, I felt like a detective, to perhaps a greater extent than in any computer game I’ve ever played. Among games in general, I can only compare the feeling of solving this mystery to that of tackling some of the more satisfying cases in the Sherlock Holmes Consulting Detective tabletop game.

Part of the reason the mystery comes together so well is just down to good adventure-game design principles, of the sort which eluded so many other contemporary practitioners of the genre. Berg:

The idea was to produce a game that was different from existing adventures, which I frankly felt were often tedious. We wanted to eliminate the elements that tend to detract from the reality of the experience — things like having to die in order to learn some crucial information, constantly having to re-cover the same territory, and the tendency to simply pick up and use every object you encounter. We wanted to give players a deeper experience.

So, there are none of the dreaded adventure-game dead ends in Lost Files. More interestingly, the design does, as Berg alludes above, mostly eschew the typical use-unlikely-object-in-unlikely-place model of gameplay. Tellingly, the few places where it fails to do so are the weakest parts of the game.

As I’ve noted before, the classic approach to the adventure game, as a series of physical puzzles to solve, can be hugely entertaining, but it almost inevitably pushes a game toward comedy, often in spite of its designers’ best intentions. Most of us have played alleged interactive mysteries that leave you forever messing about with slider puzzles and trivial practical problems of the sort that any real detective would solve in five minutes, just by calling for backup. In Infocom’s Sherlock: Riddle of the Crown Jewels, for example, you learn that a stolen ruby is hidden in the eye of the statue of Lord Nelson on top of Nelson’s Column, and then get to spend the next little while trying to get a pigeon to fetch it for you instead of, you know, just telling Inspector Lestrade to send out a work crew. Lost Files does its level best to resist the siren call of the trivial puzzle, and, with only occasional exceptions, it succeeds. Thereby is the game freed to become one of the best interactive invocations of a classic mystery story ever. You spend your time collecting and examining physical evidence, interviewing suspects, and piecing together the crime’s logic, not solving arbitrary road-block puzzles. Lost Files is one of the few ostensibly serious adventure games of its era which manages to maintain the appropriate gravitas throughout, without any jarring breaks in tone.

This isn’t to say that it’s po-faced or entirely without humorous notes; the writing is a consistent delight, filled with incisive descriptions and flashes of dry wit, subtle in all the ways most computer-game writing is not. Consider, for example, this description of a fussy jeweler: “The proprietor is a stern-looking woman, cordial more through effort than personality. She frequently stares at the cleaning girl who tidies the floor, to make sure she is still hard at work.” Yes, this character is a type more than a personality — but how deftly is that type conveyed! In two sentences, we come to know this woman. I’d go so far as to call R.J. Berg’s prose on the whole better than that of the rather stolid Arthur Conan Doyle, who tended to bloviate a bit too much in that all-too-typical Victorian style.

The fine writing lends the game a rare quality that seems doubly incongruous when one considers the time in which it was created, when multimedia was all the rage and everyone was rushing to embrace voice acting and “interactive movies.” Ditto the company which published it, who were pushing aggressively toward the forefront of the new mass-media-oriented approach to games. In spite of all that, darned if Lost Files doesn’t feel downright literary — thoughtful, measured, intelligent, a game to take in slowly over a cup of tea. Further enhancing the effect is its most distinctive technical feature: everything you do in the game is meticulously recorded in an in-game journal kept by the indefatigable Dr. Watson. The journal will run into the hundreds of onscreen “pages” by the time you’re all done. It reads surprisingly well too; one can easily imagine printing it out — the handy option to print it or save it to a file is provided — and giving it to someone else to read with pleasure. That’s a high standard indeed, one which vanishingly few games could meet. But I think that The Lost Files of Sherlock Holmes just about manages it.

Having given so much praise to Lindstrom and Berg’s design and writing, I have to give due credit as well to Mythos Software’s efforts to bring it all to life. The interface of Lost Files is thoroughly refined and pleasant to work with, a remarkable achievement considering that this was the first point-and-click graphic adventure to be made by everyone involved. An optional but extremely handy hotspot finder minimizes the burden of pixel hunting, and the interface is full of other thoughtful touches, like a default action that is attached to each object; this saves you more often than not from having to make two clicks to carry out an action.

Finally, praise must be given to Scott Mavor’s “Mavor glow” graphics as well. To minimize the jagged edges typical of pictures drawn in the fairly low resolution of 256-color VGA graphics, Mavor avoided sharp shifts in color from pixel to pixel. Instead he blended his edges together gradually, creating a lovely, painterly effect that does indeed almost seem to glow. Scott’s mother Elinor Mavor, who worked with him to finish up the art in the latter stages of the project:2

Working with just 256 colors, Scott showed me how he created graduating palettes of each one, which allowed him to do what he called “getting rid of the dots” in each scene. To further mute the pixels, he kept the colors on the darker side, which also enhanced the Victorian mood.

Weaving the illusion of continuous-tone artwork with all those little “dots” made us buggy-eyed after a long day’s work. One night, I woke up, went into the bathroom, turned on the light, and the world just pixelated in front of me. Scary imprints on my retinas had followed me away from the computer monitor, rendering my vision as a pointillistic painting à la George Seurat.

While the graphics of its contemporaries pop out at you with bright, bold colors, the palette of Lost Files of Sherlock Holmes smacks more of the “brown sauce” of the old masters — murky, mysterious, not initially jaw-dropping but totally in keeping with the mood of the script. As you, playing the diligent detective, begin to scan them carefully, the pictures reveal more and more details of the sort that are all too easy to overlook at a quick glance. It makes for an unusually mature aesthetic statement, and a look that can be mistaken for that of no other game.

Backstage at the opera.

Given all its strengths, I find it surprising that Lost Files has gotten more than its share of critical flak over the years. I have a theory as to why that should be, but before I get to that I’ll let one of the naysayers share his point of view. Even after admitting that the story is “a ripping yarn,” the graphics atmospheric, the period details correct, and the writing very good, Charles Ardai concludes his review thusly:

Don’t get me wrong: the dialogue is well-written, the choices are entertaining, and in most cases the actions the game requires the player to perform are very interesting. The story is good and the game is a pleasure to watch. Yet, that is what one does — watch.

This game wants, more than anything in the world, to be a Sherlock Holmes movie. Though it would be a very good one if it were, it is not. Therefore, it is deeply and resoundingly unsatisfying. The plot unfolds quite well, with plenty of twists, but the player has no more control over it than he would if he were reading a novel. The player is, at best, like an actor in a play. Unfortunately, said actor has not been given a copy of the script. He has to hit his marks and say his lines by figuring out the cues given by the other characters and reading his lines off the computer equivalent of cue cards.

If this is what one wants — a fine Sherlock Holmes pastiche played out on the computer screen, with the player nominally putting the lead character through his paces — fine. “The Case of the Serrated Scalpel” delivers all that one could hope for in that vein. If one wants a game — an interactive experience in which one’s decisions have an effect on what happens — this piece of software is likely to disappoint.

The excellent German podcast Stay Forever criticized the game along similar — albeit milder — lines in 2012. And in his mostly glowing 2018 review of the game for The Adventure Gamer joint-blogging project, Joe Pranevich also noted a certain distancing effect, which he said made him feel not so much like he was playing Sherlock Holmes and solving a mystery as watching Sherlock do the solving. The mystery, he further notes — correctly — can for the most part be brute-forced by the patient but obtuse player, simply by picking every single conversation option when talking to every single character and showing each of them every single object you’ve collected.

At the extreme, criticisms like these would seem to encroach on the territory staked out by the noted adventure-game-hater Chris Crawford, who insists that the entire genre is a lie because it cannot offer the player the ability to do anything she wants whenever she wants. I generally find such complaints to be a colossal bore, premised on a misunderstanding of what people who enjoy adventure games find most enjoyable about them in the first place. But I do find it intriguing that these sorts of complaints keep turning up so often in the case of this specific game, and that they’re sometimes voiced even by critics generally friendly to the genre. My theory is that the mystery of Lost Files may be just a little bit too good: it’s just enticing enough, and just satisfying enough to slowly uncover, that it falls into an uncanny valley between playing along as Sherlock Holmes and actually being Sherlock Holmes.

But of course, playing any form of interactive fiction must be an imaginative act on the part of the player, who must be willing to embrace the story being offered and look past the jagged edges of interactivity. Certainly Lost Files is no less interactive than most adventure games, and it offers rich rewards that few can match if you’re willing to not brute-force your way through it, to think about and really engage with its mystery. It truly is a game to luxuriate in and savor like a good novel. In that spirit, I have one final theory to offer you: I think this particular graphic adventure may be especially appealing to fans of textual interactive fiction. Given its care for the written word and the slow-build craftsmanship of its plotting, it reminds me more of a classic Infocom game than most of the other, flashier graphic adventures that jostled with it for space on store shelves in the early 1990s.

Which brings me in my usual roundabout fashion to the final surprising twist in this very surprising game’s history. After its release by a highly skeptical EA, its sales were underwhelming, just as everyone had been telling Christopher Erhardt they would be all along. But then, over a period of months and years, the game just kept on selling at the same slow but steady clip. It seemed that computer-owning Sherlock Holmes aficionados weren’t the types to rush out and buy games when they were hot. Yet said aficionados apparently did exist, and they apparently found the notion of a Sherlock Holmes adventure game intriguing when they finally got around to it. (Somehow this scenario fits in with every stereotype I carry around in my head about the typical Sherlock Holmes fan.) Lost Files‘s sales eventually topped the magical 100,000-unit mark that separated a hit from an also-ran in the computer-games industry of the early- and mid-1990s.

It wasn’t a very good idea, but they did it anyway. R.J. Berg on a sound stage with an actress, filming for the 3DO version of Lost Files of Sherlock Holmes. Pictures like this were in all the games magazines of the 1990s. Somehow such pictures — not to mention the games that resulted from them — seem far more dated than Pong these days.

Lost Files of Sherlock Holmes may not have challenged the likes of John Madden Football in the sales sweepstakes, but it did make EA money, and some inside the company did notice. In 1994, they released a version for the 3DO multimedia console. For the sake of trendiness, this version added voice acting and inserted filmed footage of actors into the conversation scenes, replacing the lovely hand-drawn portraits in the original game and doing it no new aesthetic favors in the process. In 1996, with the original still selling tolerably well, most of the old team got back together for a belated sequel — The Lost Files of Sherlock Holmes: Case of the Rose Tattoo — that no one would ever have dreamed they would be making a couple of years before.

But then, almost everything about the story of Lost Files is unlikely, from EA of all companies deciding to make it — or, perhaps better said, deciding to allow it to be made — to a bunch of first-time adventure developers managing to put everything together so much better than many established adventure-game specialists were doing at the time. And how incredibly lucky for everyone involved that such a Sherlock Holmes devotee as R.J. Berg should have been kicking around writing manuals for EA, just waiting for an opportunity like this one to show his chops. I’ve written about four Sherlock Holmes games now in the course of this long-running history of computer gaming — yet another measure of the character’s cultural ubiquity! — and this one nudges out Riddle of the Crown Jewels to become the best one yet. It just goes to show that, no matter how much one attempts to systematize the process, much of the art and craft of making games comes down to happy accidents.

(Sources: Compute! of April 1993 and June 1993; Computer Gaming World of February 1993; Questbusters of September 1988 and December 1992; Electronic Games of February 1993. Online sources include Elinor Mavor’s remembrances of the making of Lost Files of Sherlock Holmes, the comprehensive Game Nostalgia page on the game, the Stay Forever podcast episode devoted to the game, Joe Pranevich’s playthrough for The Adventure Gamer, the archived version of the old Mythos Software homepage, and Jason Scott’s “Infocom Cabinet” of vintage documents.

Feel free to download Lost Files of Sherlock Holmes from right here, in a format designed to be as easy as possible to get running under your platform’s version of DOSBox or using ScummVM.)


  1. There have been occasional questions about the extent to which Sherlock Holmes and his supporting cast truly are outside all bounds of copyright, usually predicated on the fact that the final dozen stories were published in the 1920s, the beginning of the modern copyright era, and thus remain protected. R.J. Berg remembers giving “two copies of the game and a really trivial amount of money” to Arthur Conan Doyle’s aged granddaughter, just to head off any trouble on that front. When a sequel to Lost Files of Sherlock Holmes was published in 1996, no permission whatsoever was sought or demanded. 

  2. Scott Mavor died of cancer in 2008. 

 
 


Whither the Software Artist? (or, How Trip Hawkins Learned to Stop Worrying and Love the Consoles)

One of the places we ran the “Can a computer make you cry?” [advertisement] was in Scientific American. Scientific American readers weren’t even playing videogames. Why the hell are you wasting any of this really expensive advertising? You’re competing with BMW for that ad.

— Trip Hawkins (EA Employee #1)

Consumers were looking for a brand signal for quality. They didn’t lionize the game makers as these creators to fawn over. They thought of the game makers almost as collaborators in their experience. So apotheosizing them didn’t make sense to the consumers.

— Bing Gordon (EA Employee #7)

In the ’80s that was an interesting experiment, that whole trying-to-make-them-into-rock-stars kind of thing. It was certainly a nice way to recruit top talent. But the reality is that computer programmers and artists and designers are not rock stars. It may have worked for the developers, but I don’t think it had any impact on consumers.

— Stewart Bonn (EA Employee #19)

One of the stories that gamers most love to tell each other is that of Electronic Arts’s fall from grace. If you’re sufficiently interested in gaming history to be reading this blog, you almost certainly know the story in the broad strokes: how Trip Hawkins founded EA in 1982 as a haven for “software artists” doing cutting-edge work; how he put said artists front and center in rock-star-like poses in a series of iconic advertisements, the most famous of which asked whether a computer could make you cry; how he wrote on the back of every stylish EA “album cover” not about EA as a company but as “a collection of electronic artists who share a common goal to fulfill the potential of personal computing”; and how all the idealism somehow dissipated to give us the EA of today, a shambling behemoth that crushes more clever competitors under its sheer weight as it churns out sequel after sequel, retread after retread. The exact point where EA became the personification of everything retrograde and corporate in gaming varies with the teller; perhaps the closest thing to a popular consensus is the rise of John Madden Football and EA Sports in the early 1990s, when the last vestiges of software artistry in the company’s advertisements were replaced by jocks shouting, “It’s in the game!” Regardless of the specifics, though, everyone agrees that It All Went Horribly Wrong at some point. The story of EA has become gamers’ version of a Biblical tragedy: “For what shall it profit a man, if he shall gain the whole world, and lose his own soul?”

Of course, as soon as one starts pulling out Bible quotes, it profits to ask whether one has gone too far. And, indeed, the story of EA is often over-dramatized and over-simplified. Questions of authenticity and creativity are always fraught; to imagine that anyone is really in the arts just for the art strikes me as hopelessly naive. The EA of the early 1980s wasn’t founded by artists but rather by businessmen, backed by venture capitalists with goals of their own that had little to do with “fulfilling the potential of personal computing.” Thus, when the software-artists angle turned out not to work so well, it didn’t take them long to pivot. This, then, is the history of that pivot, and how it led to the EA we know today.


Advertising is all about image making — about making others see you in the light in which you wish to be seen. Without realizing that they were doing anything of the sort, EA’s earliest marketers cemented an image into the historical imagination at the same time that they failed in their more practical task of crafting a message that resonated with the hoped-for customers of their own time. The very same early EA advertising campaign which speaks so eloquently to so many today actually missed the mark entirely in its own day, utterly failing to set the public imagination afire with this idea of programmers and game designers as rock stars. When Trip Hawkins sent Bill Budge — the programmer of his who most naturally resembled a rock star — on an autograph-signing tour of software stores and shopping malls, it didn’t lead to any outbreak of Budgomania. “Nobody would ever show up,” remembers Budge today, still wincing at the embarrassment of sitting behind a deserted autograph booth.

Nor were customers flocking into stores to buy the games EA’s rock stars had created. Sales remained far below initial projections during the eighteen months following EA’s official launch in June of 1983, and the company skated on the razor’s edge of bankruptcy on multiple occasions. While their first year yielded the substantial hits Pinball Construction Set, Archon, and One-on-One, 1984 could boast only one comparable success story, Seven Cities of Gold. Granted, four hits in two years was more than plenty of other publishers managed, but EA had been capitalized under the expectation that their games would open up whole new demographics for entertainment software. “The idea was to make games for 28-year-olds when everybody else was making games for 13-year-olds,” says Bing Gordon, Trip Hawkins’s old university roommate and right-hand man at EA. When those 28-year-olds failed to materialize, EA was left in the lurch.

For better or for worse, One-on-One is the spiritual forefather of the unstoppable EA Sports lineup of today.

The most important architect of EA’s post-launch retrenchment was arguably neither Trip Hawkins nor Bing Gordon, but rather Larry Probst, who left the free-falling Activision to join EA as vice president for sales in 1984. Probst, who had worked at the consumer-goods giants Johnson & Johnson and Clorox before joining Activision, had no particular attachment to the idea of software artists. He rather looked at the business of selling games much as he had that of selling toilet paper and bleach. He asked himself how EA could best make money in the market that existed rather than some fanciful new one they hoped to create. Steve Peterson, a product manager at EA, remembers that others “would still talk about how we were trying to create new forms of entertainment and break new boundaries.” But Probst, and increasingly Trip Hawkins as well, had the less high-minded goal of “going public and being a billion-dollar company.”

Probst had the key insight that distribution, more so than software artists or perhaps even product quality in the abstract, was the key to success in an industry that, following a major downturn in home computing in general in 1984, was only continuing to get more competitive. EA therefore spurned the existing distribution channels, which were nearly monopolized by SoftSel, the great behind-the-scenes power in the software industry to which everyone else was kowtowing; SoftSel’s head, Robert Leff, was the most important person in software that no one outside the industry had ever heard of. Instead of using SoftSel, EA set up their own distribution network piece by painful piece, beginning by cold-calling the individual stores and offering cut-rate deals in order to tempt them into risking the wrath of Leff and ordering from another source.

Then, once a reasonable distribution network was in place, EA leveraged the hell out of it by setting up a program of so-called “Affiliated Labels” — other publishers who would pay EA instead of a conventional distributor like SoftSel to get their products onto store shelves. It was a well-nigh revolutionary idea in game publishing, attractive to smaller publishers because EA was ready and able to help out with a whole range of the logistical difficulties they were always facing, from packaging and disk duplication to advertising campaigns. For EA, meanwhile, the Affiliated Labels yielded huge financial rewards and placed them in the driver’s seat of much of the industry, with the power of life and death over many of their smaller ostensible competitors.

Unsurprisingly, Activision, the only other publisher with comparable distributional clout, soon copied the idea, setting up a similar program of their own. But even as they did so, EA, seemingly always one step ahead, was becoming the first American publisher to send games — both their own and those of others — directly to Europe without going through a European intermediary like Britain’s U.S. Gold label.

There was always something a bit contrived, in that indelible Silicon Valley way, about how EA chose to present themselves to the world. Here we have Bing Gordon, head of technology Greg Riker, and producer Joe Ybarra indulging in some of the creative play which, an accompanying article is at pains to tell us, was constantly going on around the office.

Larry Probst’s strategy of distribution über alles worked a treat, yielding explosive growth that more than made up for the company’s early struggles. In 1986, EA became the biggest computer-game publisher in the United States and the world, with annual revenues of $30 million. Their own games were doing well, but were assuming a very different character from the “simple, hot, and deep” ideal of the launch — a phrase Trip Hawkins had once loved to apply to games that were less stereotypically nerdy than the norm, that he imagined would be suitable for busy young adults with a finger on the pulse of hip pop culture. Now, having failed to attract that new demographic, EA adjusted their product line to appeal to those who were already buying computer games. A case in point was The Bard’s Tale, EA’s biggest hit of 1985, a hardcore CRPG that might take a hundred hours or more to complete — fodder for 13-year-olds with long summer vacations to fill rather than 28-year-olds with jobs and busy social calendars.

If “simple, hot, and deep” and programmers as rock stars had been two of the three pillars of EA’s launch philosophy, the last was the one written into Hawkins’s original mission statement as “stay with floppy-disk-based computers only.” Said statement had been written, we should remember, just as the first great videogame fad, fueled by the Atari VCS, was passing its peak and beginning the long plunge into what would go down in history as the Great Videogame Crash of 1983. At the time, it certainly wasn’t only the new EA who believed that the toy-like videogame consoles were the past, and that more sophisticated personal computers, running more sophisticated games, were the future. “I think that computer games are fundamentally different from videogames,” said Hawkins on the Computer Chronicles television show. “It becomes a question of program size, when you want to know how good a program can I have, how much can I do with it, and how long will it take before I’m bored with it.” This third pillar of EA’s strategy would take a bit longer to fall than the others, but fall it would.

The origins of EA’s loss of faith in the home computer in general as the ultimate winner of the interactive-entertainment platform wars can ironically be traced to their decision to wholeheartedly endorse one computer in particular. In October of 1984, Greg Riker, EA’s director of technology, got the chance to evaluate a prototype of Commodore’s upcoming Amiga. His verdict upon witnessing this first truly multimedia personal computer, with its superlative graphics and sound, was that this was the machine that could change everything, and that EA simply had to get involved with it as quickly as possible. He convinced Trip Hawkins of his point of view, and Hawkins managed to secure Amiga Prototype Number 12 for the company within weeks. In the months that followed, EA worked to advance the Amiga with if anything even more enthusiasm than Commodore themselves: developing libraries and programming frameworks which they shared with their outside developers; writing tools internally, including what would become the Amiga’s killer app, Deluxe Paint; documenting the Interchange File Format, a set of standard specifications for sharing pictures, sounds, animations, and music across applications. All of these things and more would remain a part of the Amiga platform’s basic software ecosystem throughout its existence.
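The Interchange File Format that EA documented is remarkably simple at its core: a stream of chunks, each consisting of a four-byte ASCII identifier, a big-endian 32-bit payload length, and the payload itself, with odd-length payloads padded by one byte to keep everything word-aligned. A rough sketch of a chunk walker follows; note that real IFF files nest these chunks inside a FORM (or LIST/CAT) container, which this simplified version ignores:

```python
import struct

def iff_chunks(data):
    """Yield (chunk_id, payload) pairs from a flat IFF-style stream.
    Each chunk is a 4-byte ASCII ID, a big-endian 32-bit length,
    the payload, and a pad byte if the length is odd."""
    pos = 0
    while pos + 8 <= len(data):
        cid = data[pos:pos + 4].decode('ascii')
        (size,) = struct.unpack('>I', data[pos + 4:pos + 8])
        yield cid, data[pos + 8:pos + 8 + size]
        pos += 8 + size + (size & 1)  # skip pad byte to stay word-aligned
```

That self-describing chunk structure is what let one set of specifications cover pictures (ILBM), sounds (8SVX), and more, and why applications could skip chunks they didn’t understand — the property that made IFF so durable across the Amiga’s software ecosystem.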

When the Amiga finally started shipping late in 1985, EA actually made a far better public case for the machine than Commodore, taking out a splashy editorial-style advertisement just inside the cover of the premiere issue of the new AmigaWorld magazine. It showed the eight Amiga games EA would soon release and explained “why Electronic Arts is committed to the Amiga,” the latter headline appearing above a photograph of Trip Hawkins with his arm proprietorially draped over the Amiga on his desk.

Trip Hawkins with an Amiga

But it all turned into an immense disappointment. Initially, Commodore priced the Amiga wrong and marketed it worse, and even after they corrected some of their worst mistakes it perpetually under-performed in the American marketplace. For Hawkins and EA, the whole episode planted the first seeds of doubt as to whether home computers — which at the end of the day still were computers, requiring a degree of knowledge to operate and associated in the minds of most people more with work than pleasure — could really be the future of interactive entertainment as a mass-media enterprise. If a computer as magnificent as the Amiga couldn’t conquer the world, what would it take?

Perhaps it would take a piece of true consumer electronics, made by a company used to selling televisions and stereos to customers who expected to be able to just turn the things on and enjoy them — a company like, say, Philips, who were working on a new multimedia set-top box for the living room that they called CD-I. The name arose from the fact that it used the magical new technology of CD-ROM for storage, something EA had been begging Commodore to bring to the Amiga to no avail. EA embraced CD-I with the same enthusiasm they had recently shown for the Amiga, placing Greg Riker in personal charge of creating tools and techniques for programming it, working more as partners in CD-I’s development with Philips than as a mere third-party publisher.

Once again, however, it all came to nought. CD-I turned into one of the most notorious slow-motion fiascos in the history of the games industry, missing its originally planned release date in the fall of 1987 and then remaining vaporware for years on end. In early 1989, EA finally ran out of patience, mothballing all work on the platform unless and until it became a viable product; Greg Riker left the company to go work for Microsoft on their own CD-ROM research.

CD-I had cost EA a lot of money to no tangible result whatsoever, but it does reveal that the idea of gaming on something other than a conventional computer was no longer anathema to them. In fact, the year in which EA gave up on CD-I would prove the most pivotal of their entire history. We should therefore pause here to examine their position in 1989 in a bit more detail.

Despite the frustrating failure of the Amiga and CD-I to open a new golden age of interactive entertainment, EA wasn’t doing badly at all. Following years of steady growth, annual revenue had now reached $63 million, up 27 percent from 1988. EA was actively distributing about 100 titles under their own imprint, and 250 more under the imprint of the various Affiliated Labels, who had become absolutely key to their business model, accounting for some 45 percent of their total revenues. About 80 percent of their revenues still came from the United States, with 15 percent coming from Europe — where EA had set up a semi-independent subsidiary, the Langley, England-based EA Europe, in 1987 — and the remainder from the rest of the world. The company was extremely diversified. They were producing software for ten different computing platforms worldwide, had released 40 separate titles that had earned them at least $1 million each, and had no single title that accounted for more than 6 percent of their total revenues.

What we have here, then, is a very healthy business indeed, with multiple revenue streams and cash in the bank. The games they released were sometimes good, sometimes bad, sometimes mediocre; EA’s quality standards weren’t notably better or worse than the rest of their industry. “We tried to create a brand that fell somewhere between Honda and Mercedes,” admits Bing Gordon, “but a lot of the time we shipped Chevy.” Truth be told, even in the earliest days the rhetoric surrounding EA’s software artists had been a little overblown; many of the games their rock stars came up with were far less innovative than the advertising that accompanied them. The genius of Larry Probst had been to explicitly recognize that success or failure as a games publisher had as much to do with other factors as it did with the actual games you released.

For all their success, though, no one at EA was feeling particularly satisfied with their position. On the contrary: 1989 would go down in EA’s history as the year of “crisis.” As successful as they had become selling home-computer software, they remained big fish in a rather small pond, a situation out of keeping with the sense of overweening ambition that had been a part of the company’s DNA since its founding. In 1989, about 4 million computers were being used to play games on a regular or semi-regular basis in American homes, enough to fuel a computer-game industry worth an estimated $230 million per year. EA alone owned more than 25 percent of that market, more than any competitor. But there was another, related market in which they had no presence at all: that of the videogame consoles, which had returned from the dead to haunt them even as they were consolidating their position as the biggest force in computer games. The country was in the grip of Nintendo mania. About 22 million Nintendo Entertainment Systems were already in American homes — a figure accounting for 24 percent of all American households — and cartridge-based videogames were selling to the tune of $1.6 billion per year.

Unlike many of their peers, EA hadn’t yet suffered all that badly under the Nintendo onslaught, largely because they had already diversified away from the Commodore 64, the low-end 8-bit computer which had been the largest gaming platform in the world just a couple of years before, and which the NES was now in the process of annihilating. But still, the future of the computer-games industry in general felt suddenly in doubt in a way that it hadn’t since at least the great home-computer downturn of 1984. A sizable coalition inside EA, including Larry Probst and most of the board of directors, pushed Trip Hawkins hard to get EA’s games onto the consoles. Fearing a coup, he finally came around. “We had to go into the [console-based] videogame business, and that meant the world of mass-market,” Hawkins remembers. “There were millions of customers we were going to reach.”

But through which door should they make their entrance? Accustomed to running roughshod over his Affiliated Labels, Hawkins wasn’t excited about the prospect of entering Nintendo’s walled garden, where the shoe would be on the other foot, thanks to that company’s infamously draconian rules for its licensees. Nintendo’s standard contract demanded that they receive the first $12 from every game a licensee sold, required every game to go through an exhaustive review process before publication, and placed strict limits on how many games a licensee was allowed to publish per year and how many units they were allowed to manufacture of each one. For EA, accustomed to being the baddest hombre in the Wild West that was the computer-game marketplace, this was well-nigh intolerable. Bing Gordon insists even today that, thanks to all of the fees and restrictions, no one other than Nintendo was doing much more than breaking even on the NES during this, the period that would go down in history as the platform’s golden age.

So, EA decided instead to back a dark horse: the much more modern Sega Genesis, which hadn’t even been released yet in North America. It was built around the same 16-bit Motorola 68000 CPU found in computers like the Commodore Amiga and Apple Macintosh, with audiovisual capabilities not all that far removed from the likes of the Amiga. The Genesis would give designers and programmers who were used to the affordances of full-fledged computers a far less limiting platform than the NES to work with, and it offered the opportunity to get in on the ground floor of a brand-new market, as opposed to the saturated NES platform. The only problem was that Sega’s licensing fees were comparable to those of Nintendo, even though they could only offer their licensees access to a much more uncertain pool of customers.

Determined to play hardball, Hawkins had a team of engineers reverse-engineer the Genesis, sufficient to let them write games for it with or without Sega’s official development kit. Then he met with Sega again, telling them that, if they refused to adjust their licensing terms, he would release games on the console without their blessing, forcing them to initiate an ugly court battle of the sort that was currently raging between Nintendo and Atari if they wished to bring him to heel. That, he was gambling, was expense and publicity of a sort which Sega simply couldn’t afford. And Sega evidently agreed with his assessment; they accepted a royalty rate half that being demanded by Nintendo. By this roundabout method, EA became the first major American publisher to support the new console, and from that point forward the two companies became, as Hawkins puts it, “good partners.”

EA initially invested $2.5 million in ten games for the Genesis, some of them original to the console, some ports of their more popular computer games. They started shipping the first of them in June of 1990, ten months after the Genesis itself had first gone on sale in the United States. This first slate of EA Genesis titles arrived in a marketplace that was still starving for quality games, just as Hawkins had envisioned it would be. Among them was the game destined to become the face of the new, mass-market-oriented EA: John Madden Football, a more action-oriented re-imagining of a 1988 computer game of the same name.

John Madden Football debuted as a rather cerebral, tactics-heavy computer game in 1988, just another in an EA tradition of famous-athlete-endorsed sports games stretching back to 1983’s (Dr. J and Larry Bird Go) One-on-One. No one in 1988 could have imagined what it would come to mean in the years to come for either its publisher or its spokesman/mascot, both of whom would ride it to iconic heights in American pop culture.

The Sega Genesis marked the third time EA had taken a leap of faith on a new platform. It was the first time, however, that their faith paid off. About 25 percent of the games EA sold in 1990 were for the Genesis. And when the console really started to take off in 1991, fueled not least by their own games, EA was there to reap the rewards. In that year, four of the ten best-selling Genesis games were published by EA. At the peak of their dominance, EA alone was publishing about 35 percent of all the games sold for the Genesis. Absent the boost their games gave it early on, it’s highly questionable whether the Genesis would have succeeded at all in the United States.

In the beginning, few of EA’s outside developers had been terribly excited about writing for the consoles. One of them remembers Hawkins “reading us the riot act” just to get them onboard. Indeed, Hawkins claims today that about 15 percent of EA’s internal employees were so unhappy with the new direction that they quit. Certainly his latest rhetoric could hardly have been more different from that of 1983:

I knew we had to let go of our attachment to machines that the public did not want to buy, and support the hardware that the public would embrace. I made this argument on the grounds of delivering customer satisfaction, and how quality is in the eye of the beholder. If the customer buys a Genesis, we want to give him the best we can for the machine he bought and not resent the consumer for not buying a $1000 computer.

By this point, Hawkins had finally bitten the bullet and done a deal with Nintendo, who, in the face of multiple government investigations and lawsuits over their business practices, were becoming somewhat more generous with both their competitors and licensees. When games like Skate or Die, a port of a Commodore 64 hit that just happened to be perfect for the Nintendo and Sega demographics as well, started to sell in serious numbers on the consoles, Hawkins’s developers’ aversion started to fade in the face of all that filthy lucre. Soon the developers of Skate or Die were happily plunging into a sequel which would be a console exclusive.

Even the much-dreaded oversight role played by Nintendo, in which they reviewed every game before allowing it to be published, proved less onerous than expected. When Will Harvey, the designer of an action-adventure called The Immortal, finally steeled himself to look at Nintendo’s critique thereof, he was happily surprised to find the list of “suggestions” to be very helpful on the whole, demonstrating real sensitivity to the effect he was trying to achieve. Even Bing Gordon, who had been highly skeptical of getting into bed with Nintendo, had to admit in the end that “the rating system is fair. On a scale from zero to a hundred, where zero meant the system was totally manipulated for Nintendo’s self-interest and a hundred meant that it was absolutely democratic, they’d probably get a ninety. I’ve seen a little bit of self-interest, but this is America, the land of self-interest.”

Although EA cut their Nintendo teeth on the NES, it was on the long-awaited follow-up console, 1991’s Super Nintendo, that they really began to thrive. That machine boasted capabilities similar to those of the Sega Genesis, meaning EA already had games ready to port over, along with developers with considerable expertise in writing for a more advanced species of console. Just in time for the Christmas of 1991, EA released a new version of John Madden Football — John Madden Football ’92 — simultaneously on the Super Nintendo and the Genesis. The sequel had been created, according to the recollections of several EA executives, against the advice of market researchers and retailers: “All you’re going to do is obsolete our old game.” But Trip Hawkins remembered how much, as a kid, he had loved the Strat-O-Matic Football board game, for which a new set of player and team cards was issued every year just before the beginning of football season, ensuring that you could always recreate in the board game the very same season you were watching every Sunday on television. So, he ignored the objections of the researchers and the retailers, and John Madden Football ’92 became an enormous hit, by far the biggest EA had yet enjoyed on any platform — thus inaugurating, for better or for worse, the tradition of annual versions of gaming’s most evergreen franchise. Like clockwork, we’ve gotten a new Madden every single year since, a span of time that numbers a quarter-century and change as of this writing.

All of this had a transformative effect on EA’s bottom line, bringing on their biggest growth spurt yet. Revenues increased from $78 million in 1990 to $113 million in 1991; then they jumped to $175 million in 1992, accompanied by a two-for-one stock split that was necessary to keep the share price, which had been at $10 just a few years before, from exceeding $50. In that year, six of the fifteen most popular console games, across all platforms, were published by EA. Their Sega Genesis games alone generated $77 million, 18 percent more than the entirety of the company’s product portfolio had managed in 1989. This was also the first year that EA’s console games in the aggregate outsold their offerings for computers. They were leaving no doubt now as to where their primary loyalty lay: “The 16-bit consoles are far better for games than PCs. The Genesis is a very sophisticated machine…” The disparity between the two sides of the company’s business would only continue to get more pronounced, as EA’s sales jumped by an extraordinary 70 percent — to $298 million — in 1993, a spurt fueled entirely by console-game sales.

But, despite all their success on the consoles, EA — and especially their founder, Trip Hawkins — continued to chafe under the restrictions of the walled-garden model of software distribution. Accordingly, Hawkins put together a group inside EA to research the potential for a CD-ROM-based multimedia set-top box of their own, one that would be used for more than just playing games — sort of a CD-I done right. “The Japanese videogame companies,” he said, “are too shortsighted to see where this is going.” In contrast to their walled gardens, his box would be as open as possible. Rather than a single new hardware product, it would be a set of hardware specifications and an operating system which manufacturers could license, which would hopefully result in a situation similar to the MS-DOS marketplace, where lots of companies competed and innovated within the bounds of an established standard. The marketplace for games and other applications on the new machine would be far less restricted than the console norm, with a more laissez-faire attitude to content and a royalty fee of just $3 per unit sold.

In 1991, EA spun off the venture under the name of 3DO. Hawkins turned most of his day-to-day responsibilities at EA over to Larry Probst in order to take personal charge of his new baby, which took tangible form for the first time with the release of the Panasonic “Real 3DO Player” in late 1993. It and other implementations of the 3DO technology managed to sell 500,000 units worldwide — 200,000 of them in North America — by January of 1995. Yet those numbers were still a pittance next to those of the dedicated game consoles, and the story of 3DO became one of constant flirtations with success that never quite led to that elusive breakthrough moment. As 3DO struggled, Hawkins’s relations with his old company worsened. He believed they had gone back on promises to support his new venture wholeheartedly; “I didn’t feel like I was leaving EA, but it turned out that way,” he says today with lingering bitterness. The long, frustrating saga of 3DO wouldn’t finally straggle to a bankruptcy until 2003.

EA, meanwhile, was flying ever higher absent their founder. Under Larry Probst — always the most hard-nosed and sober-minded of the executive staff, the person most laser-focused on the actual business of selling videogames — EA cemented their reputation as the conservative, risk-averse giant of their industry. This new EA was seemingly the polar opposite of the company that had once asked with almost painful earnestness if a computer could make you cry. And yet, paradoxically, it was a place still inhabited by a surprising number of the people who had come up with that message. Most prominent among them was Bing Gordon, who notes cryptically today only that “people’s ideals get tested in the face of love or money.” Part of the problem — assuming one judges EA’s current less-than-boldly-innovative lineup of franchises to be a problem — may be a simple buildup of creative cruft that has resulted from being in business for so long. Every franchise that debuts in inspiration and innovation, then goes on to join John Madden Football on the list of EA perennials, sucks away some of the bandwidth that might otherwise have been devoted to the next big innovator.

In the summer of 1987, when EA was still straddling the line between their old personality and their new, Trip Hawkins wrote the following lines in their official newsletter — lines which evince the keenly felt tension between art and commerce that has become the defining aspect of EA’s corporate history for so many in the years since:

Unfortunately, simply being creative doesn’t always mean you’ll be wildly successful. Van Gogh sold only one painting during his lifetime. Lots of people would still rather go see Porky’s Revenge IV, ignoring well-produced movies like Amadeus or Chariots of Fire. As a result, film producers take fewer risks, and we get less variety, and pretty soon the Porky’s and Rambo clones are all you can find on a Friday night. Software developers have the same problem. (To this day, all of us M.U.L.E. fans wonder why the entire world hasn’t fallen in love with our favorite game.)

The only way to solve the problem is to do it together. On our end, we’ll keep innovating, researching, experimenting with new ways to use this new medium; on your end, you can support our efforts by taking an occasional risk, by buying something new and different… maybe Robot Rascals, or Make Your Own Murder Party.

You may be very pleasantly surprised — and you’ll help our software artists live to innovate another day.

Did EA go the direction they did because of gamers’ collective failure to support their most innovative, experimental work? Does it even matter if so? The more pragmatic among us might note that the EA of today is delivering games that millions upon millions of people clearly want to play, and where’s the harm in that?

Still, as we look upon this industry that has so steadfastly refused to grow up in so many ways, there remain always those pictures of EA’s first generation of software artists — pictures that, yes, are a little pretentious and a lot contrived, but that nevertheless beckon us to pursue higher ideals. They’ve taken on an identity of their own now, quite apart from the history of the company that once splashed them across the pages of glossy lifestyle magazines. Long may they continue to inspire.

(Sources: the book Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay and Game Over: How Nintendo Conquered the World by David Sheff; Harvard Business School’s case study “Electronic Arts in 1995”; ACE of April 1990; Amazing Computing of July 1992; Computer Gaming World of March 1988, October 1988, and June 1989; MicroTimes of April 1986; The One of November 1988; Electronic Arts’s newsletter Farther from Summer 1987; AmigaWorld premiere issue; materials relating to the Software Publishers Association included in the Brøderbund archive at the Strong Museum of Play; the episode of the Computer Chronicles television series entitled “Computer Games.” Online sources include “We See Farther — A History of Electronic Arts” at Gamasutra, “How Electronic Arts Lost Its Soul” at Polygon, and Funding Universe‘s history of Electronic Arts.)

 
 


Peter Molyneux’s Kingdom in a Box

Peter Molyneux, circa 1990.

I have this idea of a living world, which I have never achieved. It’s based upon this picture in my head, and I can see what it’s like to play that game. Every time I do it, then it maybe gets closer to that ideal. But it’s an ambitious thing.

— Peter Molyneux

One day as a young boy, Peter Molyneux stumbled upon an ant hill. He promptly did what young boys do in such situations: he poked it with a stick, watching the inhabitants scramble around as destruction rained down from above. But then, Molyneux did something that set him apart from most young boys. Feeling curious and maybe a little guilty, he gave the ants some sugar for energy and watched quietly as they methodically undid the damage to their home. Just like that, he woke up to the idea of little living worlds with lots of little living inhabitants — and to the idea of himself, the outsider, being able to affect the lives of those inhabitants. The blueprint had been laid for one of the most prominent and influential careers in the history of game design. “I have always found this an interesting mechanic, the idea that you influence the game as opposed to controlling the game,” he would say years later. “Also, the idea that the game can continue without you.” When Molyneux finally grew bored and walked away from the ant hill on that summer day in his childhood, it presumably did just that, the acts of God that had nearly destroyed it quickly forgotten. Earth — and ants — abide.

Peter Molyneux was born in the Surrey town of Guildford (also hometown of, read into it what you will, Ford Prefect) in 1959, the son of an oil-company executive and a toy-shop proprietor. To hear him tell it, he was qualified for a career in computer programming largely by virtue of being so hopeless at everything else. Being dyslexic, he found reading and writing extremely difficult, a handicap that played havoc with his marks at Bearwood College, the boarding school in the English county of Berkshire to which his family sent him for most of his teenage years. Meanwhile his less than imposing physique boded ill for a career in the military or manual labor. Thankfully, near the end of his time at Bearwood the mathematics department acquired a Commodore PET,  while the student union almost simultaneously installed a Space Invaders machine. Seeing a correspondence between these two pieces of technology that eluded his fellow students, Molyneux set about trying to program his own Space Invaders on the PET, using crude character glyphs to represent the graphics that the PET, being a text-only machine, couldn’t actually draw. No matter. A programmer had been born.

These events, followed shortly by Molyneux’s departure from Bearwood to face the daunting prospect of the adult world, were happening at the tail end of the 1970s. Like so many of the people I’ve profiled on this blog, Molyneux was thus fortunate enough to be born not only into a place and circumstances that would permit a career in games, but at seemingly the perfect instant to get in on the ground floor as well. But, surprisingly for a fellow who would come to wear his huge passion for the medium on his sleeve — often almost as much to the detriment as to the benefit of his games and his professional life — Molyneux took a meandering path, lasting fully another decade, to rise to prominence in the field. Or, to put it less kindly: he failed, repeatedly and comprehensively, at every venture he tried for most of the 1980s before he finally found the one that clicked.

Perhaps inspired by his mother’s toy shop, his original dream was to be not so much a game designer as a computer entrepreneur. After earning a degree in computer science from Southampton University, he found himself a job working days as a systems analyst for a big company. By night, he formed a very small company called Vulcan in his hometown of Guildford to implement a novel scheme for selling blank disks. He wrote several simple programs: a music creator, some mathematics drills, a business simulator, a spelling quiz. (The last, having been created by a dyslexic and terrible speller in general, was a bit of a disaster.) For every ten disks you bought for £10, you would get one of the programs for free along with your blank disks. After placing his tiny advertisement in a single magazine, Molyneux was so confident of the results that he told his local post office to prepare for a deluge of mail, and bought a bigger mailbox for his house to hold it all. He got five orders in the first ten days, less than fifty in the scheme’s total lifespan — along with about fifty more inquiries from people who had no interest in the blank disks but just wanted to buy his software.

Taking their interest to heart, Molyneux embarked on Scheme #2. He improved the music creator and the business simulator and tried to sell them as products in their own right. Even years later he would remain proud of the latter in particular — his first original game, which he named Entrepreneur: “I really put loads of features into it. You ran a business and you could produce anything you liked. You had to do things like keep the manufacturing line going, set the price for your product, decide what advertising you wanted, and these random events would happen.” With contests all the rage in British games at the time, he offered £100 to the first person to make £1 million in Entrepreneur. The prize went unclaimed; the game sold exactly two copies despite being released near the zenith of the early-1980s British mania for home computers. “Everybody around me was making an absolute fortune,” Molyneux remembers. “You had to be a complete imbecile in those days not to make a fortune. Yet here I was with Entrepreneur and Composer, making nothing.” He wasn’t, it appeared, very good at playing his own game of entrepreneurship; his own £1 million remained far out of reach. Nevertheless, he moved on to the next scheme.

Scheme #3 was to crack the business and personal-productivity markets via a new venture called Taurus, initiated by Molyneux and his friend Les Edgar, who were later joined by one Kevin Donkin. Molyneux having studied accounting at one time in preparation for a possible career in the field (“the figures would look so messy that no one would ever employ me”), it was decided that Taurus would initially specialize in financial software with exciting names like Taurus Accounts, Taurus Invoicing, and Taurus Stock Control. Those products, like all the others Molyneux had created, went nowhere. But now came a bizarre story of mistaken identity that… well, it wouldn’t make Molyneux a prominent game designer just yet, but it would move him further down the road to that destination.

Commodore was about to launch the Amiga in Britain, and, this being early on when they still saw it as potential competition for the IBMs of the world, was looking to convince makers of productivity software to write for the machine.  They called up insignificant little Taurus of all people to request a meeting to discuss porting the “new software” the latter had in the works to the Amiga. Molyneux and Edgar assumed Commodore must have somehow gotten wind of a database program they were working on. In a state of no small excitement, they showed up at Commodore UK’s headquarters on the big day and met a representative. Molyneux:

He kept talking about “the product,” and I thought they were talking about the database. At the end of the meeting, they say, “We’re really looking forward to getting your network running on the Amiga.” And it suddenly dawned on me that this guy didn’t know who we were. Now, we were called Taurus, as in the star sign. He thought we were Torus, a company that produced networking systems. I suddenly had this crisis of conscience. I thought, “If this guy finds out, there go my free computers down the drain.” So I just shook his hand and ran out of that office.

An appropriately businesslike advertisement for Taurus’s database manager gives no hint of what actually lies in the company’s future…

By the time Commodore figured out they had made a terrible mistake, Taurus had already been signed as official Amiga developers and given five free Amigas. They parlayed those things into a two-year career as makers of somewhat higher-profile but still less than financially successful productivity software for the Amiga. After the database, which they named Acquisition and declared “the most complete database system conceived on any microcomputer” — Peter Molyneux’s habit of over-promising, which gamers would come to know all too well, was already in evidence — they started on a computer-aided-design package called X-CAD Designer. Selling in the United States for the optimistic prices of $300 and $500 respectively, both programs got lukewarm reviews; they were judged powerful but kind of incomprehensible to actually use. But even had the reviews been better, high-priced productivity software was always going to be a hard sell on the Amiga. There were just three places to really make money in Amiga software: in personal-creativity software like paint programs, in video-production tools, and, most of all, in games. In spite of all of Commodore’s earnest efforts to the contrary, the Amiga had by now become known first and foremost as the world’s greatest gaming computer.

The inspiration for the name of Bullfrog Software.

Molyneux and his colleagues therefore began to wind down their efforts in productivity software in favor of a new identity. They renamed their company Bullfrog after a ceramic figurine they had lying around in the “squalor” of what Molyneux describes as their “absolutely shite” office in a Guildford pensioner’s attic. Under the new name, they planned to specialize in games — Scheme #4 for Peter Molyneux. “We had a simple choice of hitting our head against a brick wall with business software,” he remembers, “or doing what I really wanted to do with my life anyway, which was write games.” Having made the choice to turn Bullfrog into a game developer, they then released as their first product not a game but a simple drum sequencer for the Amiga called A-Drum. Hobgoblins and little minds and all the rest. When A-Drum duly flopped, they finally got around to games.

A friend of Molyneux’s had written a budget-priced action-adventure for the Commodore 64 called Druid II: Enlightenment, and was looking for someone to do an Amiga conversion. Bullfrog jumped at the chance, even though Molyneux, who would always persist in describing himself as a “rubbish” programmer, had very little idea how to program an action game. When asked by Enlightenment‘s publisher Firebird whether he could do the game in one frame — i.e., whether he could update everything onscreen within a single pass of the electron gun painting the screen to maintain the impression of smooth, fluid movement — an overeager Molyneux replied, “Are you kidding me? I can do it in ten frames!” It wasn’t quite the answer Firebird was looking for. But in spite of it all, Bullfrog somehow got the job, producing what Molyneux describes as a “technically rather poor” port of what had been a rather middling game in the first place. (Molyneux’s technique for getting everything drawn in one frame was to simply keep shrinking the size of the display until even his inefficient routines could do the job.) And then, as usual for everything Molyneux touched, it flopped. But Bullfrog did get two important things out of the project: they learned much about game programming, and they recruited as artist for the project one Glenn Corpes, who was not only a talented pixel pusher but also a talented programmer and fount of ideas almost the equal of Molyneux.

Despite the promising addition of Corpes, the first original game conjured up by the slowly expanding Bullfrog fared little better than Enlightenment. Corpes and Kevin Donkin turned out a very of-its-time top-down shoot-em-up called Fusion, which Electronic Arts agreed to release. Dismissed as “a mixture of old ideas presented in a very unexciting manner” by reviewers, Fusion was even less impressive technically than had been the Enlightenment port, being plagued by clashing colors and jittery scrolling — not at all the sort of thing to impress the notoriously audiovisually-obsessed Amiga market. Thus Fusion flopped as well, keeping Molyneux’s long record of futility intact. But then, unexpectedly from this group who’d shown so little sign of ever rising above mediocrity, came genius.

To describe Populous as a stroke of genius would be a misnomer. It was rather a game that grew slowly into its genius over a considerable period of time, a game that Molyneux himself considers more an exercise in evolution than conscious design. “It wasn’t an idea that suddenly went ‘Bang!'” he says. “It was an idea that grew and grew.” And its genesis had as much to do with Glenn Corpes as it did with Peter Molyneux.

Every Populous world is built out of combinations of just 56 blocks.

It all began when Corpes started showing off a routine he had written which let him build isometric landscapes out of three-dimensional blocks, like a virtual Lego set. You could move the viewpoint about the landscape, raising and lowering the land by left-clicking to add new blocks, right-clicking to remove them. Molyneux was immediately sure there was a game in there somewhere. His childhood memory of the ant farm leaping to mind, he said, “Let’s have a thousand people running around on it.”

Populous thus began with those little people in lieu of ants, wandering independently over Corpes’s isometric landscapes in real time. When they found a patch they liked, they would settle down, building little huts. Since, this being a computer game, the player would obviously need something to do as well, Molyneux started adding ways for you, as a sort of God on high, to influence the people’s behavior in indirect ways. He added something he called a “Papal Magnet,” a huge ankh you could place in the world to draw your people toward a given spot. But there would come a problem if the way to the ankh happened to be blocked by, say, a lake. Molyneux claims he added Populous‘s most basic mechanic, the thing you spend by far the most time doing when playing the game, as a response to his “incompetence” as a coder and resulting inability to write a proper path-finding algorithm: when your people get stuck somewhere, you can, subject to your mana reserves — even gods have limits — raise or lower the land to help them out. With that innovation, Populous from the player’s perspective became largely an exercise in terraforming, creating smooth, even landscapes on which your people can build their huts, villages, and eventually castles. As your people become fruitful and multiply, their prayers fuel your mana reserves.
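The raise-and-lower mechanic is easy to picture as operations on a heightmap of blocks. The following Python sketch is purely illustrative — it is not Bullfrog’s actual code, and the class and method names are my own invention — but it captures the essential idea: raising one point must also raise any neighbor that would be left more than one block lower, so that slopes stay walkable, and every act of God draws on a finite mana pool.

```python
# Illustrative sketch (not Bullfrog's code) of Populous-style terraforming.
# Raising a point propagates outward so no slope ever exceeds one block.

class Terrain:
    def __init__(self, size, mana):
        self.size = size
        self.h = [[0] * size for _ in range(size)]  # block heights
        self.mana = mana

    def neighbors(self, x, y):
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nx, ny = x + dx, y + dy
                if (dx or dy) and 0 <= nx < self.size and 0 <= ny < self.size:
                    yield nx, ny

    def raise_land(self, x, y, cost=1):
        if self.mana < cost:
            return False  # even gods have limits
        self.mana -= cost
        self.h[y][x] += 1
        # Propagate: no neighbor may be more than one block lower.
        stack = [(x, y)]
        while stack:
            cx, cy = stack.pop()
            for nx, ny in self.neighbors(cx, cy):
                if self.h[cy][cx] - self.h[ny][nx] > 1:
                    self.h[ny][nx] = self.h[cy][cx] - 1
                    stack.append((nx, ny))
        return True

t = Terrain(size=5, mana=10)
for _ in range(3):
    t.raise_land(2, 2)
# The center rises to 3; the surrounding rings settle at 2 and 1,
# forming the smooth pyramid of land the little people can build on.
```

Lowering land works the same way in reverse; the player’s real game is spending mana to flatten this terrain into buildable plains.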

Next, Molyneux added warfare to the picture. Now you would be erecting mountains and lakes to protect your people from their enemies, who start out walking about independently on the other side of the world. The ultimate goal of the game, of course, is to use your people to wipe out your enemy’s people before they do the same to you; this is a very Old Testament sort of religious experience. To aid in that goal, Molyneux gradually added lots of other godly powers to your arsenal, more impressive than the mere raising and lowering of land if also far more expensive in terms of precious mana: flash floods, earthquakes, volcanic eruptions, etc. You know, all your standard acts of God, as found in the Bible and insurance claims.

Lego Populous. Bullfrog had so much fun with this implementation of the idea that they seriously discussed trying to turn it into a commercial board game.

Parts of Populous were prototyped on the tabletop. Bullfrog used Lego bricks to represent the landscapes, a handy way of implementing the raising-and-lowering mechanic in a physical space. They went so far as to discuss a license with Lego, only to be told that Lego didn’t support “violent games.” Molyneux admits that the board game, while playable, was very different from the computerized Populous, playing out as a slow-moving, chess-like exercise in strategy. The computer Populous, by contrast, can get as frantic as any action game, especially in the final phase when all the early- and mid-game maneuvering and feinting comes down to the inevitable final genocidal struggle between Good and Evil.

Bullfrog. From left: Glenn Corpes (artist and programmer), Shaun Cooper (artist and tester), Peter Molyneux (designer and programmer), Kevin Donkin (designer and programmer), Les Edgar (office manager), Andy Jones (artist and tester).

Ultimately far more important to the finished product than Bullfrog’s Lego Populous were the countless matches Molyneux played on the computer against Glenn Corpes. Apart from all of its other innovations in helping to invent the god-game and real-time-strategy genres, Populous was also a pioneering effort in online gaming. Multi-player games — the only way to play Populous for many months — took place between two people seated at two separate Amigas, connected together via modem or, if together in the same room as Molyneux and Corpes were, via a cable. Vanishingly few other designers were working in this space at the time, for understandable reasons: even leaving aside the fact that the majority of computer owners didn’t own modems, running a multi-player game in real-time over a connection as slow as 1200 baud was hardly a programming challenge for the faint-hearted. The fact that it works at all in Populous rather puts the lie to Molyneux’s self-deprecating description of himself as a “rubbish” coder.
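The standard way to make real-time play feasible over a link as slow as 1200 baud — and plausibly the approach any game of this era had to take — is deterministic lockstep: each machine runs an identical simulation and exchanges only the players’ inputs, a few bytes per tick, never the world state itself. This hypothetical Python sketch (the simulation rule is invented for illustration) shows why it works:

```python
# Hypothetical lockstep sketch: two deterministic simulations stay in sync
# by exchanging only player inputs each tick -- a few bytes, well within a
# 1200-baud budget -- rather than the whole world state.

class Simulation:
    def __init__(self, seed):
        self.state = seed

    def tick(self, input_a, input_b):
        # Any deterministic update rule works, so long as both machines
        # compute exactly the same thing from the same inputs.
        self.state = (self.state * 31 + input_a * 7 + input_b * 13) % 65521

# Two machines start from the same seed...
local, remote = Simulation(42), Simulation(42)

inputs_a = [1, 0, 2, 2, 1]   # player A's commands, sent over the wire
inputs_b = [0, 3, 1, 0, 2]   # player B's commands, sent over the wire

# ...and feed both input streams into both simulations each tick.
for a, b in zip(inputs_a, inputs_b):
    local.tick(a, b)
    remote.tick(a, b)

assert local.state == remote.state  # the two worlds remain identical
```

The catch, of course, is that any non-determinism — an uninitialized variable, a divergent random-number generator — silently desynchronizes the two worlds, which is part of what makes this kind of networking so unforgiving to write.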

You draw your people toward different parts of the map by placing the Papal Magnet. The first one to touch it becomes the leader. There are very few words in the game, which only made it that much easier for Electronic Arts to localize and popularize across Europe. Everything is instead done using the initially incomprehensible suite of icons you see near the bottom of the screen. Populous does become intuitive in time, but it’s not without a learning curve.

Development of Populous fell into a comfortable pattern. Molyneux and Corpes would play together for several hours every evening, then nip off to the pub to talk about their experiences. Next day, they’d tweak the game, then they’d go at it again. It’s here that we come to the beating heart of Molyneux’s description of Populous as a game evolved rather than designed. Almost everything in the finished game beyond the basic concept was added in response to Molyneux and Corpes’s daily wars. For instance, Molyneux initially added knights, super-powered individuals who can rampage through enemy territory and cause a great deal of havoc in a very short period of time, to prevent their games from devolving into endless stalemates. “A game could get to the point where both players had massive populations,” he says, “and there was just no way to win.” With knights, the stronger player “could go and massacre the other side and end the game at a stroke.”

A constant theme of all the tweaking was to make a more viscerally exciting game that played more quickly. For commercial as well as artistic reasons — Amiga owners weren’t particularly noted for their patience with slow-paced, cerebral games — this was considered a priority. Over the course of development, the length of the typical game Molyneux played with Corpes shrank from several hours to well under one.

Give them time, and your people will turn their primitive huts into castles.

Even tweaked to play quickly and violently, Populous was quite a departure from the tried-and-true Amiga fare of shoot-em-ups, platformers, and action-adventures. The unenviable task of trying to sell the thing to a publisher was given to Les Edgar. After visiting about a dozen publishers, he convinced Electronic Arts to take a chance on it. Bullfrog promised EA a finished Populous in time for Christmas 1988. By the time that deadline arrived, however, it was still an online multiplayer-only game, a prospect EA knew to be commercially untenable. Molyneux and his colleagues thus spent the next few months creating Populous‘s single-player “Conquest Mode.”

In addition to the green and pleasant land of the early levels, there are also worlds of snow and ice, desert worlds, and even worlds of fire and lava to conquer.

Perilously close to being an afterthought to the multi-player experience though it was, Conquest Mode would be the side of the game that the vast majority of its eventual players would come to know best if not exclusively. Rather than design a bunch of scenarios by hand, Bullfrog wrote an algorithm to procedurally generate 500 different “worlds” for play against a computer opponent whose artificial intelligence also had to be created from scratch during this period. This method of content creation, used most famously by Ian Bell and David Braben in Elite, was something of a specialty and signpost of British game designers, who, plagued by hardware limitations far more stringent than their counterparts in the United States, often used it as a way to minimize the space their games consumed in memory and on disk. Most recently, Geoff Crammond’s hit game The Sentinel, published by Firebird, had used a similar scheme. Glenn Corpes believes it may have been an EA executive named Joss Ellis who first suggested it to Bullfrog.

Populous‘s implementation is fairly typical of the form. Each of the 500 worlds except the first is protected by a password that is, like everything else, itself procedurally generated. When you win at a given level, you’re given the password to a higher, harder level; whether and how many levels you get to skip is determined by how resounding a victory you’ve just managed. It’s a clever scheme, packing a hell of a lot of potential gameplay onto a single floppy disk and even making an effort to avoid boring the good player — and all without forcing Bullfrog to deal with the complications of actually storing any state whatsoever onto disk.
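The idea can be illustrated with a toy sketch. This is purely an assumption for illustration, not Bullfrog's actual algorithm: if each level's password is derived deterministically from the level number, the game can both hand out and verify passwords without ever writing state to disk.

```python
import random
import string

# Toy illustration of stateless level passwords (NOT Bullfrog's actual
# algorithm): derive each password deterministically from the level
# number, so nothing ever needs to be saved to disk.
def level_password(level, length=8):
    rng = random.Random(level)  # seeded by level => same password every run
    return "".join(rng.choice(string.ascii_uppercase) for _ in range(length))

def password_to_level(password, max_level=500):
    # The game can recover the level by comparing the typed password
    # against each level's generated one.
    for level in range(1, max_level + 1):
        if level_password(level) == password:
            return level
    return None
```

Because the passwords are regenerated on demand from the level number, a pile of 500 levels costs nothing beyond the generator itself.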

It inevitably all comes down to a frantic final free-for-all between your people and those of your enemy.

Given their previous failures, Bullfrog understandably wasn’t the most confident group when a well-known British games journalist named Bob Wade, who had already played a pre-release version of the game, came by for a visit. For hours, Molyneux remained too insecure to actually ask Wade the all-important question of what he thought of the game. At last, after Wade had joined the gang for “God knows how many” pints at their local, Molyneux worked up the courage to pop the question. Wade replied that it was the best game he’d ever played, and he couldn’t wait to get back to it — prompting Molyneux to think he must have made some sort of mistake, and that under no circumstances should he be allowed to play another minute of it in case his opinion should change. It was Wade and the magazine he was writing for at the time, ACE (Advanced Computer Entertainment), who coined the term “god game” in the glowing review that followed, the first trickle of a deluge of praise from the gaming press in Britain and, soon enough, much of the world.

Bullfrog’s first royalty check for Populous was for a modest £13,000. Their next was for £250,000, prompting a naive Les Edgar to call Electronic Arts about it, sure it was a mistake. It was no mistake; Populous alone reportedly accounted for one-third of EA’s revenue during its first year on the market. That Bullfrog wasn’t getting even bigger checks was a sign only of the extremely unfavorable deal they’d signed with EA from their position of weakness. Populous finally and definitively ended the now 30-year-old Peter Molyneux’s long run of obscurity and failure at everything he attempted. In his words, he went overnight from “urinating in the sink” and “owing more money than I could ever imagine paying back” to “an incredible life” in games. Port after port came out for the next couple of years, each of them becoming a bestseller on its platform. Populous was selected to become one of the launch titles for the Super Nintendo console in Japan, spawning a full-blown fad there that came to encompass comic books, tee-shirts, collectibles, and even a symphony concert. When they visited Japan for the first time on a promotional tour, Molyneux and Les Edgar were treated like… well, appropriately enough, like gods. Populous sold 3 million copies in all according to some reports, an almost inconceivable figure for a game during this period.

Amidst all its other achievements, Populous was also something of a pioneer in the realm of e-sports. The One magazine and Electronic Arts hosted a tournament to find the best Populous player in Britain.

While a relatively small percentage of Populous players played online, those who did became pioneers of sorts in their own right. Some bulletin-board systems set up matchmaking services to pair up players looking for a game, any time, day or night; the resulting connections sometimes spanned national borders or even oceans. The matchmakers were aided greatly by Bullfrog’s forward-thinking decision to make all versions of Populous compatible with one another in terms of online play. In making it so quick and easy to find an online opponent, these services prefigured the modern world of Internet-enabled online gaming. Molyneux pronounced them “pretty amazing,” and at the time they really were. In 1992, he spoke excitedly of a recent trip to Japan, where he’d seen a town “with 10,000 homes all linked together. You can play games with anybody in the place. It’s enormous, really enormous, and it’s growing.” If only he’d known what online gaming would grow into in the next decade or two…

A youngster named Andrew Reader wound up winning the tournament, only to get trounced in an exhibition match by the master, Peter Molyneux himself. There was talk of televising a follow-up tournament on Sky TV, but it doesn’t appear to have happened.

The original Amiga version of Populous had been released all but simultaneously with the Amiga version of SimCity. Press and public alike immediately linked the two games together; AmigaWorld magazine, for instance, went so far as to review them jointly in a single article. Both Will Wright of SimCity fame and Peter Molyneux were repeatedly asked in interviews whether they’d played the other’s game. Wright was polite but, one senses, a little uninterested in Populous, saying he “liked the idea of playing God and having a population follow you,” but “sort of wish they’d gone for a slightly more educational angle.” Molyneux was much more enthusiastic about his American counterpart’s work, repeatedly floating a scheme to somehow link the two games together in more literal fashion for online play. He claimed at one point that Maxis (developers of SimCity) and his own Bullfrog had agreed on a liaison “to go backwards and forwards” between their two companies to work on linking their games. The liaison, he claimed, had “the Populous landscape moving to and from SimCity,” and a finished product would be out sometime in 1992. Like quite a number of the more unbelievable schemes Molyneux has floated over the years, it never happened.

The idea of a linkage between SimCity and Populous, whether taking place online or in the minds of press and public, can seem on the face of it an exceedingly strange one today. How would the online linkage actually work anyway? Would the little Medieval warriors from Populous suddenly start attacking SimCity‘s peaceful modern utopias? Or would Wright’s Sims plop themselves down in the middle of Molyneux’s apocalyptic battles and start building stadiums and power plants? These were very different games: Wright’s a noncompetitive, peaceful exercise in urban planning with strong overtones of edutainment; Molyneux’s a zero-sum game of genocidal warfare that aspired to nothing beyond entertainment. Knowing as we do today the future paths of these two designers — i.e., ever further in the directions laid down by these their first significant works — only heightens the seeming dichotomy.

That said, there actually were and are good reasons to think of SimCity and Populous as two sides of the same coin. For us today, the list includes first of all the reasons of simple historical concordance. Each marks the coming-out party of one of the most important game designers of all time, occurring within bare weeks of one another.

But of course the long-term importance of these two designers to their field wasn’t yet evident in 1989; obviously players were responding to something else in associating their games with one another. Once you stripped away their very different surface trappings and personalities, the very similar set of innovations at the heart of each was laid bare. AmigaWorld said it very well in that joint review: “The real joy of these programs is the interlocking relationships. Sure, you’re a creator, but even more a facilitator, influencer, and stage-setter for little computer people who act on your wishes in their own time and fashion.” It’s no coincidence that, just as Peter Molyneux was partly inspired by an ant hill to create Populous, one of Will Wright’s projects of the near future would be the virtual ant farm SimAnt. In creating the first two god games, the two were indeed implementing a very similar core idea, albeit each in his own very different way.

Joel Billings had founded SSI, the king of American strategy games, back in 1979 with the explicit goal of making computerized versions of the board games he loved. SimCity and Populous can be seen as the point when computer strategy games transcended that traditional approach. The real-time nature of these games makes them impossible to conceive of as anything other than computer-based works, while their emergent complexity makes them objects of endless fascination for their designers as much as or even more than for their players.

In winning so many awards and entrancing so many players for so long, SimCity and Populous undoubtedly benefited hugely from their sheer novelty. Their flaws stand out more clearly today. With its low-resolution graphics and without the aid of modern niceties like tool tips and graphical overlays, SimCity struggles to find ways to communicate vital information about what your city is really doing and why, making the game into something of an unsatisfying black box unless and until you devote a lot of time and effort to understanding what affects what. Populous has many of the same interface frustrations, along with other problems that feel still more fundamental and intractable, especially if you, like the vast majority of players back in its day, experience it through its single-player Conquest Mode. Clever as they are, the procedurally generated levels combined with the fairly rudimentary artificial intelligence of your computer opponent introduce a lot of infelicities. Eventually you begin to realize that one level is pretty much the same as any other; you just need to execute the same set of strategies and tactics more efficiently to have success at the higher levels.

Both Will Wright and Peter Molyneux are firm adherents to the experimental, boundary-pushing school of game design — an approach that yields innovative games but not necessarily holistically good games every time out. And indeed, throughout their long careers each of them has produced at least as many misses as hits, even if we dismiss the complaints of curmudgeons like me and lump SimCity and Populous into the category of the hits. Both designers have often fallen into the trap, if trap it be, of making games that are more interesting for creators and commentators than they are fun for actual players. And certainly both have, like all of us, their own blind spots: in relying so heavily on scientific literature to inform his games, Wright has often produced end results with something of the feel of a textbook, while Molyneux has often lacked the discipline and gravitas to fully deliver on his most grandiose schemes.

But you know what? It really doesn’t matter. We need our innovative experimentalists to blaze new trails, just as we need our more sober, holistically-minded designers to exploit the terrain they discover. SimCity and Populous would be followed by decades of games that built on the possibilities they revealed — many of which I’d frankly prefer to play today than these two original ground-breakers. But, again, that reality doesn’t mean we should celebrate SimCity and Populous one iota less, for both resoundingly pass the test of historical significance. The world of gaming would be a much poorer place without Will Wright and Peter Molyneux and their first living worlds inside a box.

(Sources: The Official Strategy Guide for Populous and Populous II by Laurence Scotford; Master Populous: Blueprints for World Power by Clayton Walnum; Amazing Computing of October 1989; Next Generation of November 1998; PC Review of July 1992; The One of April 1989, September 1989, and May 1991; Retro Gamer 44; AmigaWorld of December 1987, June 1989, and November 1989; The Games Machine of November 1988; ACE of April 1989; the bonus content to the film From Bedrooms to Billions. Archived online sources include features on Peter Molyneux and Bullfrog for Wired Online, GameSpot, and Edge Online. Finally, Molyneux’s postmortem on Populous at the 2011 Game Developers Conference.

Populous is available for purchase from GOG.com.)

 


Wasteland

We can date the formal beginning of the Wasteland project to the day in December of 1985 when Brian Fargo, head of Interplay, flew out to Arizona with his employee Alan Pavlish to meet with Michael Stackpole. If all went well at the meeting, Pavlish was to join Stackpole and Ken St. Andre as the third member of the core trio who would guide the game to release. His role, however, would be very different from that of his two colleagues.

A hotshot programmer’s programmer, Pavlish, though barely twenty years old, had been kicking around the industry for several years already. Before Interplay existed, he’d done freelance work on Commodore VIC-20 games for their earlier incarnation as Boone Corporation, and done ports of games like Murder on the Zinderneuf to the Apple II and Commodore 64 for another little company called Designer Software. When Pavlish came to work for Interplay full-time, Fargo had first assigned him to similar work: he had ported the non-Interplay game Hacker to the Apple II for Activision. (In those pre-Bard’s Tale days, Fargo was still forced to accept such unglamorous work to make ends meet.) But Fargo had huge respect for Pavlish’s abilities. When the Wasteland idea started to take off while his usual go-to programming ace Bill Heineman was still swamped with the Bard’s Tale games and Interplay’s line of illustrated text adventures, Fargo didn’t hesitate to throw Pavlish in at the deep end: he planned to make him responsible for bringing the huge idea that was Wasteland to life on the little 64 K 8-bit Apple II and Commodore 64.

However, when Fargo and Pavlish got out of their airplane that day it was far from certain that there would be a Wasteland project for Pavlish to work on at all. In contrast to St. Andre, Stackpole was decidedly skeptical, and for very understandable reasons. His experiences with computer-game development to date hadn’t been happy ones. Over the past several years, he’d been recruited to three different projects and put considerable work into each, only to see each come to naught in one way or another. Thanks largely to the influence of Paul Jaquays, another tabletop veteran who headed Coleco’s videogame-design group during the first half of the 1980s, he’d worked on two games for the Coleco Adam, a would-be challenger in the home-computer wars. The more intriguing of the two, a Tunnels & Trolls adaptation, got cancelled before release. The other, an adaptation of the film 2010: Odyssey Two, was released only after the Adam had flopped miserably and been written off by Coleco; you can imagine how well that game sold. He’d then accepted a commission from science-fiction author cum game developer Fred Saberhagen to design a computer game that took place in the world of the latter’s Book of Swords trilogy. (Stackpole had already worked with Flying Buffalo on a board game set in the world of Saberhagen’s Berserker series.) The computerized Book of Swords had gone into stasis when it became clear that Berserker Works, the development company Saberhagen had founded, just didn’t have the resources to finish it.

So, yes, Stackpole needed some convincing to jump into the breach again with tiny Interplay, a company he’d never heard of. Luckily for Interplay, he, Fargo, and Pavlish all got along like a house on fire on that December day. Fargo and Pavlish persuaded Stackpole that they shared — or at least were willing to accommodate — his own emerging vision for Wasteland, for a computer game that would be a game and a world first, a program second. Stackpole:

Programmers design beautiful programs, programs that work easily and simply; game designers design games that are fun to play. If a programmer has to make a choice between an elegant program and a fun game element, you’ll have an elegant program. You need a game designer there to say, “Forget how elegant the program is — we want this to make sense, we want it to be fun.”

I was at a symposium where there were about a dozen people. When asked to tell what we were doing, what I kept hearing over and over from programmer/game designers was something like “I’ve got this neat routine for packing graphics, so I’m going to do a fantasy role-playing game where I can use this routine.” Or a routine for something else, or “I’ve got a neat disk sort,” or this or that. And all of them were putting these into fantasy role-playing games. Not to denigrate their skills as programmers — but that’s sort of like saying, “Gee, I know something about petrochemicals, therefore I’m going to design a car that will run my gasoline.” Well, if you’re not a mechanical engineer, you don’t design cars. You can be the greatest chemist in the world, but you’ve got no business designing a car. I’d like to hope that Wasteland establishes that if you want a game, get game designers to work with programmers.

This vision, cutting as it does so much against the way that games were commonly made in the mid-1980s, would have much to do with both where the eventual finished Wasteland succeeds and where it falls down.

Ditto the game’s tabletop heritage. As had been Fargo’s plan from the beginning, Wasteland‘s rules would be a fairly faithful translation of Stackpole’s Mercenaries, Spies, and Private Eyes tabletop RPG, which was in turn built on the foundation of Ken St. Andre’s Tunnels & Trolls. A clear evolutionary line thus stretched from the work that St. Andre did back in 1975 to Wasteland more than a decade later. No CRPG to date had tried quite as earnestly as Wasteland would to bring the full tabletop experience to the computer.

You explore the world of Wasteland from a top-down perspective instead of the first-person view of The Bard’s Tale. Note that this screenshot and the ones that follow come from the slightly later (and vastly more pleasant to play) MS-DOS port rather than the 8-bit original.

Early in the new year, Stackpole and St. Andre visited Interplay’s California offices for a week to get the process of making Wasteland rolling. St. Andre arrived with a plot already dreamed up. Drawing heavily from the recent ultra-violent action flick Red Dawn, it posited a world where mutually-assured destruction hadn’t proved so mutual after all: the Soviet Union had won the war, and was now occupying the United States. The player would control a group of American freedom fighters skulking around the farmlands of Iowa, trying to build a resistance network. St. Andre and Stackpole spent a month or more after their visit to California drawing maps of cornfields and trying to find ways to make an awful lot of farmers seem different from one another. (Some of this work can be seen in the Agricultural Center in the finished Wasteland.) But finally the pair had to accept the painful truth: the game they were designing was boring. “I said it will be the dullest game you ever saw,” remembers St. Andre, “because the Russians would be there in strength, and your characters start weak and can’t do anything but skulk and hide and slowly, slowly build up.”

St. Andre suggested moving the setting to the desert of the American Southwest, an area with which he, being born and raised in Arizona, was all too familiar. The region also had a certain thematic resonance, being intimately connected with the history of the atomic bomb. The player’s party might even visit Las Vegas, where folks had once sat on their balconies and watched the mushroom clouds bloom. St. Andre suggested nixing the Soviets as well, replacing them with “ravening monsters stalking through a radioactive wasteland, a few tattered humans struggling to survive against an overwhelming threat.” It meant chucking a fair amount of work, but Fargo agreed that it sounded too good to pass up. They might as well all get used to these sorts of false starts. Little would go smoothly or according to plan on this project.

After that first week at Interplay, St. Andre and Stackpole worked from home strictly in a design role, coming up with the plans for the game that were then left to Pavlish in California to implement in code — still an unusual way of working in the mid-1980s, when even many of the great designers, like Dan Bunten4 and Sid Meier, tended to also be great programmers. But St. Andre and Stackpole used their computers — a Commodore 64 in the case of the former, a battered old Osborne luggable in that of the latter — to do nothing more complex than run a word processor. Bundle after bundle of paper was shipped from Arizona to California, in the form of both computer printouts and reams of hand-drawn maps. St. Andre and Stackpole worked, in other words, largely the same way they would have had Wasteland been planned as a new tabletop adventure module.

Wasteland, however, would have to be one hell of a big adventure module. It soon became clear that the map-design process, entailing as it did the plotting of every single square with detailed descriptions of what it contained and what the party should be able to do there, was overwhelming the two. St. Andre:

I hadn’t thought a great deal about what was going to be in any of these places. I just had this nebulous story in my mind: our heroes will start in A, they’ll visit every worthwhile place on the map and eventually wind up in Z — and if they’re good enough, they’ll win the game. Certain things will be happening in different locations — monsters of different types, people who are hard to get along with, lots of comic references to life before the war. I figured that when the time came for me to design an area, the Indian Village, for example, I would sit down and figure out what would be in it and that would be it. Except that it started taking a long time. Every map had 1024 squares on it, and each one could do something. Even if I just drew all the buildings, I had to go back and say, “These are all square nine: wall, wall, wall, wall, wall. And if you bump into a wall you’ll get this message: ‘The Indians are laughing at you for walking into a wall.'” Whatever — a map that I thought I could toss off in one or two days was taking two weeks, and the project was falling further and further behind.

Fargo agreed to let St. Andre and Stackpole bring in their old Flying Buffalo buddies Liz Danforth and Dan Carver to do maps as well, and the design team just continued to grow from there. “The guys who were helping code the maps, correcting what we sent in, wanted to do some maps,” remembers Stackpole. “Everyone wanted to have his own map, his own thumbprint on the game.”

Even Fargo himself, who could never quite resist the urge to get his own hands dirty with the creations of this company he was supposed to be running from on high, begged for a map. “I want to do a map. Let me have Needles,” St. Andre remembers him saying. “So I said, ‘You’re the boss, Brian, you’ve got Needles.'” But eventually Fargo had to accept that he simply didn’t have the time to design a game and run a company, and the city of Needles fell to another Interplay employee named Bruce Balfour. In all, the Wasteland manual credits no fewer than eight people other than St. Andre and Stackpole with “scenario design.” Even Pavlish, in between trying to turn this deluge of paper into code, managed to make a map or two of his own.

Wasteland is one of the few computer games in history in which those who worked on the softer arts of writing and design outnumbered those who wrote the code and drew the pictures. The ratio isn’t even close: the Wasteland team included exactly one programmer (Pavlish) and one artist (Todd J. Camasta) to go with ten people who only contributed to the writing and design. One overlooked figure in the design process, who goes wholly uncredited in the game’s manual, was Joe Ybarra, Interplay’s liaison with their publisher Electronic Arts. As he did with so many other classic games, Ybarra offered tactful advice and generally did his gentle best to keep the game on course, even going so far as to fly out to Arizona to meet personally with St. Andre and Stackpole.

Those two found themselves spending as much time coordinating their small army of map designers as they did doing maps of their own. Stackpole:

Work fell into a normal pattern. Alan and I would work details out, I’d pass it down the line to the folks designing maps. If they had problems, they’d tell me, Alan and I would discuss things, and they’d get an answer. In this way the practical problems of scenario design directly influenced the game system and vice versa. Map designers even talked amongst themselves, sharing strategies and some of these became standard routines we all later used.

Stackpole wound up taking personal responsibility for the last third or so of the maps, where the open world begins funneling down toward the climax. St. Andre:

I’m fairly strong at making up stories, but not at inventing intricate puzzles. In the last analysis, I’m a hack-and-slash gamer with only a little thought and strategy thrown in. Interplay and Electronic Arts wanted lots of puzzles in the game. Mike, on the other hand, is much more devious, so I gave him the maps with difficult puzzles and I did the ones that involved walking around, talking to people, and shooting things.

The relationship between these two veteran tabletop designers and Pavlish, the man responsible for actually implementing all of their schemes, wasn’t always smooth. “We’d write up a map with all the things on it and then Alan would say, ‘I can’t do that,'” says St. Andre. There would then follow some fraught discussions, doubtless made still more fraught by amateur programmer St. Andre’s habit of declaring that he could easily implement what was being asked in BASIC on his Commodore 64. (Stackpole: “It’s like a duffer coming up to Arnold Palmer at an average golf course and saying, ‘What do you mean you can’t make that 20-foot putt? I can make a 20-foot putt on a miniature golf course.'”) One extended battle was over the question of grenades and other “area-effect” weapons: St. Andre and Stackpole wanted them, Pavlish said they were just too difficult to code and unnecessary anyway. Unsung hero Joe Ybarra solved that one by quietly lobbying Fargo to make sure they went in.

One aspect of Wasteland that really demonstrates St. Andre and Stackpole’s determination to divorce the design from the technology is the general absence of the usual numbers that programmers favor — i.e., the powers of two that fit so neatly into the limited memories of the Apple II and Commodore 64. Pavlish instinctively wanted to make the two types of pistols capable of holding 16 or 32 bullets. But St. Andre and Stackpole insisted that they hold 7 or 18, just like their real-world inspirations. As demonstrated by the 1024-square maps, the two did occasionally let Pavlish get away with the numbers he favored, but they mostly stuck to their guns (ha!). “It’s going to be inelegant in terms of space,” admits Stackpole, “but that’s reality.”

Logic like this drove Pavlish crazy, striving as he was to stuff an unprecedentedly complex world into an absurdly tiny space. Small wonder that there were occasional blowups. Slowly he learned to give every idea that came from the designers his very best try, and the designers learned to accept that not everything was possible. With that tacit agreement in place, the relationship improved. In the latter stages of the project, St. Andre and Stackpole came to understand the technology well enough to start providing their design specifications in code rather than text. “Then we could put in the multiple saving throws, the skill and attribute checks,” says St. Andre. “Everything we do in a [Tunnels & Trolls] solitaire dungeon suddenly pops up in the last few maps we did for Wasteland because Mike and I were doing the actual coding.”

When not working on the maps, St. Andre and Stackpole — especially the latter, who came more and more to the fore as time went on — were working on the paragraph book that would contain much of Wasteland‘s story and flavor text. The paragraph book wasn’t so much a new idea as a revival of a very old one. Back in 1979, Jon Freeman’s Temple of Apshai, one of the first CRPGs to arrive on microcomputers, had included a booklet of “room descriptions” laid out much like a Dungeons & Dragons adventure module. This approach was necessitated by the almost unbelievably constrained system for which Temple of Apshai was written: a Radio Shack TRS-80 with just 16 K of memory and cassette-based storage. Moving into the late 1980s, the twilight years of the 8-bit CRPG, designers were finding the likes of the Apple II and Commodore 64 as restrictive as Freeman had the TRS-80 for the simple reason that, while the former platforms may have had four times as much memory as the latter, CRPG design ambitions had grown by at least the same multiple. Moving text, a hugely expensive commodity in terms of 8-bit storage, back into an accompanying booklet was a natural remedy. Think of it as one final measure to wring just a little bit more out of the Apple II and Commodore 64, those two stalwart old warhorses that had already survived far longer than anyone had ever expected. And it didn’t hurt, of course, that a paragraph book made for great copy protection.

While the existence of a Wasteland paragraph book in itself doesn’t make the game unique, St. Andre and Stackpole were almost uniquely prepared to use theirs well, for both had lots of experience crafting Tunnels & Trolls solo adventures. They knew how to construct an interactive story out of little snippets of static text as well as just about anyone, and how to scramble it in such a way as to stymie the cheater who just starts reading straight through. Stackpole, following a tradition that began at Flying Buffalo, constructed for the booklet one of the more elaborate red herrings in gaming history, a whole alternate plot easily as convoluted as that in the game proper involving, of all things, a Martian invasion. All told, the Wasteland paragraph book would appear to have easily as many fake entries as real ones.

For combat, the display shifts back to something very reminiscent of The Bard’s Tale, with the added tactical dimension of a map showing everyone’s location that you can access by tapping the space bar. And yes, you fight some strange foes in Wasteland.

Wasteland‘s screen layout often resembles that of The Bard’s Tale, and one suspects that there has to be at least a little of the same code hidden under its hood. In the end, though, the resemblance is largely superficial. There’s just no comparison in terms of sophistication. While it’s not quite a game I can love — I’ll try to explain why momentarily — Wasteland does unquestionably represent the bleeding edge of CRPG design as of its 1988 release date. CRPGs on the Apple II and Commodore 64 in particular wouldn’t ever get more sophisticated than this. Given the constraints of those platforms, it’s honestly hard to imagine how they could.

Key to Wasteland‘s unprecedented sophistication is its menu of skills. Just like in Mercenaries, Spies, and Private Eyes, you can tailor each of the up to four characters in your party as you will, free from the restrictive class archetypes of Dungeons & Dragons (or for that matter Tunnels & Trolls). Skills range from the obviously useful (Clip Pistol, Pick Lock, Medic) to the downright esoteric (Metallurgy, Bureaucracy, Sleight of Hand). And of course career librarian St. Andre made sure that a Librarian skill was included, and of course made it vital to winning the game.

Also as in Mercenaries, Spies, and Private Eyes, a character’s chance of succeeding at just about anything is determined by adding her level in a relevant skill, if any, to a relevant core attribute. For example, to determine a character’s chance of climbing something using her Climb skill the game will also look to her Agility. The system allows a range of solutions to most of the problems you encounter. Say you come to a locked door. You might have a character with the Pick Lock skill try getting in that way. Failing that, a character with the Demolition skill and a little handy plastic explosives could try blasting her way in. Or a strong character might dispense with skills altogether and just try to bash the door down using her Strength attribute. Although a leveling mechanism does exist that lets you assign points to characters’ skills and attributes, skills also improve naturally with use, a mechanism not seen in any previous CRPG other than Dungeon Master (a game that’s otherwise about as different from Wasteland as a game can be and still be called a CRPG).
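Mechanically, that check is easy to picture. The little Python sketch below is a hypothetical illustration only: the function names, the die size, and the skill multiplier are all inventions for the example, not Wasteland’s actual formula.

```python
import random

def skill_check(attribute: int, skill_level: int = 0,
                difficulty: int = 0) -> bool:
    """A Mercenaries-style test: attribute plus a bonus for the
    relevant skill, if any, against a die roll. (Die size and
    multiplier are assumptions, not the game's real numbers.)"""
    target = attribute + skill_level * 5 - difficulty
    return random.randint(1, 20) <= target

def open_door(character: dict) -> bool:
    """A locked door admits several solutions, each keyed to a
    different skill or, failing that, a raw attribute."""
    skills = character["skills"]
    if "Pick Lock" in skills:
        return skill_check(character["Agility"], skills["Pick Lock"])
    if "Demolition" in skills:
        return skill_check(character["IQ"], skills["Demolition"])
    # No applicable skill: just try to bash the door down.
    return skill_check(character["Strength"], difficulty=5)
```

A strong but unskilled character thus still has a way through the door, which is exactly the quality that keeps the game from ever leaving you irretrievably stuck.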

The skills system makes Wasteland a very different gameplay experience from Ultima V, its only real rival in terms of 8-bit CRPG sophistication at the time of its release. For all its impressive world-building, Ultima V remains bound to Richard Garriott’s standard breadcrumb-trail philosophy of design; beating it depends on ferreting out a long string of clues telling you exactly where to go and exactly what to do. Wasteland, by contrast, can be beaten many ways. If you can’t find the password the guard wants to let you past that locked gate, you can try any number of entirely different approaches: shooting your way in, blowing the gate open, picking the lock on the back door and sneaking in. It’s perhaps the first CRPG ever that’s really willing to let you develop your own playing personality. You can approach it as essentially a post-apocalyptic Bard’s Tale, making a frontal assault on every map and trying to blow away every living creature you find there, without concerning yourself overmuch about whether it be good or evil, friend or foe. Or you can play it — relatively speaking — cerebrally, trying to use negotiations, stealth, and perhaps a little swindling to get what you need. Or you can be like most players and do a bit of both, as the mood and opportunity strikes you. It’s very difficult if not impossible to get yourself irretrievably stuck in Wasteland. There are always options, always possibilities. While it’s far less thematically ambitious than Ultima V — unlike the Ultima games, Wasteland was never intended to be anything more or less than pure escapist entertainment — Wasteland‘s more flexible, player-friendly design pointed the way forward while Ultima V was still glancing back.

Indeed, a big part of the enduring appeal of Wasteland to those who love it is the sheer number of different ways to play it. Interplay picked up on this early, and built an unusual feature into the game: it’s possible to reset the entire world to its beginning state while keeping the same group of lovingly developed characters. Characters can advance to ridiculous heights if you do this enough, taking on some equally ridiculous “ranks”: “1st Class Fargo,” “Photon Stud,” etc., culminating in the ultimate achievement of the level 183 “Supreme Jerk.” This feature lets veteran players challenge themselves by, say, trying to complete the game with just one character, and gives an out to anyone who screws up her initial character creation too badly and finds herself overmatched; she can just start over and replay the easy bits with the same party, hopefully gaining enough experience to correct her earlier mistakes. It takes some of the edge off one of the game’s most obvious design flaws: it’s all but impossible to know which skills are actually useful until you’ve made your way fairly deep into the game.

The very fact that re-playing Wasteland requires you to reset its world at all points to what a huge advance it represents over the likes of The Bard’s Tale. The first CRPG I know of that has a truly, comprehensively persistent world, one in which the state of absolutely everything is saved, is 1986’s Starflight (a game that admittedly is arguably not even a CRPG at all). But that game runs on a “big” machine in 1980s terms, an IBM PC or clone with at least 256 K of memory. Wasteland does it in 64 K, rewriting every single map on the fly as you play to reflect what you’ve done there. Level half of the town of Needles with explosives early in the game, and it will still be leveled when you return many days later. Contrast this with The Bard’s Tale, which remembers nothing but the state of your characters when you exit one of its dungeon levels, a forgetfulness that lets you fight the same big boss battles over and over again if you like. This persistence allows you, the player, to really affect the world of Wasteland in big-picture ways that were well-nigh unheard-of at the time of its release, as Brian Fargo notes:

Wasteland let you do anything you wanted in any order you wanted, and you could get ripple effects that might happen one minute later or thirty minutes later, a lot like [the much later] Grand Theft Auto series. The Ultima games were open, but things tended to be very compartmentalized, they didn’t ripple out like in Wasteland.
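In grossly simplified modern terms, that on-the-fly map rewriting amounts to a write-back scheme over the game’s disk files. The sketch below is purely illustrative; the file names, layout, and JSON encoding are inventions for the example and bear no resemblance to the real 8-bit disk format.

```python
import json
from pathlib import Path

MAP_DIR = Path("saved_maps")  # hypothetical save location

def load_map(name: str, pristine: dict) -> dict:
    """Return the saved copy of a map if the player has already
    changed it, otherwise a fresh copy of the pristine original."""
    saved = MAP_DIR / f"{name}.json"
    if saved.exists():
        return json.loads(saved.read_text())
    return dict(pristine)

def save_map(name: str, state: dict) -> None:
    """Write the map back to disk when the player leaves it, so a
    leveled town stays leveled on a return visit."""
    MAP_DIR.mkdir(exist_ok=True)
    (MAP_DIR / f"{name}.json").write_text(json.dumps(state))
```

Resetting the world while keeping your characters then amounts to deleting the saved map copies while leaving the character file alone.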

Wasteland is a stunning piece of programming, a resounding justification for all of the faith Fargo placed in the young Alan Pavlish. Immersed in the design rather than the technical end of things as they were — which is itself a tribute to Pavlish, whose own work allowed them to be — St. Andre and Stackpole may still not fully appreciate how amazing it is that Wasteland does what it does on the hardware it does it on.

All of which rather raises the question of why I don’t enjoy actually playing Wasteland a little more than I do. I do want to be careful here in trying to separate what feel like more objective faults from my personal issues with the game. In the interest of fairness and full disclosure, let me put the latter right out there first.

Put simply, the writing of Wasteland just isn’t to my taste. I get the tone that St. Andre and Stackpole are trying to achieve: one of over-the-top comic ultra-violence, like such contemporary teenage-boy cinematic favorites as the Evil Dead films. And they do a pretty good job of hitting that mark. Your characters don’t just hit their enemies in Wasteland, they “brutalize” them. When they die, enemies “explode like a blood sausage,” are “reduced to a thin red paste,” are “spun into a dance of death,” or are “reduced to ground round.” And then there’s some of the imagery, like the blood-splattered doctor in the infirmary.

The personal appeal you find in those quotes and that image, some of the most beloved among Wasteland‘s loyal fandom, says much about whether you’ll enjoy Wasteland as a whole. In his video review of the game, Matt Barton says that “you will be disgusted or find it hilarious.” Well, I must say that my own feelings rather contradict that dichotomy. I can’t quite manage to feel disgusted or outraged at this kind of stuff, especially since, in blessed contrast to so many later games, it’s almost all described rather than illustrated. I do, however, find the entire aesthetic unfunny and boring, whether it’s found in Wasteland or Duke Nukem. In general, I just don’t find humor that’s based on transgression rather than wit to be all that humorous.

I am me, you are you, and mileages certainly vary. Still, even if we take it on its own terms it seems to me that there are other problems with the writing. As CRPG Addict Chester Bolingbroke has noted, Wasteland can’t be much bothered with consistency or coherency. The nuclear apocalypse that led to the situation your characters find themselves in is described as having taken place in 1998, only ten years on from the date of Wasteland‘s release. Yet when the writers find it convenient they litter the game with absurdly advanced technology, from human clones to telepathic mind links. And the tone of the writing veers about as well, perhaps as a result of the sheer number of designers who contributed to the game. Most of the time Wasteland is content with the comic ultra-violence of The Evil Dead, but occasionally it suddenly reaches toward a jarring epic profundity it hasn’t earned. The main storyline, which doesn’t kick in in earnest until about halfway through the game, is so silly and nonsensical that few of even the most hardcore Wasteland fans remember much about it, no matter how many times they’ve played through it.

Wasteland‘s ropey plotting may be ironic in light of Stackpole’s later career as a novelist, but it isn’t a fatal flaw in itself. Games are not the sum of their stories; many a great game has a poor or nonexistent story to tell. To whatever extent it’s a triumph, Wasteland must be a triumph of game design rather than writing, one last hurrah for Michael Stackpole the designer before Michael Stackpole the novelist took over. The story, like the stories in many or most allegedly story-driven games, is just an excuse to explore Wasteland‘s possibility space.

And that possibility space is a very impressive one, for reasons I’ve tried to explain already. Yet it’s also undone, at least a bit, by some practical implementation issues. St. Andre and Stackpole’s determination to make an elegant game design rather than an elegant program comes back to bite them here. The things going on behind the scenes in Wasteland are often kind of miraculous in the context of their time, but those things are hidden behind a clunky and inelegant interface. In my book, a truly great game should feel almost effortless to control, but Wasteland feels anything but. Virtually every task requires multiple keystrokes and the navigation of a labyrinth of menus. It’s a far cry from even the old-school simplicity of Ultima‘s alphabet soup of single-keystroke commands, much less the intuitive ease of Dungeon Master‘s mouse-driven interface.

Some of Wasteland‘s more pernicious playability issues feel like they stem from an overly literal translation of the tabletop experience to the computer. The Mercenaries, Spies, and Private Eyes system, so magnificently simple on the tabletop, feels much clunkier and more frustrating on the computer. As you explore the maps, you’re expected to guess where a skill and/or attribute might be of use, then to try manually invoking it. If you’re not constantly thinking on this level, and always aware of just what skills every member of your party has that might apply, it’s very easy to miss things. For example, the very first map you’re likely to visit contains a mysterious machine. You’re expected not just to dismiss that as scenery, or to assume it’s something you’ll learn more about later, but rather to use someone’s Intelligence to learn that it’s a water purifier you might be able to fix. Meanwhile other squares on other maps contain similar descriptions that are just scenery. In a tabletop game, where there is a constant active repartee between referee and players, where everything in the world can be fully “implemented” thanks to the referee’s imagination, and where every player controls just one character whom she knows intimately instead of a whole party of four, the Mercenaries, Spies, and Private Eyes system works a treat. In Wasteland, it can feel like a tedious, mechanistic process of trial and error.

Other parts of Wasteland feel like equally heroic but perhaps misguided attempts to translate to the digital realm, at all costs, full speed ahead and damn the torpedoes, things that are simple and intuitive on the tabletop but extremely difficult on the computer. There is, for instance, a convoluted and confusing process for splitting your party into separate groups that can be on entirely separate maps at the same time. It’s impressive in its way, and gives Wasteland claim to yet another first in CRPG history to boot, but one has to question whether the time and effort put into it might have been better spent making a cleaner, more playable computer game. Ditto the parser-based conversation engine that occasionally pops up. An obvious attempt to bring the sort of free-form conversations that are possible with a human referee to the computer, in practice it’s just a tedious game of guess-the-word that makes it far too easy to miss stuff. While I applaud the effort St. Andre and Stackpole and their colleagues at Interplay made to bring more complexity to the CRPG, the fact remains that computer games are not tabletop games, and vice versa.

And then there’s the combat. The Bard’s Tale is still lurking down at the foundation of Wasteland‘s combat engine, but Interplay did take some steps to make it more interesting. Unlike in The Bard’s Tale, the position of your party and their enemies are tracked on a graphical map during combat. In addition to the old Bard’s Tale menu of actions — “attack,” “defend,” etc. — you can move around to find cover, or for that matter charge up to some baddies and stave their heads in with your crowbars in lieu of guns.

Yet somehow combat still isn’t much fun. This groundbreaking and much beloved post-apocalyptic CRPG also serves as an ironic argument for why the vast majority of CRPG designers and players still favor fantasy settings. Something important, maybe even essential, feels lost without the ability to cast spells. Not only do you lose the thrill of seeing a magic-using character level up and trying out a new slate of spells, but you also lose the strategic dimension of managing your mana reserves, a huge part of the challenge of the likes of Wizardry and The Bard’s Tale. In theory, the acquiring of ever more powerful guns and the need to manage your ammunition stores in Wasteland ought to take the place of spells and the mana reserves needed to cast them, but in practice it doesn’t quite work out like that. New guns just aren’t as interesting as new spells, especially considering that there really aren’t all that many of the former to be found in Wasteland. And ammunition is almost a moot point: you’re never very far from a store selling bullets, and you can carry plenty with you anyway.

Most of all, there’s just too much fighting. One place where St. Andre and Stackpole regrettably didn’t depart from CRPG tradition was in their fondness for the wandering monster. Much of Wasteland is a dull slog through endless low-stakes battles with “leather jerks” and “ozoners,” an experience sadly divorced from the game’s more interesting and innovative aspects but one that ends up being at least as time-consuming.

For all these reasons, then, I’m a bit less high on Wasteland than many others. It strikes me as more historically important than a timeless classic, more interesting than playable. There’s of course no shame in that. We need games that push the envelope, and that’s something that Wasteland most assuredly did. The immense nostalgic regard in which it’s still held today says much about how amazing its innovations really were back in 1988.

As the gap between that year of Wasteland‘s release and Fargo, Pavlish, and Stackpole’s December 1985 meeting will attest, this was a game that was in development an insanely long time by the standards of the 1980s. And as you have probably guessed, it was never intended to take anything like this long. Interplay first talked publicly about the Wasteland project as early as the Summer Consumer Electronics Show in June of 1986, giving the impression it might be available as early as that Christmas. Instead it took fully two more years.

Thanks to Wasteland‘s long gestation, 1987 proved a very quiet year for the usually prolific Interplay. While ports of older titles continued to appear, the company released not a single original new game that year. The Bard’s Tale III, turned over to Bill Heineman following Michael Cranford’s decision to return to university, went into development early in 1987, but like Wasteland its gestation would stretch well into 1988. (Stackpole, who was apparently starting to like this computer-game development stuff, wrote the storyline and the text for The Bard’s Tale III to accompany Heineman’s design.) Thankfully, the first two Bard’s Tale games were continuing to sell very well, making Interplay’s momentary lack of productivity less of a problem than it might otherwise have been.

Shortly before Wasteland‘s belated release, St. Andre, Stackpole, and Pavlish, along with a grab bag of the others who had worked with them, headed out to the Sonoran Desert for a photo shoot. Everyone scoured the oddities in the backs of their closets and the local leather shops for their costumes, and a professional makeup team was recruited to help turn them all into warriors straight out of Mad Max. Bill Heineman, an avid gun collector, provided much of the weaponry they carried. The final picture, featured on the inside cover of Wasteland‘s package, has since become far more iconic than the art that appeared on its front, a fitting tribute to this unique team and their unique vision.

Some of the Wasteland team. From left: Ken St. Andre, Michael Stackpole, Bill Dugan, Nishan Hossepian, Chris Christensen, Alan Pavlish, Bruce Schlickbernd.

Both Wasteland and The Bard’s Tale III were finished almost simultaneously after many months of separate labor. When Fargo informed Electronic Arts of the good news, they insisted on shipping the two overdue games within two months of each other — May of 1988 in the case of Wasteland, July in that of The Bard’s Tale III — over his strident objections. He had good grounds for concern: these two big new CRPGs were bound to appeal largely to the same group of players, and could hardly help but cannibalize one another’s sales. To Interplay, this small company that had gone so long without any new product at all, the decision felt not just unwise but downright dangerous to their future.

Fargo had been growing increasingly unhappy with Electronic Arts, feeling Interplay just wasn’t earning enough from their development contracts for the hit games they had made for their publisher. Now this move was the last straw. Wasteland and The Bard’s Tale III would be the last games Interplay would publish through Electronic Arts, as Fargo decided to carry out an idea he’d been mulling over for some time: to turn Interplay into a full-fledged publisher as well as developer, with their own name — and only their own name — on their game boxes.

Following a pattern that was already all too typical, The Bard’s Tale III — the more traditional game, the less innovative, and the sequel — became by far the better selling of the pairing. Wasteland didn’t flop, but it didn’t become an out-and-out hit either. Doubtless for this reason, neither Interplay nor Electronic Arts were willing to invest in the extensive porting to other platforms that marked the Bard’s Tale games. After the original Apple II and Commodore 64 releases, the only Wasteland port was an MS-DOS version that appeared nine months later, in March of 1989. Programmed by Interplay’s Michael Quarles, it sports modestly improved graphics and an interface that makes halfhearted use of a mouse. While most original players of Wasteland knew it in its 8-bit incarnations, it’s this version that almost everyone who has played it in the years since knows, and for good reason: it’s a far less painful experience than the vintage 8-bit one of juggling disks and waiting, waiting, waiting for all of those painstakingly detailed maps to load and save.

Wasteland‘s place in history, and in the mind of Brian Fargo, would always loom larger than its sales figures might suggest. Unfortunately, his ability to build on its legacy was immediately hampered by the split with Electronic Arts: the terms of the two companies’ contract signed all rights to the Wasteland name as well as The Bard’s Tale over to Interplay’s publisher. Thus both series, one potential and one very much ongoing, were abruptly stopped in their tracks. Electronic Arts toyed with making a Bard’s Tale IV on their own from time to time without ever seeing the idea all the way through. Oddly given the relative sales numbers, Electronic Arts did bring a sequel of sorts to Wasteland to fruition, although they didn’t go so far as to dare to put the Wasteland name on the box. Given the contents of said box, it’s not hard to guess why. Fountain of Dreams (1990) uses Michael Quarles’s MS-DOS Wasteland engine, but it’s a far less audacious affair. Slipped out with little fanfare — Electronic Arts could spot a turkey as well as anyone — it garnered poor reviews, sold poorly, and is unloved and largely forgotten today.

In the absence of rights to the Wasteland name, Fargo initially planned to leverage his development team and the tools and game engine they had spent so long creating to make more games in other settings that would play much like Wasteland but wouldn’t be actual sequels. The first of these was to have been called Meantime, and was to have been written and designed by Stackpole with the help of many of the usual Wasteland suspects. Its premise was at least as intriguing as Wasteland‘s: a game of time travel in which you’d get to meet (and sometimes battle) historical figures from Cyrano de Bergerac to P.T. Barnum, Albert Einstein to Amelia Earhart. At the Winter CES in January of 1989, Fargo said that Meantime would be out that summer: “I am personally testing the maps right now.” But it never appeared, thanks to a lot of design questions that were never quite solved and, most of all, thanks to the relentless march of technology. All of the Wasteland development tools ran on the Apple II and Commodore 64, platforms whose sales finally collapsed in 1989. Interplay tinkered with trying to move the tool chain to MS-DOS for several years, but the project finally expired from neglect. There just always seemed to be something more pressing to do.

Somewhat surprisingly given the enthusiasm with which they’d worked on Wasteland, neither St. Andre nor Stackpole remained for very long in the field of computer-game design. St. Andre returned to his librarian gig and his occasional sideline as a tabletop-RPG designer, not working on another computer game until recruited for Brian Fargo’s Wasteland 2 project many years later. Stackpole continued to take work from Interplay for the next few years, on Meantime and other projects, often working with his old Flying Buffalo and Wasteland colleague Liz Danforth. But his name too gradually disappeared from game credits in direct proportion to its appearance on the covers of more and more franchise novels. (His first such book, set in the universe of FASA’s BattleTech game, was published almost simultaneously with Wasteland and The Bard’s Tale III.)

Fargo himself never forgot the game that had always been first and foremost his own passion project. He would eventually revive it, first via the “spiritual sequels” Fallout (1997) and Fallout 2 (1998), then with the belated Kickstarter-funded sequel-in-name-as-well-as-spirit Wasteland 2 (2014).

But those are stories for much later times. Wasteland was destined to stand alone for many years. And yet it wouldn’t be the only lesson 1988 brought in the perils and possibilities of bringing tabletop rules to the computer. Another, much higher profile tabletop adaptation, the result of a blockbuster licensing deal given to the most unexpected of developers, was still to come before the year was out. Next time we’ll begin to trace the story behind this third and final landmark CRPG of 1988, the biggest selling of the whole lot.

(Sources: PC Player of August 1989; Questbusters of July 1986, March 1988, April 1988, May 1988, July 1988, August 1988, October 1988, November 1988, January 1989, March 1989. On YouTube, Rebecca Heineman and Jennell Jaquays at the 2013 Portland Retro Gaming Expo; Matt Barton’s interview with Brian Fargo; Brian Fargo at Unity 2012. Other online sources include a Michael Stackpole article on RockPaperShotgun; Matt Barton’s interview with Rebecca Heineman on Gamasutra; GTW64’s page on Meantime.

Wasteland is available for purchase from GOG.com.)


  1. Bill Heineman now lives as Rebecca Heineman. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times. 

  2. Paul Jaquays now lives as Jennell Jaquays. 

  3. Interestingly, Stackpole did have one connection to Interplay, through Bard’s Tale designer Michael Cranford. Cranford sent Flying Buffalo a Tunnels & Trolls solo adventure of his own devising around 1983. Stackpole thought it showed promise, but that it wasn’t quite there yet, so he sent it back with some suggestions for improvement and a promise to look at it again if Cranford followed through on them. But he never heard another word from him; presumably it was right about this time that Cranford got busy making The Bard’s Tale

  4. In what must be a record for footnotes of this type, I have to also note that Dan Bunten later became Danielle Bunten Berry, and lived until her death in 1998 under that name. 

 

Posted by on February 26, 2016 in Digital Antiquaria, Interactive Fiction

 
