Origin Sells Out

One day in early June of 1992, a group of executives from Electronic Arts visited Origin Systems’s headquarters in Austin, Texas. If they had come from any other company, the rank and file at Origin might not have paid them much attention. As it was, though, the visit felt a bit like Saddam Hussein dropping in at George Bush’s White House for a fireside chat. For Origin and EA, you see, had a history.

Back in August of 1985, just prior to the release of Ultima IV, the much smaller Origin had signed a contract to piggyback on EA’s distribution network as an affiliated label. Eighteen months later, when EA released an otherwise unmemorable CRPG called Deathlord whose interface hewed a little too closely to that of an Ultima, a livid Richard Garriott attempted to pull Origin out of the agreement early. EA at first seemed prepared to crush Origin utterly in retribution by pulling at the legal seams in the two companies’ contract. Origin, however, found themselves a protector: Brøderbund Software, whose size and clout at the time were comparable to those of EA. At last, EA agreed to allow Origin to go their own way, albeit probably only after the smaller company paid them a modest settlement for breaking the contract. Origin quickly signed a new distribution contract with Brøderbund, which lasted until 1989, by which point they had become big enough in their own right to take over their own distribution.

But Richard Garriott wasn’t one to forgive even a small personal slight easily, much less a full-blown threat to destroy his company. From 1987 on, EA was Public Enemy #1 at Origin, a status which Garriott marked in ways that only seemed to grow pettier as time went on. Garriott built a mausoleum for “Pirt Snikwah” — the name of Trip Hawkins, EA’s founder and chief executive, spelled backward — at his Austin mansion of Britannia Manor. Ultima V‘s parser treated the phrase “Electronic Arts” like a curse word; Ultima VI included a gang of evil pirates named after some of the more prominent members of EA’s executive staff. Time really did seem to make Garriott more rather than less bitter. Among his relatively few detail-oriented contributions to Ultima VII were a set of infernal inter-dimensional generators whose shapes together formed the EA logo. He also demanded that the two villains who went on a murder spree across Britannia in that game be named Elizabeth and Abraham. Just to drive the point home, the pair worked for a “Destroyer of Worlds” — an inversion of Origin’s longstanding tagline of “We Create Worlds.”

And yet here the destroyers were, just two months after the release of Ultima VII, chatting amiably with their hosts while they gazed upon their surroundings with what seemed to some of Origin’s employees an ominously proprietorial air. Urgent speculation ran up and down the corridors: what the hell was going on? In response to the concerned inquiries of their employees, Origin’s management rushed to say that the two companies were merely discussing “some joint ventures in Sega Genesis development,” even though “they haven’t done a lot of cooperative projects in the past.” That was certainly putting a brave face on half a decade of character assassination!

What was really going on was, as the more astute employees at Origin could all too plainly sense, something far bigger than any mere “joint venture.” The fact was, Origin was in a serious financial bind — not a unique one in their evolving industry, but one which their unique circumstances had made more severe for them than for most others. Everyone in the industry, Origin included, was looking ahead to a very near future when the enormous storage capacity of CD-ROM, combined with improving graphics and sound and exploding numbers of computers in homes, would allow computer games to join television, movies, and music as a staple of mainstream entertainment rather than a niche hobby. Products suitable for this new world order needed to go into development now in order to be on store shelves to greet it when it arrived. These next-generation products with their vastly higher audiovisual standards couldn’t be funded entirely out of the proceeds from current games. They required alternative forms of financing.

For Origin, this issue, which really was well-nigh universal among their peers, was further complicated by the realities of being a relatively small company without a lot of product diversification. A few underwhelming attempts to bring older Ultima games to the Nintendo Entertainment System aside, they had no real presence on videogame consoles, a market which dwarfed that of computer games, and had just two viable product lines even on computers: Ultima and Wing Commander. This lack of diversification left them in a decidedly risky position, where the failure of a single major release in either of those franchises could conceivably bring down the whole company.

The previous year of 1991 had been a year of Wing Commander, when the second mainline title in that franchise, combined with ongoing strong sales of the first game and a series of expansion packs for both of them, had accounted for fully 90 percent of the black ink in Origin’s books. In this year of 1992, it was supposed to have been the other franchise’s turn to carry the company while Wing Commander retooled its technology for the future. But Ultima VII: The Black Gate, while it had been far from an outright commercial failure, had garnered a more muted response than Origin had hoped and planned for, plagued as its launch had been by bugs, high system requirements, and the sheer difficulty of configuring it to run properly under the inscrutable stewardship of MS-DOS.

Even more worrisome than all of the specific issues that dogged this latest Ultima was a more diffuse sort of ennui directed toward it by gamers — a sense that the traditional approach of Ultima in general, with its hundred-hour play time, its huge amounts of text, and its emphasis on scope and player freedom rather than multimedia set-pieces, was falling out of step with the times. Richard Garriott liked to joke that he had spent his whole career making the same game over and over — just making it better and bigger and more sophisticated each time out. It was beginning to seem to some at Origin that that progression might have reached its natural end point. Before EA ever entered the picture, a sense was dawning that Ultima VIII needed to go in another direction entirely — needed to be tighter, flashier, more focused, more in step with the new types of customers who were now beginning to buy computer games. Ultima Underworld, a real-time first-person spinoff of the core series developed by the Boston studio Blue Sky Productions rather than Origin themselves, had already gone a considerable distance in that direction, and upon its near-simultaneous release with Ultima VII had threatened to overshadow its more cerebral big brother completely, garnering more enthusiastic reviews and, eventually, higher sales. Needless to say, had Ultima Underworld not turned into such a success, Origin’s financial position would have been still more critical than it already was. It seemed pretty clear that this was the direction that all of Ultima needed to go.

But making a flashier next-generation Ultima VIII — not to mention the next-generation Wing Commander — would require more money than even Ultima VII and Ultima Underworld together were currently bringing in. And yet, frustratingly, Origin couldn’t seem to drum up much in the way of financing. Their home state of Texas was in the midst of an ugly series of savings-and-loan scandals that had made all of the local banks gun-shy; the country as a whole was going through a mild recession that wasn’t helping; would-be private investors could see all too clearly the risks associated with Origin’s non-diversified business model. As the vaguely disappointing reception for Ultima VII continued to make itself felt, the crisis began to feel increasingly existential. Origin had lots of technical and creative talent and two valuable properties — Wing Commander in particular was arguably still the hottest single name in computer gaming — but had too little capital and a nonexistent credit line. They were, in other words, classic candidates for acquisition.

It seems that the rapprochement between EA and Origin began at the Summer Consumer Electronics Show in Chicago at the very beginning of June of 1992, and, as evidenced by EA’s personal visit to Origin just a week or so later, proceeded rapidly from there. It would be interesting and perhaps a little amusing to learn how the rest of Origin’s management team coaxed Richard Garriott around to the idea of selling out to the company he had spent the last half-decade vilifying. But whatever tack they took, they obviously succeeded. At least a little bit of sugar was added to the bitter pill by the fact that Trip Hawkins, whom Garriott rightly or wrongly regarded as the worst of all the fiends at EA, had recently stepped down from his role in the company’s management to helm a new semi-subsidiary outfit known as 3DO. (“Had Trip still been there, there’s no way we would have gone with EA,” argues one former Origin staffer — but, then again, necessity can almost always make strange bedfellows.)

Likewise, we can only wonder what if anything EA’s negotiators saw fit to say to Origin generally and Garriott specifically about all of the personal attacks couched within the last few Ultima games. I rather suspect they said nothing; if there was one thing the supremely non-sentimental EA of this era had come to understand, it was that it seldom pays to make business personal.

Richard and Robert Garriott flank Stan McKee, Electronic Arts’s chief financial officer, as they toast the consummation of one of the more unexpected acquisitions in gaming history at EA’s headquarters in San Mateo, California.

So, the deal was finalized at EA’s headquarters in San Mateo, California, on September 25, 1992, in the form of a stock exchange worth $35 million. Both parties were polite enough to call it a merger rather than an acquisition, but it was painfully clear which one had the upper hand; EA, who were growing so fast they had just gone through a two-for-one stock split, now had annual revenues of $200 million, while Origin could boast of only $13 million. In a decision whose consequences remain with us to this day, Richard Garriott even agreed to sign over his personal copyrights to the Ultima franchise. In return, he became an EA vice president; his brother Robert, previously the chief executive in Austin, now had to settle for the title of the new EA subsidiary’s creative director.

From EA’s perspective, the deal got them Ultima, a franchise which was perhaps starting to feel a little over-exposed in the wake of a veritable flood of Origin product bearing the name, but one which nevertheless represented EA’s first viable CRPG franchise since the Bard’s Tale trilogy had concluded back in 1988. Much more importantly, though, it got them Wing Commander, in many ways the progenitor of the whole contemporary craze for multimedia “interactive movies”; it was a franchise which seemed immune to over-exposure. (Origin had amply proved this point by releasing two Wing Commander mainline games and four expansion packs in the last two years, plus a “Speech Accessory Pack” for Wing Commander II, all of which had sold very well indeed.)

As you do in these situations, both management teams promised the folks in Austin that nothing much would really change. “The key word is autonomy,” Origin’s executives said in their company’s internal newsletter. “Origin is supposed to operate independently from EA and maintain profitability.” But of course things did — had to — change. There was an inescapable power imbalance here, such that, while Origin’s management had to “consult” with EA when making decisions, their counterparts suffered no such obligation. And of course what might happen if Origin didn’t “maintain profitability” remained unspoken.

Thus most of the old guard at Origin would go on to remember September 25, 1992, as, if not quite the end of the old, freewheeling Origin Systems, at least the beginning of the end. Within six months, resentments against the mother ship’s overbearing ways were already building in such employees as an anonymous letter writer who asked his managers why they were “determined to eradicate the culture that makes Origin such a fun place to work.” Within a year, another was asking even more heatedly, “What happened to being a ‘wholly owned independent subsidiary of EA?’ When did EA start telling Origin what to do and when to do it? I thought Richard said we would remain independent and that EA wouldn’t touch us?!? Did I miss something here?” Eighteen months in, an executive assistant named Michelle Caddel, the very first new employee Origin had hired upon opening their Austin office in 1987, tried to make the best of the changes: “Although some of the warmth at Origin has disappeared with the merger, it still feels like a family.” For now, at any rate.

Perhaps tellingly, the person at Origin who seemed to thrive most under the new arrangement was one of the most widely disliked: Dallas Snell, the hard-driving production manager who was the father of a hundred exhausting crunch times, who tended to regard Origin’s games as commodities quantifiable in floppy disks and megabytes. Already by the time Origin had been an EA subsidiary for a year, he had managed to install himself at a place in the org chart that was for all practical purposes above that of even Richard and Robert Garriott: he was the only person in Austin who was a “direct report” to Bing Gordon, EA’s powerful head of development.

On the other hand, becoming a part of the growing EA empire also brought its share of advantages. The new parent company’s deep pockets meant that Origin could prepare in earnest for that anticipated future when games would sell more copies but would also require more money, time, and manpower to create. Thus almost immediately after closing the deal with EA, Origin closed another one, for a much larger office space which they moved into in January of 1993. Then they set about filling up the place; over the course of the next year, Origin would double in size, going from 200 to 400 employees.

The calm before the storm: the enormous cafeteria at Origin’s new digs awaits the first onslaught of hungry employees. Hopefully someone will scrounge up some tables and chairs before the big moment arrives…

And so the work of game development went on. When EA bought Origin, the latter naturally already had a number of products, large and small, in the pipeline. The first-ever expansion pack for an existing Ultima game — an idea borrowed from Wing Commander — was about to hit stores; Ultima VII: Forge of Virtue would prove a weirdly unambitious addition to a hugely ambitious game, offering only a single dungeon to explore that was more frustrating than fun. Scheduled for release in 1993 were Wing Commander: Academy, a similarly underwhelming re-purposing of Origin’s internal development tools into a public-facing “mission builder,” and Wing Commander: Privateer, which took the core engine and moved it into a free-roaming framework rather than a tightly scripted, heavily story-driven one; it thus became a sort of updated version of the legendary Elite, and, indeed, would succeed surprisingly well on those terms. And then there was also Ultima Underworld II: Labyrinth of Worlds, developed like its predecessor by Blue Sky up in Boston; it would prove a less compelling experience on the whole than Ultima Underworld I, being merely a bigger game rather than a better one, but it would be reasonably well-received by customers eager for more of the same.

Those, then, were the relatively modest projects. Origin’s two most expensive and ambitious games for the coming year consisted of yet one more from the Ultima franchise and one that was connected tangentially to Wing Commander. We’ll look at them a bit more closely, taking them one at a time.

The game which would be released under the long-winded title of Ultima VII Part Two: Serpent Isle had had a complicated gestation. It was conceived as Origin’s latest solution to a problem that had long bedeviled them: that of how to leverage their latest expensive Ultima engine for more than one game without violating the letter of a promise Richard Garriott had made more than a decade before to never use the same engine for two successive mainline Ultima games. Back when Ultima VI was the latest and greatest, Origin had tried reusing its engine in a pair of spinoffs called the Worlds of Ultima, which rather awkwardly shoehorned the player’s character from the main series — the “Avatar” — into plots and settings that otherwise had nothing to do with Richard Garriott’s fantasy world of Britannia. Those two games had drawn from early 20th-century science and adventure fiction rather than Renaissance Faire fantasy, and had actually turned out quite magnificently; they’re among the best games ever to bear the Ultima name in this humble critic’s opinion. But, sadly, they had sold like the proverbial space heaters in the Sahara. It seemed that Arthur Conan Doyle and Edgar Rice Burroughs were a bridge too far for fans raised on J.R.R. Tolkien and Lord British.

So, Origin adjusted their approach when thinking of ways to reuse the even more expensive Ultima VII engine. They conceived two projects. One would be somewhat in the spirit of Worlds of Ultima, but would stick closer to Britannia-style fantasy: called Arthurian Legends, it would draw from, as you might assume, the legends of King Arthur, a fairly natural thematic fit for a series whose creator liked to call himself “Lord British.” The other game, the first to go into production, would be a direct sequel to Ultima VII, following the Avatar as he pursued the Guardian, that “Destroyer of Worlds” from the first game, from Britannia to a new world. This game, then, was Serpent Isle. Originally, it was to have had a pirate theme, all fantastical derring-do on an oceanic world, with a voodoo-like magic system in keeping with Earthly legends of Caribbean piracy.

This piratey Serpent Isle was first assigned to Origin writer Jeff George, but he struggled to find ways to adapt the idea to the reality of the Ultima VII engine’s affordances. Finally, after spinning his wheels for some months, he left the company entirely. Warren Spector, who had become Origin’s resident specialist in Just Getting Things Done, then took over the project and radically revised it, dropping the pirate angle and changing the setting to one that was much more Britannia-like, right down to a set of towns each dedicated to one of a set of abstract virtues. Having thus become a less excitingly original concept but a more practical one from a development perspective, Serpent Isle started to make good progress under Spector’s steady hand. Meanwhile another small team started working up a script for Arthurian Legends, which was planned as the Ultima VII engine’s last hurrah.

Yet the somewhat muted response to the first Ultima VII threw a spanner in the works. Origin’s management team was suddenly second-guessing the entire philosophy on which their company had been built: “Do we still create worlds?” Arthurian Legends was starved of resources amidst this crisis of confidence, and finally cancelled in January of 1993. Writer and designer Sheri Graner Ray, one of only two people left on the project at the end, invests its cancellation with major symbolic importance:

I truly believe that on some level we knew that this was the death knell for Origin. It was the last of the truly grass-roots games in production there… the last one that was conceived, championed, and put into development purely by the actual developers, with no support or input from the executives. It was actually, kinda, the end of an era for the game industry in general, as it was also during this time that we were all adjusting to the very recent EA buyout of Origin.

Brian Martin, one of the last two developers remaining on the Arthurian Legends project, made this odd little memorial to it with the help of his partner Sheri Graner Ray after being informed by management that the project was to be cancelled entirely. Ray herself tells the story: “Before we left that night, Brian laid down in the common area that was right outside our office and I went around his body with masking tape… like a chalk line… we added the outline of a crown and the outline of a sword. We then draped our door in black cloth and put up a sign that said, ‘The King is Dead. Long live the King.’ …. and a very odd thing happened. The next morning when we arrived, there were flowers by the outline. As the day wore on more flowers arrived.. and a candle.. and some coins were put on the eyes… and a poem arrived… it was uncanny. This went on for several days with the altar growing more and more. Finally, we were told we had to take it down, because there was a press junket coming through and they didn’t want the press seeing it.”

Serpent Isle, on the other hand, was too far along by the time the verdict was in on the first Ultima VII to make a cancellation realistic. It would instead go down in the recollection of most hardcore CRPG fans as the last “real” Ultima, the capstone to the process of evolution a young Richard Garriott had set in motion back in 1980 with a primitive BASIC game called Akalabeth. And yet the fact remains that it could have been so, so much better, had it only caught Origin at a less uncertain, more confident time.

Serpent Isle lacks the refreshingly original settings of the two Worlds of Ultima games, as it does the surprisingly fine writing of the first Ultima VII; Raymond Benson, the head writer on the latter project, worked on Serpent Isle only briefly before decamping to join MicroProse Software. In compensation, though, Serpent Isle is arguably a better game than its predecessor through the first 65 percent or so of its immense length. Ultima VII: The Black Gate can at times feel like the world’s most elaborate high-fantasy walking simulator; you really do spend most of your time just walking around and talking to people, an exercise that’s made rewarding only by the superb writing. Serpent Isle, by contrast, is full to bursting with actual things to do: puzzles to solve, dungeons to explore, quests to fulfill. It stretches its engine in all sorts of unexpected and wonderfully hands-on directions. Halfway in, it seems well on its way to being one of the best Ultima games of all, as fine a sendoff as any venerable series could hope for.

In the end, though, its strengths were all undone by Origin’s crisis of faith in the traditional Ultima concept. Determined to get its sales onto the books of what had been a rather lukewarm fiscal year and to wash their hands of the past it now represented, management demanded that it go out on March 25, 1993, the last day of said year. As a result, the last third or so of Serpent Isle is painfully, obviously unfinished. Conversations become threadbare, plot lines are left to dangle, side quests disappear, and bugs start to sprout up everywhere you look. As the fiction becomes a thinner and thinner veneer pasted over the mechanical nuts and bolts of the design, solubility falls by the wayside. By the end, you’re wandering through a maze of obscure plot triggers that have no logical connection with the events they cause, making a walkthrough a virtual necessity. It’s a downright sad thing to have to witness. Had its team only been allowed another three or four months to finish the job, Serpent Isle could have been not only a great final old-school Ultima but one of the best CRPGs of any type that I’ve ever played, a surefire entrant in my personal gaming hall of fame. As it is, though, it’s a bitter failure, arguably the most heartbreaking one of Warren Spector’s storied career.

Unfashionable though such an approach was in 1993, almost all of the Serpent Isle team’s energy went into gameplay and script rather than multimedia assets; the game looks virtually identical to the first Ultima VII. An exception is the frozen northlands which you visit later in the game. Unfortunately, the change in scenery comes about the time that the design slowly begins to fall apart.

And there was to be one final note of cutting irony in all of this: Serpent Isle, which Origin released without a lot of faith in its commercial potential, garnered a surprisingly warm reception among critics and fans alike, and wound up selling almost as well as the first Ultima VII. Indeed, it performed so well that the subject of doing “more games in that vein,” in addition to or even instead of a more streamlined Ultima VIII, was briefly discussed at Origin. As things transpired, though, its success led only to an expansion pack called The Silver Seed before the end of the year; this modest effort became the true swansong for the Ultima VII engine, as well as the whole era of the 100-hour-plus, exploration-focused, free-form single-player CRPG at Origin in general. The very philosophy that had spawned the company, that had been at the core of its identity for the first decade of its existence, was fading into history. Warren Spector would later have this to say in reference to a period during which practical commercial concerns strangled the last shreds of idealism at Origin:

There’s no doubt RPGs were out of favor by the mid-90s. No doubt at all. People didn’t seem to want fantasy stories or post-apocalypse stories anymore. They certainly didn’t want isometric, 100 hour fantasy or post-apocalypse stories, that’s for sure! I couldn’t say why it happened, but it did. Everyone was jumping on the CD craze – it was all cinematic games and high-end-graphics puzzle games… That was a tough time for me – I mean, picture yourself sitting in a meeting with a bunch of execs, trying to convince them to do all sorts of cool games and being told, “Warren, you’re not allowed to say the word ‘story’ any more.” Talk about a slap in the face, a bucket of cold water, a dose of reality.

If you ask me, the reason it all happened was that we assumed our audience wanted 100 hours of play and didn’t care much about graphics. Even high-end RPGs were pretty plain-jane next to things like Myst and even our own Wing Commander series. I think we fell behind our audience in terms of the sophistication they expected and we catered too much to the hardcore fans. That can work when you’re spending hundreds of thousands of dollars – even a few million – but when games start costing many millions, you just can’t make them for a relatively small audience of fans.

If Serpent Isle and its expansion were the last gasps of the Origin Systems that had been, the company’s other huge game of 1993 was every inch a product of the new Origin that had begun to take shape following the worldwide success of the first Wing Commander game. Chris Roberts, the father of Wing Commander, had been working on something called Strike Commander ever since late 1990, leaving Wing Commander II and all of the expansion packs and other spinoffs in the hands of other Origin staffers. The new game took the basic idea of the old — that of an action-oriented vehicular simulator with a strong story, told largely via between-mission dialog scenes — and moved it from the outer space of the far future to an Earth of a very near future, where the international order has broken down and mercenaries battle for control over the planet’s dwindling resources. You take to the skies in an F-16 as one of the mercenaries — one of the good ones, naturally.

Origin and Chris Roberts pulled out all the stops to make Strike Commander an audiovisual showcase; the game’s gestation time of two and a half years, absurdly long by the standards of the early 1990s, was a product of Roberts constantly updating his engine to take advantage of the latest cutting-edge hardware. The old Wing Commander engine was starting to look pretty long in the tooth by the end of 1992, so this new engine, which replaced its predecessor’s scaled sprites with true polygonal 3D graphics, was more than welcome. There’s no point in putting a modest face on it: Strike Commander looked downright spectacular in comparison with any other flight simulator on offer at the time. It was widely expected, both inside and outside of Origin, to become the company’s biggest game ever. In fact, it became the first Origin game to go gold in the United States — 100,000 copies sold to retail — before it had actually shipped there, thanks to the magic of pre-orders. Meanwhile European pre-orders topped 50,000, an all-time record for EA’s British subsidiary. All in all, more than 1.1 million Strike Commander floppy disks — 30 tons worth of plastic, metal, and iron oxide — were duplicated before a single unit was sold. Why not? This game was a sure thing.

The hype around Strike Commander was inescapable for months prior to its release. At the European Computer Trade Show in London, the last big event before the release, Origin put together a mock-up of an airplane hangar. Those lucky people who managed to seize control for a few minutes got to play the game from behind a nose cowl and instrument panel. What Origin didn’t tell you was that the computer hidden away underneath all the window dressing was almost certainly much, much more powerful than the one you had at home.

Alas, pride goeth before a fall. Just a couple of weeks after Strike Commander‘s worldwide release on April 23, 1993, Origin had to admit to themselves in their internal newsletter that sales from retail to actual end users were “slower than expected.” Consumers clearly weren’t as enamored with the change in setting as Origin and just about everyone else in their industry had assumed they would be. Transporting the Wing Commander formula into a reasonably identifiable version of the real world somehow made the story, which hovered as usual in some liminal space between comic book and soap opera, seem rather more than less ludicrous. At the same time, the use of an F-16 in place of a made-up star fighter, combined with the game’s superficial resemblance to the hardcore flight simulators of the day, raised expectations among some players which the game had never really been designed to meet. The editors of Origin’s newsletter complained, a little petulantly, about this group of sim jockeys who were “ready for a cockpit that had every gauge, altimeter, dial, and soft-drink holder in its proper place. This is basically the group which wouldn’t be happy unless you needed the $35 million worth of training the Air Force provides just to get the thing off the ground.” There were advantages, Origin was belatedly learning, to “simulating” a vehicle that had no basis in reality, as there were to fictions similarly divorced from the real world. In hitting so much closer to home, Strike Commander lost a lot of what had made Wing Commander so appealing.

The new game’s other problem was more immediate and practical: almost no one could run the darn thing well enough to actually have the experience Chris Roberts had intended it to be. Ever since Origin had abandoned the Apple II to make MS-DOS their primary development platform at the end of the 1980s, they’d had a reputation for pushing the latest hardware to its limit. This game, though, was something else entirely even from them. The box’s claim that it would run on an 80386 was a polite fiction at best; in reality, you needed an 80486, and one of the fastest ones at that — running at least at 50 MHz or, better yet, 66 MHz — if you wished to see anything like the silky-smooth visuals that Origin had been showing off so proudly at recent trade shows. Even Origin had to admit in their newsletter that customers had been “stunned” by the hardware Strike Commander craved. Pushed along by the kid-in-a-candy-store enthusiasm of Chris Roberts, who never had a passing fancy he didn’t want to rush right out and implement, they had badly overshot the current state of computing hardware.

Of course, said state was always evolving; it was on this fact that Origin now had to pin whatever diminished hopes they still had for Strike Commander. The talk of the hardware industry at the time was Intel’s new fifth-generation microprocessor, which abandoned the “x86” nomenclature in favor of the snazzy new focus-tested name of Pentium, another sign of how personal computers were continuing their steady march from being tools of businesspeople and obsessions of nerdy hobbyists into mainstream consumer-electronics products. Origin struck a promotional deal with Compaq Computers in nearby Houston, who, following what had become something of a tradition for them, were about to release the first mass-market desktop computer to be built around this latest Intel marvel. Compaq placed the showpiece that was Strike Commander-on-a-Pentium front and center at the big PC Expo corporate trade show that summer of 1993, causing quite a stir at an event that usually scoffed at games. “The fuse has only been lit,” went Origin’s cautiously optimistic new company line on Strike Commander, “and it looks to be a long and steady burn.”

But time would prove this optimism as well to be somewhat misplaced: one of those flashy new Compaq Pentium machines cost $7000 in its most minimalist configuration that summer. By the time prices had come down enough to make a Pentium affordable for gamers without an absurd amount of disposable income, other games with even more impressive audiovisuals would be available for showing off their hardware. Near the end of the year, Origin released an expansion pack for Strike Commander that had long been in the development pipeline, but that would be that: there would be no Strike Commander II. Chris Roberts turned his attention instead to Wing Commander III, which would raise the bar on development budget and multimedia ambition to truly unprecedented heights, not only for Origin but for their industry at large. After all, Wing Commander: Academy and Privateer, both of which had had a fraction of the development budget of Strike Commander but wound up selling just as well, proved that there was still a loyal, bankable audience out there for the core series.

Origin had good reason to play it safe now in this respect and others. When the one-year anniversary of the acquisition arrived, the accountants had to reveal to EA that their new subsidiary had done no more than break even so far. By most standards, it hadn’t been a terrible year at all: Ultima Underworld II, Serpent Isle, Wing Commander: Academy, and Wing Commander: Privateer had all more or less made money, and even Strike Commander wasn’t yet so badly underwater that all hope was lost on that front. But on the other hand, none of these games had turned into a breakout hit in the fashion of the first two Wing Commander games, even as the new facilities, new employees, and new titles going into development had cost plenty. EA was already beginning to voice some skepticism about some of Origin’s recent decisions. The crew in Austin really, really needed a home run rather than more base hits if they hoped to maintain their status in the industry and get back into their overlord’s good graces. Clearly 1994, which would feature a new mainline entry in both of Origin’s core properties for the first time since Ultima VI had dropped and Wing Commander mania had begun back in 1990, would be a pivotal year. Origin’s future was riding now on Ultima VIII and Wing Commander III.

(Sources: the book Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland; Origin’s internal newsletter Point of Origin from March 13 1992, June 19 1992, July 31 1992, September 25 1992, October 23 1992, November 6 1992, December 4 1992, December 18 1992, January 29 1993, February 12 1993, February 26 1993, March 26 1993, April 9 1993, April 23 1993, May 7 1993, May 21 1993, June 18 1993, July 2 1993, August 27 1993, September 10 1993, October 13 1993, October 22 1993, November 8 1993, and December 1993; Questbusters of April 1986 and July 1987; Computer Gaming World of October 1992 and August 1993. Online sources include “The Conquest of Origin” at The Escapist, “The Stars His Destination: Chris Roberts from Origin to Star Citizen” at US Gamer, Sheri Graner Ray’s blog entry “20 Years and Counting — Origin Systems,” and an interview with Warren Spector at RPG Codex.

All of the Origin games mentioned in this article are available for digital purchase at GOG.com.)

 

 

Posted by on September 6, 2019 in Digital Antiquaria, Interactive Fiction

 


The Designer’s Designer

Dan Bunten delivers the keynote at the 1990 Game Developers Conference.

Dan Bunten and his little company Ozark Softscape could look back on a tremendous 1984 as that year came to an end. Seven Cities of Gold had been a huge success, Electronic Arts’s biggest game of the year, doing much to keep the struggling publisher out of bankruptcy court by selling well over 100,000 copies. Bunten himself had become one of the most sought-after interviewees in the industry. Everyone who got the chance to speak with him seemed to agree that Seven Cities of Gold was only the beginning, that he was destined for even greater success.

As it turned out, though, 1984 would be the high-water mark for Bunten, at least in terms of that grubbiest but most implacable metric of success in games: quantity of units shifted. The years that followed would be frustrating as often as they would be inspiring, as Bunten pursued a vision that seemed at odds with every trend in the industry, all the while trying to thread the needle between artistic fulfillment and commercial considerations.


In the wake of Seven Cities of Gold‘s success, EA badly wanted a follow-up with a similar theme, so much so that they offered Bunten a personal bonus of $5000 to make it Ozark’s next project. The result was Heart of Africa, a game which at first glance looks like precisely the sequel EA was asking for but that actually plays quite differently. Instead of exploring the Americas as Hernán Cortés during the 1500s, it has you exploring Africa as an intrepid Victorian adventurer (“Livingstone, I presume?”). In keeping with the changed time and location, your goal isn’t to conquer the land for your country — Africa had, for better or for worse, already been thoroughly partitioned among the European nations by 1890, the year in which the game takes place — but simply to discover and to map. In the best tradition of Victorian adventure novels like King Solomon’s Mines, your ultimate goal is to find the tomb of a mythical Egyptian pharaoh. Bunten later admitted that the differences from Heart of Africa‘s predecessor weren’t so much a product of original design intent as improvisation after he had bumbled into an historical context that just wouldn’t work as a more faithful sequel.

Indeed, Bunten in later years dismissed Heart of Africa, his most adventure-like game ever and his last ever that was single-player only, as nothing more than “a game done to please EA”: “I honestly didn’t want to do the project.” Its biggest problem hinges on the fact that its environment is randomly generated each time you start a new game, itself an attempt to remedy the most obvious failing of adventure games as a commercial proposition: their lack of replayability. Yet the random maps can never live up to what a hand-crafted map, designed for challenge and dramatic effect, might have been; the “story” in Heart of Africa is all too clearly just a bunch of shifting interchangeable parts. Bunten later acknowledged that “the attempt to make a replayable adventure game made for a shallow product (which seems true in every other case designers have tried it as well). I guess that if elements are such that they can be randomly shifted then they [aren’t] substantive enough to make for a compelling game. So, even though I don’t like linear games, they seem necessary to have the depth a good story needs.”

Heart of Africa did quite well for EA upon its release in 1985 — well enough, in fact, to become Bunten’s third most successful game of all time. Yet the whole experience left a bad taste in his mouth. He came away from the project determined to return to the guiding vision behind his first game for EA, the commercially unsuccessful but absolutely brilliant M.U.L.E.: a vision of computer games that people played together rather than alone. In the future, he would continue to compromise at times on the style and subject matter of his games in order to sell them to his publishers, but he would never again back away from his one great principle. All of his games henceforward would be multiplayer — first, foremost, and in one case exclusively. In fact, that one case would be his very next game.

The success of his previous two games having opened something of a window of opportunity with EA, Bunten charged ahead on what he would later describe as his single “most experimental game.” Robot Rascals is a multiplayer scavenger hunt in which two physical decks of cards are integral to the game. Each player controls a robot, and must use it to collect the four items shown on the cards in her hand and return with them to home base in order to win. The game lives on the razor’s edge of pure chaos, the product both of random events generated by the computer and of a second deck of cards — the “specials” — which among other things can force players to draw new item cards, trash their old cards, or trade cards among one another; thus everyone’s goals are shifting almost constantly. As always in a Dan Bunten game, there are lots of thoughtful features here, from ways to handicap the game for players of different ages or skill levels to three selectable levels of overall complexity. He designed it to be “a game that anyone could play” rather than one limited to “special-interest groups like role-playing people or history buffs.” It can be a lot of fun, even if it’s not quite on the level of M.U.L.E. (then again, what is, right?). But this latest bid to make computer games acceptable family entertainment wound up selling hardly at all upon its release in 1986, ending Bunten’s two-game commercial hot streak.

By this point in Bunten’s career, changes in his personal life were beginning to have a major impact on the games he made. In 1985, while still working on Heart of Africa, he had divorced his second wife and married his third, with all the painful complications such disruptions entail when one is leaving children behind with the former spouse. In 1986, he and his new wife moved from Little Rock, Arkansas, to Hattiesburg, Mississippi, so she could complete a PhD. This event marked the effective end of Ozark Softscape as anything but a euphemism for Dan Bunten himself and whatever programmers and artists he happened to contract work out to. The happy little communal house/office where Dan and Bill Bunten, Jim Rushing, and Alan Watson had created games, with a neighborhood full of eager testers constantly streaming through the living room, was no more; only Watson continued to work on Bunten’s games from Robot Rascals on, and then more as just another hired programmer than a valued design voice. Even after moving back to Little Rock in 1988, Bunten would never be able to recapture the communal alchemy of 1982 to 1985.

Coupled with these changes were other, still more ominous ones in Dan Bunten himself. Those who knew him during these years generally refer only vaguely to his “problems,” and this discretion of course does them credit; I too have no desire to psychoanalyze the man. What does seem clear, however, is that he was growing increasingly unhappy as time wore on. He became more demanding of his colleagues, difficult enough to work with that many of them decided it just wasn’t worth it, even as he became more erratic in his own habits, perhaps due to an alcohol intake that struck many as alarming.

Yet Bunten was nothing if not an enigmatic personality. At the same time that close friends were worrying about his moodiness and his drinking, he could show up someplace like the Computer Game Developers Conference and electrify the attendees with his energy and ideas. Certainly his eyes could still light up when he talked about the games he was making and wanted to make. The worrisome questions were how much longer he would be allowed to make those games in light of their often meager sales, and, even more pressingly, why his eyes didn’t seem to light up about much else in his life anymore.

But, to return to the firmer ground of the actual games he was continuing to make: Modem Wars, his next one, marked the beginning of a new chapter in his tireless quest to get people playing computer games together. “We’ve failed at gathering people around the computer,” Bunten said before starting work on it. “We’re going to have to connect them out of the back by connecting their computers to each other.” He would make, in other words, a game played by two people on two separate computers, connected via modem.

Modem Wars was known as Sport of War until just prior to its release by EA in 1988, and in many ways that was a better title. Its premise is a new version of Bunten’s favorite sport of football, played not by individual athletes but by infantry, artillery, and even aircraft, if you can imagine such a thing. One might call it a mashup between two of his early designs for SSI: the strategic football simulator Computer Quarterback and the proto-real-time-strategy game Cytron Masters.

It’s the latter aspect that makes Modem Wars especially visionary. The game was nothing less than an online real-time-strategy death match years before the world had heard of such a thing. While a rudimentary artificial intelligence was provided for single-player play, it was made clear by the game’s very title that this was strictly a tool for learning to play rather than the real point of the endeavor. Daniel Hockman’s review of Modem Wars for Computer Gaming World ironically describes the qualities of online real-time strategy as a potential “problem” and “marketing weakness” — the very same qualities which a later generation would take as the genre’s main attractions:

A sizable number of gamers are not used to thinking in real-time situations. They can spend hours ordering tens of thousands of men into mortal combat, but they wimp out when they have to think under fire. They want to play chess instead of speed chess. They want to analyze instead of act. As the enemy drones zero in on their comcen, they throw up their hands in frustration when it’s knocked out before they can extract themselves from the maelstrom of fire that has engulfed them.

Whether because gamers really were daunted by this need to think on their feet or, more likely, because of the relative dearth of fast modems and stable online connections in 1988, Modem Wars became another crushing commercial disappointment for Bunten. EA declared themselves “hesitant” to keep pursuing this direction in the wake of the game’s failure. Rather than causing Bunten to turn away from multiplayer gaming, this loss of faith caused him to turn away from EA.

In the summer of 1989, MicroProse Software announced that they had signed a five-year agreement with Bunten, giving them first rights to all of the games he made during that period. The great hidden driver behind the agreement was MicroProse’s own star designer Sid Meier, who had never hidden his enormous admiration for Bunten’s work. Bunten doubtless hoped that a new, more supportive publisher would mark the beginning of a new, more commercially successful era in his career. And in the beginning at least, such optimism would, for once, prove well-founded.

Known at first simply as War!, then as War Room, and finally as Command H.Q., Bunten’s first game for MicroProse was aptly described by its designer as being akin to an abstract, casual board game of military strategy, like Risk or Axis & Allies. The big wrinkle was that this beer-and-pretzels game was to be played in real time rather than turns. But, perhaps in response to complaints about his previous game like those voiced by Daniel Hockman above, the pace is generally far less frenetic this time around. Not only can the player select an overall speed, but the program itself actually takes charge to speed up the action when not much is happening and slow it down when things heat up. Although a computer opponent is provided, the designer’s real focus was once more on modem-to-modem play.

But, whatever its designer’s preferences, MicroProse notably de-emphasized the multiplayer component in their advertising upon Command H.Q.‘s release in 1990, and this, combined with a more credible artificial intelligence for the computer opponent, gave it more appeal to the traditional wargame crowd than Modem Wars had demonstrated. Ditto a fair measure of evangelizing done by Computer Gaming World, with whom Bunten had always had a warm relationship, having even authored a regular column there for a few years in the mid-1980s. The magazine’s lengthy review concluded by saying, “This is the game we’ve all been waiting for”; they went on to publish two more lengthy articles on Command H.Q. strategy, and made it their “Wargame of the Year” for 1990. For all these reasons, Command H.Q. sold considerably better than had Bunten’s last couple of games; one report places its total sales at around 75,000 units, enough to make it his second most successful game ever.

With that to buoy his spirits, Bunten made big plans for his next game, Global Conquest. “Think of it as Command H.Q. meets Seven Cities of Gold meets M.U.L.E.,” he said. Drawing heavily from Command H.Q. in particular, as well as the old grand-strategy classic Empire, he aimed to make a globe-spanning strategy game where economics would be as important as military maneuvers. He put together a large and vocal group of play testers on CompuServe, and tried to incorporate as many of their suggestions as possible, via a huge options panel that allowed players to customize virtually every aspect of the game, from the rules themselves to the geography and topography of the planet they were fighting over, all the way down to the look of the icons representing the individual units. This time, up to four humans could play against one another in a variety of ways: they could all play together by taking turns on one computer, or they could each play on their own computer via a local-area network, or four players could share two computers that were connected via modem. The game was turn-based, but with an interesting twist designed to eliminate analysis paralysis: when the first player mashed the “next turn” button, everyone else had just twenty seconds to finish up their own turns before the execution phase began.

In later years, Dan Bunten himself had little good to say about what would turn out to be his last boxed game. In fact, he called it his absolute “worst game” of all the ones he had made. While play-testing in general is a wonderful thing, and every designer should do as much of it as possible, a designer also needs to keep his own vision for what kind of game he wants to make at the forefront. In the face of prominent-in-their-own-right, opinionated testers like Computer Gaming World‘s longtime wargame scribe Alan Emrich, Bunten failed to do this, and wound up creating not so much a single coherent strategy game as a sort of strategy-game construction set that baffled more than it delighted. “This game was a hodgepodge rather than an integration,” he admitted several years later. “It was just the opposite of the KISS doctrine. It was a kitchen-sink design. It had everything. Build your own game by struggling through several options menus.” He acknowledged as well that the mounting unhappiness in his personal life, which had now led to a divorce from his third wife, was making it harder and harder to do good work.

Released in 1992, Global Conquest under-performed commercially as well. In addition to the game’s intrinsic failings, it didn’t help matters that MicroProse had just five months prior released Sid Meier’s Civilization, another exercise in turn-based grand strategy on a global scale, also heavily influenced by Empire, that managed to be far more thematically and texturally ambitious while remaining more focused and playable as a game — albeit without the multiplayer element that was so important to Bunten.

But of course, there’s more to a game than whether it’s played by one person or more than one, and it strikes me as reasonable to question whether Bunten was beginning to lose his way as a designer in other respects even as he stuck so obstinately to his multiplayer guns. Setting aside their individual strengths and failings, the final three boxed games of Bunten’s career, with their focus on “wars” and “command” and “conquest,” can feel a little disheartening when compared to what came before. Games like M.U.L.E., Robot Rascals, and to some extent even Seven Cities of Gold and Heart of Africa had a different, friendlier, more welcoming personality. This last, more militaristic trio feels like a compromise, the product of a Dan Bunten who said that, if he couldn’t bring multiplayer gaming to the masses, he would settle for the grognard crowd, indulging their love for guns and tanks and bombs. So be it. Now, though, he was about to give that same crowd the shock of their lives.

In November of 1992, just months after completing the supremely masculine wargame Global Conquest, Dan Bunten had sexual-reassignment surgery, becoming the woman Danielle “Dani” Bunten Berry. (For continuity’s sake, I’ll generally continue to refer to her by the shorthand of “Bunten” rather than “Berry” for the remainder of this article.) It’s not for us to speculate about the personal trauma that must have accompanied such a momentous decision. What we can and should take note of, however, is that it was an unbelievably brave decision. For all that we still have a long way to go today when it comes to giving transsexuals the rights and respect they deserve, the early 1990s were a far less enlightened time than even our own on this issue. And it wasn’t as if Bunten could take comfort in the anything-goes anonymity of a New York City or San Francisco.  Dan Bunten had lived, and as Dani Bunten now continued to live, in the intensely conservative small-town atmosphere of Little Rock, Arkansas. Many of those closest to her disowned her, including her mother and her ex-wives, making it heartbreakingly difficult for her to maintain a relationship with her children. She had remained in Little Rock all these years, at no small cost to her career prospects, largely because of these ties of blood, which she had believed to be indissoluble. This rejection, then, must have felt like the bitterest of betrayals.

Dan Bunten with his beverage of choice.

The games industry as well, with its big-breasted damsels in distress and its machine-gun-toting male heroes, wasn’t exactly notable for its enlightened attitudes toward sex and gender. Many of Bunten’s old friends and colleagues would see her for the first time after her surgery and convalescence at the Game Developers Conference scheduled for April of 1993, and they looked forward to that event with almost as much trepidation as Bunten herself must have felt. It was all just so very unexpected. To whatever extent they had carried around a mental image of a man who would choose to become a woman, Dan Bunten didn’t fit the profile at all. He had been the games industry’s own Ozark Mountains boy, a true son of the South, always ready with his “folksy mountain humor” (read, “dirty jokes”). His rangy frame stood six feet two inches tall. He loved nothing more than a rough-and-tumble game of back-lot football, unless it be beer and poker afterward. As his three ex-wives and three children attested, he had certainly seemed to like women, but no one had ever imagined that he liked them enough to want to be one. What were they supposed to say to him — er, to her — now?

They needn’t have worried. Dani Bunten handled her coming-out party with the same low-key grace and humor she would display for the rest of her life as a woman. She said that she had made the switch to do her part to redress the gender imbalance inside the industry, and to help improve the aesthetics of game designers to match the improving aesthetics of their games. The tension dissipated, and soon everyone got into the spirit of the thing. A straw poll named Dani Bunten the game designer most likely to appear on the Oprah Winfrey Show. A designer named Gordon Walton had a typical experience: “I was put off when she made the change to become Dani, until the minute I spoke to her. It was clear to me she was much happier as Dani, and if anything an even more incredible person.” Another GDC regular remembered the “unhappy man” from the 1992 event, “sitting on the hallway floor drinking and smoking,” and contrasted him with the “happy woman” he now saw.

No one with any interest in the inner workings of those strangest of creatures, their fellow humans, could fail to be fascinated by Bunten’s dispatches from both sides of the gender divide. “Aren’t there things you’ve always wanted to know about women but were afraid to ask?” she said. “Well, now’s your chance!”

I had to learn a lot to actually “count” as a woman! I had to learn how to walk, speak, dress as a woman. Those little things which are necessary so that other people don’t [feel] alienated. There’s a little summary someone gave me to make clear what being a woman means: as a woman you have to sing when you speak, dance when you walk, and you have to open your heart… I know how stereotypical that sounds, but it is true! Speech for a man is something completely different: the melody of speech is fast, monotone, and decreases at the end of a sentence. Sometimes, this still happens to me, and people are always irritated. Female speech is a little bit like song – we have a lot more melody and different speech patterns. Walking is really a bit like dancing: slower and connected, with a lot of subtle movements. I enjoyed it at once.

She had few filters when talking about the nitty-gritty details:

One of the saddest changes I had to deal with after my operation was the fact that I couldn’t aim anymore when urinating. Boys — I have two little sons and a daughter — simply love to aim.

Bunten said that, in keeping with her new identity, she didn’t feel much desire to design any more wargames; this led to the end of her arrangement with MicroProse. By way of compensation, Electronic Arts that year released a nicely done “commemorative edition” of Seven Cities of Gold, complete with dramatically upgraded graphics and sound to suit the times. Bunten had little to nothing to do with the project, but it sold fairly well, and perhaps helped to remind her of her roots.

In the same spirit, Bunten’s first real project after her transformation became a new version of M.U.L.E. EA’s founder Trip Hawkins had always named that game as one of his all-time favorites, and had frequently stated how disappointed he was that it had never gotten the attention it deserved. Now, Hawkins had left his day-to-day management role at EA to run 3DO, a spin-off company peddling a multimedia set-top box for the living room. Hawkins thought M.U.L.E. would be perfect for the platform, and recruited Bunten to make it happen. It was a dream project; showing excellent taste, she still regarded M.U.L.E. as the best thing she had ever done. But the dream quickly began to sour.

3DO first requested that, instead of taking turns managing their properties on the map, players all be allowed to do so simultaneously. Bunten somewhat reluctantly agreed. And then:

As soon as I added the simultaneity, it instantly put into their heads, “Why can’t we shoot at each other?” And I said, “No guns.” And they said, “What about bombs? Can we drop a bomb in front of you? It won’t hurt you. It will be a cartoon thing, it will just slow you down.” And I said, “You don’t get it. It’s changing the whole notion of how this thing works!”

[3DO is] staking its future on the idea of a new generation of hardware and therefore, you’d assume, a new generation of software, but they said, “No, our market is still 18 to 35, male. We need something with action, something with intensity.” Chrome and sizzle. Ugh.

In the end, Bunten walked out, disappointed enough that she seriously considered getting out of games altogether, going so far as to apply for jobs as the industrial engineer Dan Bunten had once been before his first personal computer came along.

Instead she found a role with a new company called Mpath as a design and strategy consultant. The goal of that venture was to bring multiplayer gaming to the new frontier of the World Wide Web, and its founders included her fellow game designer Brian Moriarty, of Infocom and LucasArts fame. She also studied the elusive concept of “games for girls” in association with a think tank set up by Microsoft co-founder Paul Allen; some of her proposals would later come to market as the products of Purple Moon, Brenda Laurel’s brief-lived but important publisher of games for girls aged 8 to 14.

Offers to do conventional boxed games as sole designer, however, weren’t forthcoming; how much that was down to lingering personal prejudices against her for her changed sex and how much to the fact that the games she wanted to make just weren’t considered commercially viable must always be open for debate. Refusing as usual to be a victim, Bunten said that her “priorities had shifted” since her change anyway: “I don’t identify myself with the job as strongly as before.” Deciding that, for her, heaven was other people after a life spent programming computers, she devoured anthropology texts and riffed on Carl Jung’s theories of a collective unconscious. “Literature, anthropology, and even dance,” she noted, “have a good deal more to teach designers about human drives and abilities than the technologists of either end of California, who know silicon and celluloid but not much else.” So, she bided her time as a designer, waiting for a more inclusive ludic future to arrive. At the 1997 GDC, she described a prescient vision of “small creative shops” freed from the inherent conservatism of the “distribution trap” by the magic of the Internet.

That future would indeed come to pass — but, sadly, not in time for Dani Bunten Berry to see it. Shortly after delivering that speech, she went to see her doctor about a persistent cough, whereupon she was diagnosed with an advanced case of lung cancer. In one of those cruel ironies which always seem to dog the lives of us poor mortals, she had finally kicked a lifelong habit of heavy smoking just a few months before.

She appeared in public for the last time in May of 1998. The occasion was, once again, the Game Developers Conference, where she had always shone so. She struggled audibly for breath as she gave the last presentation of her life, entitled “Do Online Games Still Suck?,” but her passion carried her through. At the end of the conference, at a special ceremony held aboard the Queen Mary in Long Beach Harbor, she was presented with the first ever GDC Lifetime Achievement Award. The master of ceremonies for that evening was her friend and colleague Brian Moriarty, who knew, like everyone else in attendance, that the end was near. He closed his heartfelt tribute thus:

It is no exaggeration to characterize tonight’s honoree as the world’s foremost authority on multiplayer computer games. Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes the prototypes of the defining art form of the 21st century.

As one of those historians, I can only heartily concur with his assessment.

It would be nice to say that Dani Bunten passed peacefully to her rest. But, as anyone with any experience with lung cancer will recognize, that just isn’t how the disease works. Throughout her life, she had done nothing the easy way, and her death — ugly, painful, and slow — was no exception. On the brighter side, she did reconcile to some extent with her mother and other family members and friends who had rejected her. The end came on July 3, 1998. Rather incredibly in light of the prodigious, multifaceted life she had lived, she was just 49 years old.

It’s a life which resists pigeonholing or sloganeering. Bunten herself explicitly rejected the role of transgender advocate, inside or outside of the games industry. Near the end of her life, she expressed regret for her decision to change her physical sex, saying she could have found ways to live in a more gender-fluid way without taking such a drastic step. Whether this was a reasoned evaluation or a product of the pain and trauma of terminal illness must remain, like so much else about her, an enigma.

What is clear, however, is that Bunten, through the grace and humor with which she handled her transition and through her refusal to go away and hide thereafter as some might have wished, taught others in the games industry who were struggling with similar issues of identity that a new gender need not mean a decisive break with every aspect of one’s past — that a prior life in games could continue to be a life in games even with a different pronoun attached. She did this in a quieter way than the speechifying some might have wished for from her, but, nevertheless, do it she did. Jessica Mulligan, who transitioned from male to female a few years after her, remembers meeting Bunten shortly before her own sexual-reassignment surgery, hoping to hear some “profound words on The Transition”: “While I was looking for spiritual guidance, she was telling me where to shop for shoes. Talk about keeping someone honest! Every change in our personal lives is profound to us. You still have to pay attention to the nuts and bolts or the change is meaningless.”

Danielle Bunten Berry does her makeup.

For some, of course — even for some with generally good intentions — Danielle Bunten Berry’s transgenderism will always be the defining aspect of her life, her career in games a mere footnote to that other part of her story. But that’s not how she would have wanted it. She regarded her games as her greatest legacy after her children, and would doubtless want to be remembered as a game designer above all else.

Back in 1989, after Modem Wars had failed in the marketplace, Electronic Arts decided that the lack of “a network of people to play” was a big reason for its failure. The great what-if question pertaining to Bunten’s career is what she might have done in partnership with an online network like CompuServe, which could have provided stable connectivity along with an eager group of players and all the matchmaking and social intrigue anyone could ask for. She finally began to explore this direction late in her life, through her work with Mpath. But what might have happened if she had made the right connections — forgive the pun! — earlier? We can only speculate.

As it is, though, it’s true that, in terms of units shifted and profits generated, there have been far more impressive careers. She suffered the curse of any pioneer who gets too far out in front of the culture. All of her eleven games combined probably sold no more than 400,000 copies at the outside, a figure some prominent designers’ new games can easily better in their first week today. Certainly her commercial disappointments far outnumber her successes. But then, sales aren’t the only metric by which to measure success.

Dani Bunten, one might say, is the designer’s designer. Greg Costikyan once told what happened when he offered to introduce Warren Spector — one of those designers who can sell more games in a week than Bunten did in a lifetime — to her back in the day: “He regretfully refused; he had loved M.U.L.E. so much he was afraid he wouldn’t know what to say. He would sound like a blithering fanboy and be embarrassed.” Chris Crawford calls the same title simply “the best computer-game design of all time.” Brenda Laurel dedicated Purple Moon’s output to Bunten. Sid Meier was so taken with Seven Cities of Gold that Pirates!, Railroad Tycoon, and Civilization, his trilogy of masterpieces, can all be described as extensions in one way or another of what Bunten first wrought. And Seven Cities of Gold was only Meier’s second favorite Bunten game: he loved M.U.L.E. so much that he was afraid to even try to improve on it.

Ironically, the very multiplayer affordances that Bunten so steadfastly refused to give up on, much to the detriment of her income, continue to make it difficult for her games to be seen at their best today. M.U.L.E. can be played as its designer really intended it only on an Atari 8-bit computer — real or emulated — with four vintage joysticks plugged in and four players holding onto them in a single living room; that is, needless to say, not a trivial thing to arrange in this day and age. Likewise, the need to have the exceedingly rare physical cards to hand has made it impossible for most people to even try out Robot Rascals today. (It took me months to track down a pricey German edition on eBay.) And Bunten’s final run of boxed games, reliant on ancient modem hookups as they are, are even more difficult to play with others today than they were in their own time.

Dani Bunten didn’t have an easy life, internally or externally. She remained always an enigma — the life of the party who goes home alone, the proverbial stranger among her best friends. One person who knew her after she became a woman claimed she still had a “shadowed, slightly haunted look, even when she was smiling.” Given the complicated emotions that are still stirred up in so many of us by transgenderism, that may have been projection. On the other hand, though, it may have been perception. Even Bunten’s childhood had been haunted by the specter of familial discord and possibly abuse, to such an extent that she refused to talk much about it. But she did once tell Greg Costikyan that she grew up loving games mainly because it was only when playing them that her family wasn’t “totally dysfunctional.”

I think that for Dani Bunten games were most of all a means of communication, a way of punching through that bubble of ego and identity that isolates all of us to one degree or another, and that perhaps isolated her more so than most. Thus her guiding vision became, as Sid Meier puts it, “the family gathered around the computer.” After all, it’s a small step to go from communicating to connecting, from connecting to loving. She openly stated that she had made Robot Rascals for her own family most of all: “They’ve never played my games. I think they found them too esoteric or complex. I wanted something that I could enjoy with them, that they’d all be able to relate to.” The tragedy for her — perhaps a key to the essential sadness many felt at Bunten’s core, whether she was living as a man or a woman — is that reality never quite lived up to that Norman Rockwell dream of the happy family gathered around a computer; her daughter, the duly appointed caretaker of her legacy, still calls M.U.L.E. “boring and tedious” today. But the dream remains, and her games have given those of us privileged to discover them great joy and comfort in the midst of lives that have admittedly — hopefully! — been far easier than that of their creator. And so I’ll close, in predictable but unavoidable fashion, with Danielle Bunten Berry’s most famous quote — a quote predictable precisely because it so perfectly sums up her career: “No one on their death bed ever said, ‘I wish I had spent more time alone with my computer!'” Words to live by, my fellow gamers. Words to live by.

Danielle Bunten Berry, 1949-1998.

(Sources: Compute! of March 1989, December 1989, April 1990, January 1992, and December 1993; Questbusters of May 1986; Commodore Power Play of June/July 1986; Commodore Magazine of July 1987, October 1988, and June 1989; Ahoy! of March 1987; Computer Gaming World of January/February 1987, May 1988, February 1989, February 1990, December 1990, February 1991, March 1991, May 1991, April 1992, June 1992, August 1992, June 1993, August 1993, July 1994, September 1995, and October 1998; Family Computing of January 1987; Compute!’s Gazette of August 1989; The One of April 1991; Game Players PC Entertainment of September 1992; Game Developer of February/March 1995, July 1998, September 1998, and October 1998; Electronic Arts’s newsletter Farther of Winter 1986; Power Play of January 1995; Arkansas Times of February 8 2012. Online sources include the archived contents of the old World of Mule site, the archived contents of a Danielle Bunten Berry tribute site, the Salon article “Get Behind the M.U.L.E.”, and Bunten’s interview at Halcyon Days.)

 
Posted by on November 16, 2018 in Digital Antiquaria, Interactive Fiction

The Lost Files of Sherlock Holmes

In 1989, Trip Hawkins reluctantly decided to shift Electronic Arts’s strategic focus from home computers to videogame consoles, thereby to “reach millions of customers.” That decision was reaching fruition by 1992. For the first time that year, EA’s console games outsold those they published for personal computers. The whole image of the company was changing, leaving behind the last vestiges of the high-toned “software artists” era of old in favor of something less intellectual and more visceral — something aimed at the mass market rather than a quirky elite.

Still, corporate cultures don’t change overnight, and the EA of 1992 continued to release some computer games which were more in keeping with their image of the 1980s than that of this new decade. One of the most interesting and rewarding of these aberrations — call them the product of corporate inertia — was a game called The Lost Files of Sherlock Holmes, whose origin story doesn’t exactly lead one to expect a work of brilliance but which is in fact one of the finest, most faithful interpretations of the legendary detective in the deerstalker cap ever to make its way onto a monitor screen.

The initial impetus for Lost Files was provided by an EA producer named Christopher Erhardt. After studying film and psychology at university, Erhardt joined the games industry in 1987, when he came to Infocom to become the in-house producer for their latter-day lineup of graphical games from outside developers, such as Quarterstaff, BattleTech: The Crescent Hawk’s Inception, and Arthur: The Quest for Excalibur. When Infocom was shuttered in 1989, he moved on to EA in the same role, helming a number of the early Sega Genesis games that did so much to establish the company’s new identity. His success on that front gave him a fair amount of pull, and so he pitched a pet idea of his: for a sort of computerized board game that would star Sherlock Holmes along with a rotating cast of suspects, crimes, and motives, similar to the old 221B Baker Street board game as well as a classic computer game from Accolade called Killed Until Dead. It turned out that EA’s management weren’t yet totally closed to the idea of computer games that were, as Erhardt would later put it, “unusual and not aimed at the mass market” — as long, that is, as they could be done fairly inexpensively.

Mythos Software. On the top row are James Ferguson, Elinor Mavor, and Scott Mavor. On the bottom row are John Dunn and David Wood.

In order to meet the latter condition, Erhardt enlisted a tiny Tempe, Arizona, company known as Mythos Software — not to be confused with the contemporaneous British strategy-games developer Mythos Games. This Mythos was being run by one James Ferguson, its fresh-out-of-university founder, from the basement of his parents’ house. He was trying to break into the wider world of software development that lay outside the bounds of the strictly local contracts he had fulfilled so far; his inexperience and eagerness ensured that Mythos would work cheap. And in addition to cut-rate pricing, Ferguson had another secret weapon to deploy: an artist named Scott Mavor who had a very special way with pixel graphics, a technique that EA’s in-house employees would later come to refer to as “the Mavor glow.” The highly motivated Mythos, working to Erhardt’s specifications, created a demo in less than two weeks that was impressive enough to win the project a tentative green light.

Eric Lindstrom and R.J. Berg.

Another EA employee, a technical writer named Eric Lindstrom, saw the demo and suggested turning what had been planned as a computerized board game into a more narratively ambitious point-and-click adventure game. When Erhardt proved receptive to the suggestion, Lindstrom put together the outline of a story, “The Case of the Serrated Scalpel.” He told Erhardt that he knew the perfect person to bring the story to life: one of his colleagues among EA’s manual writers, a passionate Sherlock Holmes aficionado — he claimed to have read Arthur Conan Doyle’s complete canon of Holmes stories “two or three times” — named R.J. Berg.

The project’s footing inside EA was constantly uncertain. Christopher Erhardt says he “felt like I was playing the Princess Bride, and the dread pirate Roberts was coming. It was always, ‘Yep – we may cancel it.'” But in the end the team was allowed to complete their point-and-click mystery, despite it being so utterly out of step with EA’s current strategic focus, and it was quietly released in the fall of 1992.

I find the critical dialog that followed, both in the immediate wake of Lost Files‘s release and many years later in Internet circles, to be unusually interesting. In particular, I’d like to quote at some length from Computer Gaming World‘s original review, which was written by Charles Ardai, one of the boldest and most thoughtful — and most entertaining — game reviewers of the time; this I say even as I find myself disagreeing with his conclusions far more often than not. His review opens thus:

If there is any character who has appeared in more computer games than Nintendo’s plump little goldmine, Mario, it has to be Sherlock Holmes. There have been almost a dozen Holmes-inspired games over the years, one of the best being Sherlock Holmes Consulting Detective, which is currently available in two different CD-ROM editions from ICOM. Other valiant attempts have included Imagic’s Sherlock Holmes in Another Bow, in which Holmes took a sea voyage with Gertrude Stein, Picasso, Thomas Edison, and Houdini, among others; and Infocom’s deadly serious Sherlock: Riddle of the Crown Jewels.

The difference between Holmes and Mario games, however, is that new Mario games are always coming out because the old ones sold like gangbusters, while new Sherlock Holmes games come out in spite of the fact that their predecessors sold like space heaters in the Sahara. It is noteworthy that, until ICOM, no company had ever released more than one Sherlock Holmes game, while all the Mario games come from the same source. It is also worth noting that the Holmes curse is not limited to games: the last few Holmes movies, such as Without a Clue and Young Sherlock Holmes, were not exactly box-office blockbusters.

The paradox of Sherlock Holmes can be stated so: while not that many people actually like the original Sherlock Holmes stories, everyone seems to think that everyone else adores them. Like Tarzan and Hawkeye, Holmes is a literary icon, universally known and much-beloved as a character in the abstract — not, however, as part of any single work. Finding someone who has actually read and enjoyed the writing of Edgar Rice Burroughs, James Fenimore Cooper, or Arthur Conan Doyle requires the patience of Diogenes. Most people know the character from television and the movies, at best; at worst, from reviews of television shows and movies they never bothered to see.

So, why do new Holmes adaptations surface with such regularity? Because the character is already famous and the material is in the public domain (thereby mitigating the requisite licensing fees associated with famous characters of more recent vintage; Batman or Indiana Jones, for instance). Another answer is that Sherlock Holmes is seen as bridging the gap between entertainment and literature. Game companies presumably hope to cash in on the recognition factor and have some of the character’s ponderous respectability rub off on their product. They also figure that they can’t go wrong basing their games on a body of work that has endured for almost a century.

Unfortunately for them, they are wrong. There are only so many copies of a game that one can sell to members of the Baker Street Irregulars (the world’s largest and best-known Sherlock Holmes fan club), and a vogue for Victoriana has never really caught on among the rest of the game-buying population. The result is that, while Holmes games have been good, bad, and indifferent, their success has been uniformly mediocre.

This delightfully cynical opening gambit is so elegantly put together that one almost hesitates to puncture its cogency with facts. Sadly, though, puncture we must. While there were certainly Sherlock Holmes games released prior to Lost Files that flopped, there’s no evidence to suggest that this was the fault of the famous detective with his name on the box, and plenty of evidence to the contrary: that his name could, under the right circumstances, deliver at least a modest sales boost. In addition to the Sherlock Holmes Consulting Detective CD-ROM productions, a counter-example to Ardai’s thesis that’s so huge even he has to acknowledge it — the first volume of that series sold over 1 million units — there’s also the Melbourne House text adventure Sherlock; that game, the hotly anticipated followup to the bestselling-text-adventure-of-all-time The Hobbit, likely sold well over 100,000 units in its own right in the much smaller market of the Europe of 1984. Even Infocom’s Riddle of the Crown Jewels, while by no means a smash hit, sold significantly better than usual for an Infocom game in the sunset of the company’s text-only era. (Nor would I describe that game as “deadly serious” — I could go with “respectful” at most — but that’s perhaps picking nits.)

Still, setting aside those inconvenient details, it’s worth considering this broader question of just why there have been so many Sherlock Holmes games over the years. Certainly the character doesn’t have the same immediate appeal with the traditional gaming demographic as heavyweight properties like Star Wars and Star Trek, Frodo Baggins and Indiana Jones — or, for that matter, the born-in-a-videogame Super Mario. The reason for Sherlock’s ubiquity in the face of his more limited appeal is, of course, crystal clear, as Ardai recognizes: he’s in the public domain, meaning anyone who wishes to can make a Sherlock Holmes game at any time without paying anyone.1

If you’re going to do Sherlock Holmes, you just have to get the fog right.

As such, Holmes occupies a nearly unique position in our culture. He’s one of the last great fictional icons, historically speaking, who’s so blessedly free of intellectual-property restrictions. Absolutely everyone, whether they’ve ever read a story or seen a film featuring him or not, knows him. The only characters with a remotely similar degree of recognizability who postdate him are Dracula, the Wizard of Oz, and Peter Pan — and neither of the latter two, at least, presents writers with quite the same temptation to tell new story after story after story.

As is noted in Lost Files‘s manual, Sherlock Holmes has become such an indelible part of our cultural memory that when we see him we experience a sort of vicarious nostalgia for a London none of us ever knew: “Gas lamps, the sound of horses’ hooves, steam locomotives, and romantic street cries. And then there is the atmosphere of that cozy room in Baker Street: Holmes in his armchair before a roaring coal fire, legs stretched out before him, listening with Dr. Watson to yet another bizarre story.” One might say that Sherlock Holmes gets the chronological balance just right, managing to feel both comfortably, nostalgically traditional and yet also relevant and relatable. In contrast to the Victorian scenery around him, his point of view as a character feels essentially modern, applicable to modern modes of storytelling. I’m not sure that any other fictional character combines this quality to quite the same extent with a freedom from copyright lawyers. These factors have fostered an entire creative subculture of Sherlockia which spans the landscape of modern media, dwarfing Arthur Conan Doyle’s canonical four novels and 56 short stories by multiple orders of magnitude.

The relative modernity of Sherlock Holmes is especially important in the context of interactive adaptations. The player of any narrative-driven game needs a frame of reference — needs to understand what’s expected of her in the role she’s expected to play. Thankfully, the divide between Sherlock Holmes and the likes of C.S.I. is a matter of technology rather than philosophy; Sherlock too solves crimes through rationality, combining physical evidence, eyewitness and suspect interviews, and logical deduction to reach a conclusion. Other legendary characters don’t share our modern mindset; it’s much more difficult for the player to step into the role of an ancient Greek hero who solves problems by sacrificing to the gods or an Arthurian knight who views every event as a crucible of personal honor. (Anyone doubtful of Sherlock Holmes’s efficacy in commercial computer games should have a look at the dire commercial history of Arthurian games.)

With so much material to make sense of, post-Doyle adventures of Sherlock Holmes get sorted on the basis of various criteria. One of these is revisionism versus faithfulness. While some adaptations go so far as to transport Sherlock and his cronies hook, line, and sinker into our own times, others make a virtue out of hewing steadfastly to the character and setting described by Arthur Conan Doyle. This spirit of Sherlockian fundamentalism, if you will, is just one more facet of our long cultural dialog around the detective, usually manifesting as a reactionary return to the roots when other recent interpretations are judged to have wandered too far afield.

No matter how much the Sherlockian fundamentalists kick and scream, however, the fact remains that the Sherlock Holmes of the popular imagination has long since become a pastiche of interpretations reflecting changing social mores and cultural priorities. That’s fair enough in itself — it’s much of the reason why Doyle’s timeless sleuth remains so timeless — but it does make it all too easy to lose sight of Holmes and Watson as originally conceived in the stories. Just to cite the most obvious example: Holmes’s famous deerstalker cap is never mentioned in the text of the tales, and only appeared on a few occasions in the illustrations that originally accompanied them. The deerstalker became such an iconic part of the character only after it was sported by the actor Basil Rathbone as an item of daily wear — an odd choice for the urban Holmes, given that it was, as the name would imply, a piece of hunting apparel normally worn by sporting gentlemen in the countryside — in a long series of films, beginning with The Hound of the Baskervilles in 1939.

Although Lost Files doesn’t go so far as to forgo the deerstalker — there are, after all, limits to these things — it does generally try to take its cue from the original stories rather than the patchwork of interpretations that followed them. Berg:

I definitely aimed for Holmesian authenticity. I’d like to think that, if he were alive, Doyle would like the game. After all, the characters of Holmes and Watson have been manipulated quite a bit by the various media they’ve appeared in, especially the films. For example, the Watson of Lost Files is definitely Doyle’s Watson, competent and intelligent, rather than the bumbling character portrayed in many of the movies. I also wanted to retain Holmes’s peculiar personality. He’s really not that likable a character; he’s arrogant, a misogynist, and extremely smug.

This spirit of authenticity extends to the game’s portrayal of Victorian London. There are, I’ve always thought, two tiers when it comes to realistic portrayals of real places in fiction. Authors on the second tier have done a whole lot of earnest research into their subject, and they’re very eager to show it all to you, filling your head with explicit descriptions of things which a person who actually lived in that setting would never think twice about, so ingrained are they in daily existence. Authors on the top tier, by contrast, have seemingly absorbed the setting through their pores, and write stories that effortlessly evoke it without beating you over the head with all the book research they did to reach this point of imaginative mastery.

Indeed, Sherlock. Leaving the cozy environs of 221B Baker Street.

Lost Files largely meets the latter bar as it sends you around to the jewelers and tobacconists, theaters and pubs, opulent mansions and squalid tenements of fin-de-siècle London. The details are there for when you need them or decide to go looking for them; just try mousing around the interior of 221B Baker Street. (“A typical sitting-room chair. The sheen of its wine-red velveteen covering shows that it is well-used. A dark purple silk dressing gown with a rolled collar is carelessly crumpled on the seat and the antimacassar requires changing.”) More impressive, though, is the way that the game just breathes its setting in that subtle way that can only be achieved by a writer with both a lighter touch and countless hours of immersion in the period at his command. For example, Berg spent time reading Charles Dickens as well as Arthur Conan Doyle in order to capture the subtle rhythms of Victorian English in his conversations. This version of Holmes’s London isn’t the frozen-in-amber museum exhibit it sometimes comes off as in other works of Sherlockia. “We wanted a dirty game,” says Eric Lindstrom. “We wanted people to feel that people were burning coal, that they could see who was walking in the streets. Just as it was in London at the time.”

There is, however, one important exception to the game’s rule of faithfulness to the original stories: Lost Files presents a mystery that the player can actually solve. In light of the place Holmes holds in our cultural memory as the ultimate detective, one of the great ironies of Doyle’s stories is that they really aren’t very good mysteries at all by the standard of later mystery fiction — a standard which holds a good mystery to be an implicit contest between writer and reader, in which the reader is presented with all the clues and challenged to solve the case before the writer’s detective does so. Doyle’s stories cheat egregiously by this standard, hiding vital evidence from the reader, and often positing a case’s solution on a chain of conjecture that’s nowhere near as ironclad as the great detective presents it to be. Eric Lindstrom:

The [original] stories do not work the way we are used to today. They are not whodunnits; whodunnits only became popular later. Readers have virtually no way of finding out who the culprit is. Sometimes the offender does not even appear in the plot. These are adventure stories narrated from the perspective of Dr. Watson.

For obvious reasons, Lost Files can’t get away with being faithful to this aspect of the Sherlock Holmes tradition. And so the mystery it presents is straight out of Arthur Conan Doyle — except that it plays fair. Notably, you play as Holmes himself, not, as in the original stories, as Watson. Thus you know what Holmes knows, and the game can’t pull the dirty trick on you, even if it wanted to, of hiding information until the big reveal at the end. Many other works of Sherlockia — even the otherwise traditionalist ones — adopt the same approach, responding to our post-nineteenth-century perception of what a good mystery story should be.

And make no mistake: “The Case of the Serrated Scalpel” is a very good mystery indeed. I hesitate to spoil your pleasure in it by saying too much, and so will only state that what begins as the apparently random murder of an actress in an alley behind the Regency Theatre — perhaps by Jack the Ripper, leaving Whitechapel and trying his hand in the posher environs of Mayfair? — keeps expanding in scope, encompassing more deaths and more and more Powerful People with Secrets to Keep. As I played, I was excited every time I made a breakthrough. Even better, I felt like a detective, to perhaps a greater extent than in any computer game I’ve ever played. Among games in general, I can only compare the feeling of solving this mystery to that of tackling some of the more satisfying cases in the Sherlock Holmes Consulting Detective tabletop game.

Part of the reason the mystery comes together so well is just down to good adventure-game design principles, of the sort which eluded so many other contemporary practitioners of the genre. Berg:

The idea was to produce a game that was different from existing adventures, which I frankly felt were often tedious. We wanted to eliminate the elements that tend to detract from the reality of the experience — things like having to die in order to learn some crucial information, constantly having to re-cover the same territory, and the tendency to simply pick up and use every object you encounter. We wanted to give players a deeper experience.

So, there are none of the dreaded adventure-game dead ends in Lost Files. More interestingly, the design does, as Berg alludes above, mostly eschew the typical use-unlikely-object-in-unlikely-place model of gameplay. Tellingly, the few places where it fails to do so are the weakest parts of the game.

As I’ve noted before, the classic approach to the adventure game, as a series of physical puzzles to solve, can be hugely entertaining, but it almost inevitably pushes a game toward comedy, often in spite of its designers’ best intentions. Most of us have played alleged interactive mysteries that leave you forever messing about with slider puzzles and trivial practical problems of the sort that any real detective would solve in five minutes, just by calling for backup. In Infocom’s Sherlock: Riddle of the Crown Jewels, for example, you learn that a stolen ruby is hidden in the eye of the statue of Lord Nelson on top of Nelson’s Column, and then get to spend the next little while trying to get a pigeon to fetch it for you instead of, you know, just telling Inspector Lestrade to send out a work crew. Lost Files does its level best to resist the siren call of the trivial puzzle, and, with only occasional exceptions, it succeeds. Thereby is the game freed to become one of the best interactive invocations of a classic mystery story ever. You spend your time collecting and examining physical evidence, interviewing suspects, and piecing together the crime’s logic, not solving arbitrary road-block puzzles. Lost Files is one of the few ostensibly serious adventure games of its era which manages to maintain the appropriate gravitas throughout, without any jarring breaks in tone.

This isn’t to say that it’s po-faced or entirely without humorous notes; the writing is a consistent delight, filled with incisive descriptions and flashes of dry wit, subtle in all the ways most computer-game writing is not. Consider, for example, this description of a fussy jeweler: “The proprietor is a stern-looking woman, cordial more through effort than personality. She frequently stares at the cleaning girl who tidies the floor, to make sure she is still hard at work.” Yes, this character is a type more than a personality — but how deftly is that type conveyed! In two sentences, we come to know this woman. I’d go so far as to call R.J. Berg’s prose on the whole better than that of the rather stolid Arthur Conan Doyle, who tended to bloviate on a bit too much in that all too typical Victorian style.

The fine writing lends the game a rare quality that seems doubly incongruous when one considers the time in which it was created, when multimedia was all the rage and everyone was rushing to embrace voice acting and “interactive movies.” Ditto the company which published it, who were pushing aggressively toward the forefront of the new mass-media-oriented approach to games. In spite of all that, darned if Lost Files doesn’t feel downright literary — thoughtful, measured, intelligent, a game to take in slowly over a cup of tea. Further enhancing the effect is its most distinctive technical feature: everything you do in the game is meticulously recorded in an in-game journal kept by the indefatigable Dr. Watson. The journal will run into the hundreds of onscreen “pages” by the time you’re all done. It reads surprisingly well too; one can easily imagine printing it out — the handy option to print it or save it to a file is provided — and giving it to someone else to read with pleasure. That’s a high standard indeed, one which vanishingly few games could meet. But I think that The Lost Files of Sherlock Holmes just about manages it.

Having given so much praise to Lindstrom and Berg’s design and writing, I have to give due credit as well to Mythos Software’s efforts to bring it all to life. The interface of Lost Files is thoroughly refined and pleasant to work with, a remarkable achievement considering that this was the first point-and-click graphic adventure to be made by everyone involved. An optional but extremely handy hotspot finder minimizes the burden of pixel hunting, and the interface is full of other thoughtful touches, like a default action that is attached to each object; this saves you more often than not from having to make two clicks to carry out an action.

Finally, praise must be given to Scott Mavor’s “Mavor glow” graphics as well. To minimize the jagged edges typical of pictures drawn in the fairly low resolution of 256-color VGA graphics, Mavor avoided sharp shifts in color from pixel to pixel. Instead he blended his edges together gradually, creating a lovely, painterly effect that does indeed almost seem to glow. Scott’s mother Elinor Mavor, who worked with him to finish up the art in the latter stages of the project:2

Working with just 256 colors, Scott showed me how he created graduating palettes of each one, which allowed him to do what he called “getting rid of the dots” in each scene. To further mute the pixels, he kept the colors on the darker side, which also enhanced the Victorian mood.

Weaving the illusion of continuous-tone artwork with all those little “dots” made us buggy-eyed after a long day’s work. One night, I woke up, went into the bathroom, turned on the light, and the world just pixilated in front of me. Scary imprints on my retinas had followed me away from the computer monitor, rendering my vision as a pointillistic painting à la George Seurat.

While the graphics of its contemporaries pop out at you with bright, bold colors, the palette of Lost Files of Sherlock Holmes smacks more of the “brown sauce” of the old masters — murky, mysterious, not initially jaw-dropping but totally in keeping with the mood of the script. As you, playing the diligent detective, begin to scan them carefully, the pictures reveal more and more details of the sort that are all too easy to overlook at a quick glance. It makes for an unusually mature aesthetic statement, and a look that can be mistaken for that of no other game.

Backstage at the opera.

Given all its strengths, I find it surprising that Lost Files has gotten more than its share of critical flak over the years. I have a theory as to why that should be, but before I get to that I’ll let one of the naysayers share his point of view. Even after admitting that the story is “a ripping yarn,” the graphics atmospheric, the period details correct, and the writing very good, Charles Ardai concludes his review thusly:

Don’t get me wrong: the dialogue is well-written, the choices are entertaining, and in most cases the actions the game requires the player to perform are very interesting. The story is good and the game is a pleasure to watch. Yet, that is what one does — watch.

This game wants, more than anything in the world, to be a Sherlock Holmes movie. Though it would be a very good one if it were, it is not. Therefore, it is deeply and resoundingly unsatisfying. The plot unfolds quite well, with plenty of twists, but the player has no more control over it than he would if he were reading a novel. The player is, at best, like an actor in a play. Unfortunately, said actor has not been given a copy of the script. He has to hit his marks and say his lines by figuring out the cues given by the other characters and reading his lines off the computer equivalent of cue cards.

If this is what one wants — a fine Sherlock Holmes pastiche played out on the computer screen, with the player nominally putting the lead character through his paces — fine. “The Case of the Serrated Scalpel” delivers all that one could hope for in that vein. If one wants a game — an interactive experience in which one’s decisions have an effect on what happens — this piece of software is likely to disappoint.

The excellent German podcast Stay Forever criticized the game along similar — albeit milder — lines in 2012. And in his mostly glowing 2018 review of the game for The Adventure Gamer joint-blogging project, Joe Pranevich likewise noted a certain distancing effect, which he said made him feel not so much like he was playing Sherlock Holmes and solving a mystery as watching Sherlock do the solving. The mystery, he further notes — correctly — can for the most part be brute-forced by the patient but obtuse player, simply by picking every single conversation option when talking to every single character and showing each of them every single object you’ve collected.

At the extreme, criticisms like these would seem to encroach on the territory staked out by the noted adventure-game-hater Chris Crawford, who insists that the entire genre is a lie because it cannot offer the player the ability to do anything she wants whenever she wants. I generally find such complaints to be a colossal bore, premised on a misunderstanding of what people who enjoy adventure games find most enjoyable about them in the first place. But I do find it intriguing that these sorts of complaints keep turning up so often in the case of this specific game, and that they’re sometimes voiced even by critics generally friendly to the genre. My theory is that the mystery of Lost Files may be just a little bit too good: it’s just enticing enough, and just satisfying enough to slowly uncover, that it falls into an uncanny valley between playing along as Sherlock Holmes and actually being Sherlock Holmes.

But of course, playing any form of interactive fiction must be an imaginative act on the part of the player, who must be willing to embrace the story being offered and look past the jagged edges of interactivity. Certainly Lost Files is no less interactive than most adventure games, and it offers rich rewards that few can match if you’re willing to not brute-force your way through it, to think about and really engage with its mystery. It truly is a game to luxuriate in and savor like a good novel. In that spirit, I have one final theory to offer you: I think this particular graphic adventure may be especially appealing to fans of textual interactive fiction. Given its care for the written word and the slow-build craftsmanship of its plotting, it reminds me more of a classic Infocom game than most of the other, flashier graphic adventures that jostled with it for space on store shelves in the early 1990s.

Which brings me in my usual roundabout fashion to the final surprising twist in this very surprising game’s history. After its release by a highly skeptical EA, its sales were underwhelming, just as everyone had been telling Christopher Erhardt they would be all along. But then, over a period of months and years, the game just kept on selling at the same slow but steady clip. It seemed that computer-owning Sherlock Holmes aficionados weren’t the types to rush out and buy games when they were hot. Yet said aficionados apparently did exist, and they apparently found the notion of a Sherlock Holmes adventure game intriguing when they finally got around to it. (Somehow this scenario fits in with every stereotype I carry around in my head about the typical Sherlock Holmes fan.) Lost Files’s sales eventually topped the magical 100,000-unit mark that separated a hit from an also-ran in the computer-games industry of the early- and mid-1990s.

It wasn’t a very good idea, but they did it anyway. R.J. Berg on a sound stage with an actress, filming for the 3DO version of Lost Files of Sherlock Holmes. Pictures like this were in all the games magazines of the 1990s. Somehow such pictures — not to mention the games that resulted from them — seem far more dated than Pong these days.

Lost Files of Sherlock Holmes may not have challenged the likes of John Madden Football in the sales sweepstakes, but it did make EA money, and some inside the company did notice. In 1994, they released a version for the 3DO multimedia console. For the sake of trendiness, this version added voice acting and inserted filmed footage of actors into the conversation scenes, replacing the lovely hand-drawn portraits in the original game and doing it no new aesthetic favors in the process. In 1996, with the original still selling tolerably well, most of the old team got back together for a belated sequel — The Lost Files of Sherlock Holmes: Case of the Rose Tattoo — that no one would ever have dreamed they would be making a couple of years before.

But then, almost everything about the story of Lost Files is unlikely, from EA of all companies deciding to make it — or, perhaps better said, deciding to allow it to be made — to a bunch of first-time adventure developers managing to put everything together so much better than many established adventure-game specialists were doing at the time. And how incredibly lucky for everyone involved that such a Sherlock Holmes devotee as R.J. Berg should have been kicking around writing manuals for EA, just waiting for an opportunity like this one to show his chops. I’ve written about four Sherlock Holmes games now in the course of this long-running history of computer gaming — yet another measure of the character’s cultural ubiquity! — and this one nudges out Riddle of the Crown Jewels to become the best one yet. It just goes to show that, no matter how much one attempts to systematize the process, much of the art and craft of making games comes down to happy accidents.

(Sources: Compute! of April 1993 and June 1993; Computer Gaming World of February 1993; Questbusters of September 1988 and December 1992; Electronic Games of February 1993. Online sources include Elinor Mavor’s remembrances of the making of Lost Files of Sherlock Holmes, the comprehensive Game Nostalgia page on the game, the Stay Forever podcast episode devoted to the game, Joe Pranevich’s playthrough for The Adventure Gamer, the archived version of the old Mythos Software homepage, and Jason Scott’s “Infocom Cabinet” of vintage documents.

Feel free to download Lost Files of Sherlock Holmes from right here, in a format designed to be as easy as possible to get running under your platform’s version of DOSBox or using ScummVM.)


  1. There have been occasional questions about the extent to which Sherlock Holmes and his supporting cast truly are outside all bounds of copyright, usually predicated on the fact that the final dozen stories were published in the 1920s, the beginning of the modern copyright era, and thus remain protected. R.J. Berg remembers giving “two copies of the game and a really trivial amount of money” to Arthur Conan Doyle’s aged granddaughter, just to head off any trouble on that front. When a sequel to Lost Files of Sherlock Holmes was published in 1996, no permission whatsoever was sought or demanded. 

  2. Scott Mavor died of cancer in 2008. 

 
 


Whither the Software Artist? (or, How Trip Hawkins Learned to Stop Worrying and Love the Consoles)

One of the places we ran the “Can a computer make you cry?” [advertisement] was in Scientific American. Scientific American readers weren’t even playing videogames. Why the hell are you wasting any of this really expensive advertising? You’re competing with BMW for that ad.

— Trip Hawkins (EA Employee #1)

Consumers were looking for a brand signal for quality. They didn’t lionize the game makers as these creators to fawn over. They thought of the game makers almost as collaborators in their experience. So apotheosizing didn’t make sense to the consumers.

— Bing Gordon (EA Employee #7)

In the ’80s that was an interesting experiment, that whole trying-to-make-them-into-rock-stars kind of thing. It was certainly a nice way to recruit top talent. But the reality is that computer programmers and artists and designers are not rock stars. It may have worked for the developers, but I don’t think it had any impact on consumers.

— Stewart Bonn (EA Employee #19)

One of the stories that gamers most love to tell each other is that of Electronic Arts’s fall from grace. If you’re sufficiently interested in gaming history to be reading this blog, you almost certainly know the story in the broad strokes: how Trip Hawkins founded EA in 1982 as a haven for “software artists” doing cutting-edge work; how he put said artists front and center in rock-star-like poses in a series of iconic advertisements, the most famous of which asked whether a computer could make you cry; how he wrote on the back of every stylish EA “album cover” not about EA as a company but as “a collection of electronic artists who share a common goal to fulfill the potential of personal computing”; and how all the idealism somehow dissipated to give us the EA of today, a shambling behemoth that crushes more clever competitors under its sheer weight as it churns out sequel after sequel, retread after retread. The exact point where EA became the personification of everything retrograde and corporate in gaming varies with the teller; perhaps the closest thing to a popular consensus is the rise of John Madden Football and EA Sports in the early 1990s, when the last vestiges of software artistry in the company’s advertisements were replaced by jocks shouting, “It’s in the game!” Regardless of the specifics, though, everyone agrees that It All Went Horribly Wrong at some point. The story of EA has become gamers’ version of a Biblical tragedy: “For what shall it profit a man, if he shall gain the whole world, and lose his own soul?”

Of course, as soon as one starts pulling out Bible quotes, it profits to ask whether one has gone too far. And, indeed, the story of EA is often over-dramatized and over-simplified. Questions of authenticity and creativity are always fraught; to imagine that anyone is really in the arts just for the art strikes me as hopelessly naive. The EA of the early 1980s wasn’t founded by artists but rather by businessmen, backed by venture capitalists with goals of their own that had little to do with “fulfilling the potential of personal computing.” Thus, when the software-artists angle turned out not to work so well, it didn’t take them long to pivot. This, then, is the history of that pivot, and how it led to the EA we know today.


Advertising is all about image making — about making others see you in the light in which you wish to be seen. Without realizing that they were doing anything of the sort, EA’s earliest marketers cemented an image into the historical imagination at the same time that they failed in their more practical task of crafting a message that resonated with the hoped-for customers of their own time. The very same early EA advertising campaign which speaks so eloquently to so many today actually missed the mark entirely in its own day, utterly failing to set the public imagination afire with this idea of programmers and game designers as rock stars. When Trip Hawkins sent Bill Budge — the programmer of his who most naturally resembled a rock star — on an autograph-signing tour of software stores and shopping malls, it didn’t lead to any outbreak of Budgomania. “Nobody would ever show up,” remembers Budge today, still wincing at the embarrassment of sitting behind a deserted autograph booth.

Nor were customers flocking into stores to buy the games EA’s rock stars had created. Sales remained far below initial projections during the eighteen months following EA’s official launch in June of 1983, and the company skated on the razor’s edge of bankruptcy on multiple occasions. While their first year yielded the substantial hits Pinball Construction Set, Archon, and One-on-One, 1984 could boast only one comparable success story, Seven Cities of Gold. Granted, four hits in two years was more than plenty of other publishers managed, but EA had been capitalized under the expectation that their games would open up whole new demographics for entertainment software. “The idea was to make games for 28-year-olds when everybody else was making games for 13-year-olds,” says Bing Gordon, Trip Hawkins’s old university roommate and right-hand man at EA. When those 28-year-olds failed to materialize, EA was left in the lurch.

For better or for worse, One-on-One is the spiritual forefather of the unstoppable EA Sports lineup of today.

The most important architect of EA’s post-launch retrenchment was arguably neither Trip Hawkins nor Bing Gordon, but rather Larry Probst, who left the free-falling Activision to join EA as vice president for sales in 1984. Probst, who had worked at the packaged-goods giants Johnson & Johnson and Clorox before joining Activision, had no particular attachment to the idea of software artists. He rather looked at the business of selling games much as he had that of selling toilet paper and bleach. He asked himself how EA could best make money in the market that existed rather than some fanciful new one they hoped to create. Steve Peterson, a product manager at EA, remembers that others “would still talk about how we were trying to create new forms of entertainment and break new boundaries.” But Probst, and increasingly Trip Hawkins as well, had the less high-minded goal of “going public and being a billion-dollar company.”

Probst had the key insight that distribution, more so than software artists or perhaps even product quality in the abstract, was the key to success in an industry that, following a major downturn in home computing in general in 1984, was only continuing to get more competitive. EA therefore spurned the existing distribution channels, which were nearly monopolized by SoftSel, the great behind-the-scenes power in the software industry to which everyone else was kowtowing; SoftSel’s head, Robert Leff, was the most important person in software that no one outside the industry had ever heard of. Instead of using SoftSel, EA set up their own distribution network piece by painful piece, beginning by cold-calling the individual stores and offering cut-rate deals in order to tempt them into risking the wrath of Leff and ordering from another source.

Then, once a reasonable distribution network was in place, EA leveraged the hell out of it by setting up a program of so-called “Affiliated Labels” — other publishers who would pay EA instead of a conventional distributor like SoftSel to get their products onto store shelves. It was a well-nigh revolutionary idea in game publishing, attractive to smaller publishers because EA was ready and able to help out with a whole range of the logistical difficulties they were always facing, from packaging and disk duplication to advertising campaigns. For EA, meanwhile, the Affiliated Labels yielded huge financial rewards and placed them in the driver’s seat of much of the industry, with the power of life and death over many of their smaller ostensible competitors.

Unsurprisingly, Activision, the only other publisher with comparable distributional clout, soon copied the idea, setting up a similar program of their own. But even as they did so, EA, seemingly always one step ahead, was becoming the first American publisher to send games — both their own and those of others — directly to Europe without going through a European intermediary like Britain’s U.S. Gold label.

There was always something a bit contrived, in that indelible Silicon Valley way, about how EA chose to present themselves to the world. Here we have Bing Gordon, head of technology Greg Riker, and producer Joe Ybarra indulging in some of the creative play which, an accompanying article is at pains to tell us, was constantly going on around the office.

Larry Probst’s strategy of distribution über alles worked a treat, yielding explosive growth that more than made up for the company’s early struggles. In 1986, EA became the biggest computer-game publisher in the United States and the world, with annual revenues of $30 million. Their own games were doing well, but were assuming a very different character from the “simple, hot, and deep” ideal of the launch — a phrase Trip Hawkins had once loved to apply to games that were less stereotypically nerdy than the norm, that he imagined would be suitable for busy young adults with a finger on the pulse of hip pop culture. Now, having failed to attract that new demographic, EA adjusted their product line to appeal to those who were already buying computer games. A case in point was The Bard’s Tale, EA’s biggest hit of 1985, a hardcore CRPG that might take a hundred hours or more to complete — fodder for 13-year-olds with long summer vacations to fill rather than 28-year-olds with jobs and busy social calendars.

If “simple, hot, and deep” and programmers as rock stars had been two of the three pillars of EA’s launch philosophy, the last was the one written into Hawkins’s original mission statement as “stay with floppy-disk-based computers only.” Said statement had been written, we should remember, just as the first great videogame fad, fueled by the Atari VCS, was passing its peak and beginning the long plunge into what would go down in history as the Great Videogame Crash of 1983. At the time, it certainly wasn’t only the new EA who believed that the toy-like videogame consoles were the past, and that more sophisticated personal computers, running more sophisticated games, were the future. “I think that computer games are fundamentally different from videogames,” said Hawkins on the Computer Chronicles television show. “It becomes a question of program size, when you want to know how good a program can I have, how much can I do with it, and how long will it take before I’m bored with it.” This third pillar of EA’s strategy would take a bit longer to fall than the others, but fall it would.

The origins of EA’s loss of faith in the home computer in general as the ultimate winner of the interactive-entertainment platform wars can ironically be traced to their decision to wholeheartedly endorse one computer in particular. In October of 1984, Greg Riker, EA’s director of technology, got the chance to evaluate a prototype of Commodore’s upcoming Amiga. His verdict upon witnessing this first truly multimedia personal computer, with its superlative graphics and sound, was that this was the machine that could change everything, and that EA simply had to get involved with it as quickly as possible. He convinced Trip Hawkins of his point of view, and Hawkins managed to secure Amiga Prototype Number 12 for the company within weeks. In the months that followed, EA worked to advance the Amiga with if anything even more enthusiasm than Commodore themselves: developing libraries and programming frameworks which they shared with their outside developers; writing tools internally, including what would become the Amiga’s killer app, Deluxe Paint; documenting the Interchange File Format, a set of standard specifications for sharing pictures, sounds, animations, and music across applications. All of these things and more would remain a part of the Amiga platform’s basic software ecosystem throughout its existence.

When the Amiga finally started shipping late in 1985, EA actually made a far better public case for the machine than Commodore, taking out a splashy editorial-style advertisement just inside the cover of the premiere issue of the new AmigaWorld magazine. It showed the eight Amiga games EA would soon release and explained “why Electronic Arts is committed to the Amiga,” the latter headline appearing above a photograph of Trip Hawkins with his arm proprietorially draped over the Amiga on his desk.

Trip Hawkins with an Amiga

But it all turned into an immense disappointment. Initially, Commodore priced the Amiga wrong and marketed it worse, and even after they corrected some of their worst mistakes it perpetually under-performed in the American marketplace. For Hawkins and EA, the whole episode planted the first seeds of doubt as to whether home computers — which at the end of the day still were computers, requiring a degree of knowledge to operate and associated in the minds of most people more with work than pleasure — could really be the future of interactive entertainment as a mass-media enterprise. If a computer as magnificent as the Amiga couldn’t conquer the world, what would it take?

Perhaps it would take a piece of true consumer electronics, made by a company used to selling televisions and stereos to customers who expected to be able to just turn the things on and enjoy them — a company like, say, Philips, who were working on a new multimedia set-top box for the living room that they called CD-I. The name arose from the fact that it used the magical new technology of CD-ROM for storage, something EA had been begging Commodore to bring to the Amiga to no avail. EA embraced CD-I with the same enthusiasm they had recently shown for the Amiga, placing Greg Riker in personal charge of creating tools and techniques for programming it, working more as partners in CD-I’s development with Philips than as a mere third-party publisher.

Once again, however, it all came to nought. CD-I turned into one of the most notorious slow-motion fiascos in the history of the games industry, missing its originally planned release date in the fall of 1987 and then remaining vaporware for years on end. In early 1989, EA finally ran out of patience, mothballing all work on the platform unless and until it became a viable product; Greg Riker left the company to go work for Microsoft on their own CD-ROM research.

CD-I had cost EA a lot of money to no tangible result whatsoever, but it does reveal that the idea of gaming on something other than a conventional computer was no longer anathema to them. In fact, the year in which EA gave up on CD-I would prove the most pivotal of their entire history. We should therefore pause here to examine their position in 1989 in a bit more detail.

Despite the frustrating failure of the Amiga and CD-I to open a new golden age of interactive entertainment, EA wasn’t doing badly at all. Following years of steady growth, annual revenue had now reached $63 million, up 27 percent from 1988. EA was actively distributing about 100 titles under their own imprint, and 250 more under the imprint of the various Affiliated Labels, who had become absolutely key to their business model, accounting for some 45 percent of their total revenues. About 80 percent of their revenues still came from the United States, with 15 percent coming from Europe — where EA had set up a semi-independent subsidiary, the Langley, England-based EA Europe, in 1987 — and the remainder from the rest of the world. The company was extremely diversified. They were producing software for ten different computing platforms worldwide, had released 40 separate titles that had earned them at least $1 million each, and had no single title that accounted for more than 6 percent of their total revenues.

What we have here, then, is a very healthy business indeed, with multiple revenue streams and cash in the bank. The games they released were sometimes good, sometimes bad, sometimes mediocre; EA’s quality standards weren’t notably better or worse than the rest of their industry. “We tried to create a brand that fell somewhere between Honda and Mercedes,” admits Bing Gordon, “but a lot of the time we shipped Chevy.” Truth be told, even in the earliest days the rhetoric surrounding EA’s software artists had been a little overblown; many of the games their rock stars came up with were far less innovative than the advertising that accompanied them. The genius of Larry Probst had been to explicitly recognize that success or failure as a games publisher had as much to do with other factors as it did with the actual games you released.

For all their success, though, no one at EA was feeling particularly satisfied with their position. On the contrary: 1989 would go down in EA’s history as the year of “crisis.” As successful as they had become selling home-computer software, they remained big fish in a rather small pond, a situation out of keeping with the sense of overweening ambition that had been a part of the company’s DNA since its founding. In 1989, about 4 million computers were being used to play games on a regular or semi-regular basis in American homes, enough to fuel a computer-game industry worth an estimated $230 million per year. EA alone owned more than 25 percent of that market, more than any competitor. But there was another, related market in which they had no presence at all: that of the videogame consoles, which had returned from the dead to haunt them even as they were consolidating their position as the biggest force in computer games. The country was in the grip of Nintendo mania. About 22 million Nintendo Entertainment Systems were already in American homes — a figure accounting for 24 percent of all American households — and cartridge-based videogames were selling to the tune of $1.6 billion per year.

Unlike many of their peers, EA hadn’t yet suffered all that badly under the Nintendo onslaught, largely because they had already diversified away from the Commodore 64, the low-end 8-bit computer which had been the largest gaming platform in the world just a couple of years before, and which the NES was now in the process of annihilating. But still, the future of the computer-games industry in general felt suddenly in doubt in a way that it hadn’t since at least the great home-computer downturn of 1984. A sizable coalition inside EA, including Larry Probst and most of the board of directors, pushed Trip Hawkins hard to get EA’s games onto the consoles. Fearing a coup, he finally came around. “We had to go into the [console-based] videogame business, and that meant the world of mass-market,” Hawkins remembers. “There were millions of customers we were going to reach.”

But through which door should they make their entrance? Accustomed to running roughshod over his Affiliated Labels, Hawkins wasn’t excited about the prospect of entering Nintendo’s walled garden, where the shoe would be on the other foot, thanks to that company’s infamously draconian rules for its licensees. Nintendo’s standard contract demanded that they receive the first $12 from every game a licensee sold, required every game to go through an exhaustive review process before publication, and placed strict limits on how many games a licensee was allowed to publish per year and how many units they were allowed to manufacture of each one. For EA, accustomed to being the baddest hombre in the Wild West that was the computer-game marketplace, this was well-nigh intolerable. Bing Gordon insists even today that, thanks to all of the fees and restrictions, no one other than Nintendo was doing much more than breaking even on the NES during this, the period that would go down in history as the platform’s golden age.

So, EA decided instead to back a dark horse: the much more modern Sega Genesis, which hadn’t even been released yet in North America. It was built around the same 16-bit Motorola 68000 CPU found in computers like the Commodore Amiga and Apple Macintosh, with audiovisual capabilities not all that far removed from the likes of the Amiga. The Genesis would give designers and programmers who were used to the affordances of full-fledged computers a far less limiting platform than the NES to work with, and it offered the opportunity to get in on the ground floor of a brand-new market, as opposed to the saturated NES platform. The only problem was that Sega’s licensing fees were comparable to those of Nintendo, even though they could only offer their licensees access to a much more uncertain pool of customers.

Determined to play hardball, Hawkins had a team of engineers reverse-engineer the Genesis, sufficient to let them write games for it with or without Sega’s official development kit. Then he met with Sega again, telling them that, if they refused to adjust their licensing terms, he would release games on the console without their blessing, forcing them to initiate an ugly court battle of the sort that was currently raging between Nintendo and Atari if they wished to bring him to heel. That, he was gambling, was expense and publicity of a sort which Sega simply couldn’t afford. And Sega evidently agreed with his assessment; they accepted a royalty rate half that being demanded by Nintendo. By this roundabout method, EA became the first major American publisher to support the new console, and from that point forward the two companies became, as Hawkins puts it, “good partners.”

EA initially invested $2.5 million in ten games for the Genesis, some of them original to the console, some ports of their more popular computer games. They started shipping the first of them in June of 1990, ten months after the Genesis itself had first gone on sale in the United States. This first slate of EA Genesis titles arrived in a marketplace that was still starving for quality games, just as Hawkins had envisioned it would be. Among them was the game destined to become the face of the new, mass-market-oriented EA: John Madden Football, a more action-oriented re-imagining of a 1988 computer game of the same name.

John Madden Football debuted as a rather cerebral, tactics-heavy computer game in 1988, just another in an EA tradition of famous-athlete-endorsed sports games stretching back to 1983’s (Dr. J and Larry Bird Go) One-on-One. No one in 1988 could have imagined what it would come to mean in the years to come for either its publisher or its spokesman/mascot, both of whom would ride it to iconic heights in American pop culture.

The Sega Genesis marked the third time EA had taken a leap of faith on a new platform. It was the first time, however, that their faith paid off. About 25 percent of the games EA sold in 1990 were for the Genesis. And when the console really started to take off in 1991, fueled not least by their own games, EA was there to reap the rewards. In that year, four of the ten best-selling Genesis games were published by EA. At the peak of their dominance, EA alone was publishing about 35 percent of all the games sold for the Genesis. Absent the boost their games gave it early on, it’s highly questionable whether the Genesis would have succeeded at all in the United States.

In the beginning, few of EA’s outside developers had been terribly excited about writing for the consoles. One of them remembers Hawkins “reading us the riot act” just to get them onboard. Indeed, Hawkins claims today that about 15 percent of EA’s internal employees were so unhappy with the new direction that they quit. Certainly his latest rhetoric could hardly have been more different from that of 1983:

I knew we had to let go of our attachment to machines that the public did not want to buy, and support the hardware that the public would embrace. I made this argument on the grounds of delivering customer satisfaction, and how quality is in the eye of the beholder. If the customer buys a Genesis, we want to give him the best we can for the machine he bought and not resent the consumer for not buying a $1000 computer.

By this point, Hawkins had finally bitten the bullet and done a deal with Nintendo, who, in the face of multiple government investigations and lawsuits over their business practices, were becoming somewhat more generous with both their competitors and licensees. When games like Skate or Die, a port of a Commodore 64 hit that just happened to be perfect for the Nintendo and Sega demographics as well, started to sell in serious numbers on the consoles, Hawkins’s developers’ aversion started to fade in the face of all that filthy lucre. Soon the developers of Skate or Die were happily plunging into a sequel which would be a console exclusive.

Even the much-dreaded oversight role played by Nintendo, in which they reviewed every game before allowing it to be published, proved less onerous than expected. When Will Harvey, the designer of an action-adventure called The Immortal, finally steeled himself to look at Nintendo’s critique thereof, he was happily surprised to find the list of “suggestions” to be very helpful on the whole, demonstrating real sensitivity to the effect he was trying to achieve. Even Bing Gordon, who had been highly skeptical of getting into bed with Nintendo, had to admit in the end that “the rating system is fair. On a scale from zero to a hundred, where zero meant the system was totally manipulated for Nintendo’s self-interest and a hundred meant that it was absolutely democratic, they’d probably get a ninety. I’ve seen a little bit of self-interest, but this is America, the land of self-interest.”

Although EA cut their Nintendo teeth on the NES, it was on the long-awaited follow-up console, 1991’s Super Nintendo, that they really began to thrive. That machine boasted capabilities similar to those of the Sega Genesis, meaning EA already had games ready to port over, along with developers with considerable expertise in writing for a more advanced species of console. Just in time for the Christmas of 1991, EA released a new version of John Madden Football — John Madden Football ’92 — simultaneously on the Super Nintendo and the Genesis. The sequel had been created, according to the recollections of several EA executives, against the advice of market researchers and retailers: “All you’re going to do is obsolete our old game.” But Trip Hawkins remembered how much, as a kid, he had loved the Strat-O-Matic Football board game, for which a new set of player and team cards was issued every year just before the beginning of football season, ensuring that you could always recreate in the board game the very same season you were watching every Sunday on television. So, he ignored the objections of the researchers and the retailers, and John Madden Football ’92 became an enormous hit, by far the biggest EA had yet enjoyed on any platform — thus inaugurating, for better or for worse, the tradition of annual versions of gaming’s most evergreen franchise. Like clockwork, we’ve gotten a new Madden every single year since, a span of time that numbers a quarter-century and change as of this writing.

All of this had a transformative effect on EA’s bottom line, bringing on their biggest growth spurt yet. Revenues increased from $78 million in 1990 to $113 million in 1991; then they jumped to $175 million in 1992, accompanied by a two-for-one stock split that was necessary to keep the share price, which had been at $10 just a few years before, from exceeding $50. In that year, six of the fifteen most popular console games, across all platforms, were published by EA. Their Sega Genesis games alone generated $77 million, 18 percent more than the entirety of the company’s product portfolio had managed in 1989. This was also the first year that EA’s console games in the aggregate outsold their offerings for computers. They were leaving no doubt now as to where their primary loyalty lay: “The 16-bit consoles are far better for games than PCs. The Genesis is a very sophisticated machine…” The disparity between the two sides of the company’s business would only continue to get more pronounced, as EA’s sales jumped by an extraordinary 70 percent — to $298 million — in 1993, a spurt fueled entirely by console-game sales.

But, despite all their success on the consoles, EA — and especially their founder, Trip Hawkins — continued to chafe under the restrictions of the walled-garden model of software distribution. Accordingly, Hawkins put together a group inside EA to research the potential for a CD-ROM-based multimedia set-top box of their own, one that would be used for more than just playing games — sort of a CD-I done right. “The Japanese videogame companies,” he said, “are too shortsighted to see where this is going.” In contrast to their walled gardens, his box would be as open as possible. Rather than a single new hardware product, it would be a set of hardware specifications and an operating system which manufacturers could license, which would hopefully result in a situation similar to the MS-DOS marketplace, where lots of companies competed and innovated within the bounds of an established standard. The marketplace for both games and applications on the new machine would be far less restricted than the console norm, with a more laissez-faire attitude to content and a royalty fee of just $3 per unit sold.

In 1991, EA spun off the venture under the name of 3DO. Hawkins turned most of his day-to-day responsibilities at EA over to Larry Probst in order to take personal charge of his new baby, which took tangible form for the first time with the release of the Panasonic “Real 3DO Player” in late 1993. It and other implementations of the 3DO technology managed to sell 500,000 units worldwide — 200,000 of them in North America — by January of 1995. Yet those numbers were still a pittance next to those of the dedicated game consoles, and the story of 3DO became one of constant flirtations with success that never quite led to that elusive breakthrough moment. As 3DO struggled, Hawkins’s relations with his old company worsened. He believed they had gone back on promises to support his new venture wholeheartedly; “I didn’t feel like I was leaving EA, but it turned out that way,” he says today with lingering bitterness. The long, frustrating saga of 3DO wouldn’t finally straggle to a bankruptcy until 2003.

EA, meanwhile, was flying ever higher absent their founder. Under Larry Probst — always the most hard-nosed and sober-minded of the executive staff, the person most laser-focused on the actual business of selling videogames — EA cemented their reputation as the conservative, risk-averse giant of their industry. This new EA was seemingly the polar opposite of the company that had once asked with almost painful earnestness if a computer could make you cry. And yet, paradoxically, it was a place still inhabited by a surprising number of the people who had come up with that message. Most prominent among them was Bing Gordon, who notes cryptically today only that “people’s ideals get tested in the face of love or money.” Part of the problem — assuming one judges EA’s current less-than-boldly-innovative lineup of franchises to be a problem — may be a simple buildup of creative cruft that has resulted from being in business for so long. Every franchise that debuts in inspiration and innovation, then goes on to join John Madden Football on the list of EA perennials, sucks some of the bandwidth away that might otherwise have been devoted to the next big innovator.

In the summer of 1987, when EA was still straddling the line between their old personality and their new, Trip Hawkins wrote the following lines in their official newsletter — lines which evince the keenly felt tension between art and commerce that has become the defining aspect of EA’s corporate history for so many in the years since:

Unfortunately, simply being creative doesn’t always mean you’ll be wildly successful. Van Gogh sold only one painting during his lifetime. Lots of people would still rather go see Porky’s Revenge IV, ignoring well-produced movies like Amadeus or Chariots of Fire. As a result, film producers take fewer risks, and we get less variety, and pretty soon the Porky’s and Rambo clones are all you can find on a Friday night. Software developers have the same problem. (To this day, all of us M.U.L.E. fans wonder why the entire world hasn’t fallen in love with our favorite game.)

The only way to solve the problem is to do it together. On our end, we’ll keep innovating, researching, experimenting with new ways to use this new medium; on your end, you can support our efforts by taking an occasional risk, by buying something new and different… maybe Robot Rascals, or Make Your Own Murder Party.

You may be very pleasantly surprised — and you’ll help our software artists live to innovate another day.

Did EA go the direction they did because of gamers’ collective failure to support their most innovative, experimental work? Does it even matter if so? The more pragmatic among us might note that the EA of today is delivering games that millions upon millions of people clearly want to play, and where’s the harm in that?

Still, as we look upon this industry that has so steadfastly refused to grow up in so many ways, there remain always those pictures of EA’s first generation of software artists — pictures that, yes, are a little pretentious and a lot contrived, but that nevertheless beckon us to pursue higher ideals. They’ve taken on an identity of their own now, quite apart from the history of the company that once splashed them across the pages of glossy lifestyle magazines. Long may they continue to inspire.

(Sources: the book Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay and Game Over: How Nintendo Conquered the World by David Sheff; Harvard Business School’s case study “Electronic Arts in 1995”; ACE of April 1990; Amazing Computing of July 1992; Computer Gaming World of March 1988, October 1988, and June 1989; MicroTimes of April 1986; The One of November 1988; Electronic Arts’s newsletter Farther from Summer 1987; AmigaWorld premiere issue; materials relating to the Software Publishers Association included in the Brøderbund archive at the Strong Museum of Play; the episode of the Computer Chronicles television series entitled “Computer Games.” Online sources include “We See Farther — A History of Electronic Arts” at Gamasutra, “How Electronic Arts Lost Its Soul” at Polygon, and Funding Universe‘s history of Electronic Arts.)

 
 


Peter Molyneux’s Kingdom in a Box


Peter Molyneux, circa 1990.

I have this idea of a living world, which I have never achieved. It’s based upon this picture in my head, and I can see what it’s like to play that game. Every time I do it, then it maybe gets closer to that ideal. But it’s an ambitious thing.

— Peter Molyneux

One day as a young boy, Peter Molyneux stumbled upon an ant hill. He promptly did what young boys do in such situations: he poked it with a stick, watching the inhabitants scramble around as destruction rained down from above. But then, Molyneux did something that set him apart from most young boys. Feeling curious and maybe a little guilty, he gave the ants some sugar for energy and watched quietly as they methodically undid the damage to their home. Just like that, he woke up to the idea of little living worlds with lots of little living inhabitants — and to the idea of himself, the outsider, being able to affect the lives of those inhabitants. The blueprint had been laid for one of the most prominent and influential careers in the history of game design. “I have always found this an interesting mechanic, the idea that you influence the game as opposed to controlling the game,” he would say years later. “Also, the idea that the game can continue without you.” When Molyneux finally grew bored and walked away from the ant hill on that summer day in his childhood, it presumably did just that, the acts of God that had nearly destroyed it quickly forgotten. Earth — and ants — abide.

Peter Molyneux was born in the Surrey town of Guildford (also hometown of, read into it what you will, Ford Prefect) in 1959, the son of an oil-company executive and a toy-shop proprietor. To hear him tell it, he was qualified for a career in computer programming largely by virtue of being so hopeless at everything else. Being dyslexic, he found reading and writing extremely difficult, a handicap that played havoc with his marks at Bearwood College, the boarding school in the English county of Berkshire to which his family sent him for most of his teenage years. Meanwhile his less than imposing physique boded ill for a career in the military or manual labor. Thankfully, near the end of his time at Bearwood the mathematics department acquired a Commodore PET,  while the student union almost simultaneously installed a Space Invaders machine. Seeing a correspondence between these two pieces of technology that eluded his fellow students, Molyneux set about trying to program his own Space Invaders on the PET, using crude character glyphs to represent the graphics that the PET, being a text-only machine, couldn’t actually draw. No matter. A programmer had been born.

These events, followed shortly by Molyneux’s departure from Bearwood to face the daunting prospect of the adult world, were happening at the tail end of the 1970s. Like so many of the people I’ve profiled on this blog, Molyneux was thus fortunate enough to be born not only into a place and circumstances that would permit a career in games, but at seemingly the perfect instant to get in on the ground floor as well. But, surprisingly for a fellow who would come to wear his huge passion for the medium on his sleeve — often almost as much to the detriment as to the benefit of his games and his professional life — Molyneux took a meandering path, requiring fully another decade to rise to prominence in the field. Or, to put it less kindly: he failed, repeatedly and comprehensively, at every venture he tried for most of the 1980s before he finally found the one that clicked.

Perhaps inspired by his mother’s toy shop, his original dream was to be not so much a game designer as a computer entrepreneur. After earning a degree in computer science from Southampton University, he found himself a job working days as a systems analyst for a big company. By night, he formed a very small company called Vulcan in his hometown of Guildford to implement a novel scheme for selling blank disks. He wrote several simple programs: a music creator, some mathematics drills, a business simulator, a spelling quiz. (The last, having been created by a dyslexic and terrible speller in general, was a bit of a disaster.) For every ten disks you bought for £10, you would get one of the programs for free along with your blank disks. After placing his tiny advertisement in a single magazine, Molyneux was so confident of the results that he told his local post office to prepare for a deluge of mail, and bought a bigger mailbox for his house to hold it all. He got five orders in the first ten days, less than fifty in the scheme’s total lifespan — along with about fifty more inquiries from people who had no interest in the blank disks but just wanted to buy his software.

Taking their interest to heart, Molyneux embarked on Scheme #2. He improved the music creator and the business simulator and tried to sell them as products in their own right. Even years later he would remain proud of the latter in particular — his first original game, which he named Entrepreneur: “I really put loads of features into it. You ran a business and you could produce anything you liked. You had to do things like keep the manufacturing line going, set the price for your product, decide what advertising you wanted, and these random events would happen.” With contests all the rage in British games at the time, he offered £100 to the first person to make £1 million in Entrepreneur. The prize went unclaimed; the game sold exactly two copies despite being released near the zenith of the early-1980s British mania for home computers. “Everybody around me was making an absolute fortune,” Molyneux remembers. “You had to be a complete imbecile in those days not to make a fortune. Yet here I was with Entrepreneur and Composer, making nothing.” He wasn’t, it appeared, very good at playing his own game of entrepreneurship; his own £1 million remained far out of reach. Nevertheless, he moved on to the next scheme.

Scheme #3 was to crack the business and personal-productivity markets via a new venture called Taurus, initiated by Molyneux and his friend Les Edgar, who were later joined by one Kevin Donkin. Molyneux having studied accounting at one time in preparation for a possible career in the field (“the figures would look so messy that no one would ever employ me”), it was decided that Taurus would initially specialize in financial software with exciting names like Taurus Accounts, Taurus Invoicing, and Taurus Stock Control. Those products, like all the others Molyneux had created, went nowhere. But now came a bizarre story of mistaken identity that… well, it wouldn’t make Molyneux a prominent game designer just yet, but it would move him further down the road to that destination.

Commodore was about to launch the Amiga in Britain, and, this being early on when they still saw it as potential competition for the IBMs of the world, was looking to convince makers of productivity software to write for the machine.  They called up insignificant little Taurus of all people to request a meeting to discuss porting the “new software” the latter had in the works to the Amiga. Molyneux and Edgar assumed Commodore must have somehow gotten wind of a database program they were working on. In a state of no small excitement, they showed up at Commodore UK’s headquarters on the big day and met a representative. Molyneux:

He kept talking about “the product,” and I thought they were talking about the database. At the end of the meeting, they say, “We’re really looking forward to getting your network running on the Amiga.” And it suddenly dawned on me that this guy didn’t know who we were. Now, we were called Taurus, as in the star sign. He thought we were Torus, a company that produced networking systems. I suddenly had this crisis of conscience. I thought, “If this guy finds out, there go my free computers down the drain.” So I just shook his hand and ran out of that office.


An appropriately businesslike advertisement for Taurus’s database manager gives no hint of what actually lies in the company’s future…

By the time Commodore figured out they had made a terrible mistake, Taurus had already been signed as official Amiga developers and given five free Amigas. They parlayed those things into a two-year career as makers of somewhat higher-profile but still less than financially successful productivity software for the Amiga. After the database, which they named Acquisition and declared “the most complete database system conceived on any microcomputer” — Peter Molyneux’s habit of over-promising, which gamers would come to know all too well, was already in evidence — they started on a computer-aided-design package called X-CAD Designer. Selling in the United States for the optimistic prices of $300 and $500 respectively, both programs got lukewarm reviews; they were judged powerful but kind of incomprehensible to actually use. But even had the reviews been better, high-priced productivity software was always going to be a hard sell on the Amiga. There were just three places to really make money in Amiga software: in personal-creativity software like paint programs, in video-production tools, and, most of all, in games. In spite of all of Commodore’s earnest efforts to the contrary, the Amiga had by now become known first and foremost as the world’s greatest gaming computer.

The inspiration for the name of Bullfrog Software.


Molyneux and his colleagues therefore began to wind down their efforts in productivity software in favor of a new identity. They renamed their company Bullfrog after a ceramic figurine they had lying around in the “squalor” of what Molyneux describes as their “absolutely shite” office in a Guildford pensioner’s attic. Under the new name, they planned to specialize in games — Scheme #4 for Peter Molyneux. “We had a simple choice of hitting our head against a brick wall with business software,” he remembers, “or doing what I really wanted to do with my life anyway, which was write games.” Having made the choice to turn Bullfrog into a game developer, they nevertheless released as their first actual product not a game but a simple drum sequencer for the Amiga called A-Drum. Hobgoblins and little minds and all the rest. When A-Drum duly flopped, they finally got around to games.

A friend of Molyneux’s had written a budget-priced action-adventure for the Commodore 64 called Druid II: Enlightenment, and was looking for someone to do an Amiga conversion. Bullfrog jumped at the chance, even though Molyneux, who would always persist in describing himself as a “rubbish” programmer, had very little idea how to program an action game. When asked by Enlightenment’s publisher Firebird whether he could do the game in one frame — i.e., whether he could update everything onscreen within a single pass of the electron gun painting the screen to maintain the impression of smooth, fluid movement — an overeager Molyneux replied, “Are you kidding me? I can do it in ten frames!” It wasn’t quite the answer Firebird was looking for. But in spite of it all, Bullfrog somehow got the job, producing what Molyneux describes as a “technically rather poor” port of what had been a rather middling game in the first place. (Molyneux’s technique for getting everything drawn in one frame was to simply keep shrinking the size of the display until even his inefficient routines could do the job.) And then, as usual for everything Molyneux touched, it flopped. But Bullfrog did get two important things out of the project: they learned much about game programming, and they recruited as artist for the project one Glenn Corpes, who was not only a talented pixel pusher but also a talented programmer and fount of ideas almost the equal of Molyneux.

Despite the promising addition of Corpes, the first original game conjured up by the slowly expanding Bullfrog fared little better than Enlightenment. Corpes and Kevin Donkin turned out a very of-its-time top-down shoot-em-up called Fusion, which Electronic Arts agreed to release. Dismissed as “a mixture of old ideas presented in a very unexciting manner” by reviewers, Fusion was even less impressive technically than had been the Enlightenment port, being plagued by clashing colors and jittery scrolling — not at all the sort of thing to impress the notoriously audiovisually-obsessed Amiga market. Thus Fusion flopped as well, keeping Molyneux’s long record of futility intact. But then, unexpectedly from this group who’d shown so little sign of ever rising above mediocrity, came genius.

To describe Populous as a stroke of genius would be a misnomer. It was rather a game that grew slowly into its genius over a considerable period of time, a game that Molyneux himself considers more an exercise in evolution than conscious design. “It wasn’t an idea that suddenly went ‘Bang!'” he says. “It was an idea that grew and grew.” And its genesis had as much to do with Glenn Corpes as it did with Peter Molyneux.


Every Populous world is built out of combinations of just 56 blocks.

It all began when Corpes started showing off a routine he had written which let him build isometric landscapes out of three-dimensional blocks, like a virtual Lego set. You could move the viewpoint about the landscape, raising and lowering the land by left-clicking to add new blocks, right-clicking to remove them. Molyneux was immediately sure there was a game in there somewhere. His childhood memory of the ant farm leaping to mind, he said, “Let’s have a thousand people running around on it.”

Populous thus began with those little people in lieu of ants, wandering independently over Corpes’s isometric landscapes in real time. When they found a patch they liked, they would settle down, building little huts. Since, this being a computer game, the player would obviously need something to do as well, Molyneux started adding ways for you, as a sort of God on high, to influence the people’s behavior in indirect ways. He added something he called a “Papal Magnet,” a huge ankh you could place in the world to draw your people toward a given spot. But there would come a problem if the way to the ankh happened to be blocked by, say, a lake. Molyneux claims he added Populous’s most basic mechanic, the thing you spend by far the most time doing when playing the game, as a response to his “incompetence” as a coder and resulting inability to write a proper path-finding algorithm: when your people get stuck somewhere, you can, subject to your mana reserves — even gods have limits — raise or lower the land to help them out. With that innovation, Populous from the player’s perspective became largely an exercise in terraforming, creating smooth, even landscapes on which your people can build their huts, villages, and eventually castles. As your people become fruitful and multiply, their prayers fuel your mana reserves.
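The raise-and-lower verb at the heart of all this is easy to sketch in miniature. Treat the world as a grid of corner heights: raising one corner propagates outward so that adjacent corners never differ by more than one step, which is what produces the smooth, buildable slopes the game rewards, and a tile is buildable only when its four corners are level. The grid size, function names, and exact propagation rule below are illustrative assumptions, not Bullfrog’s actual code:

```python
# A toy Populous-style terraforming model (a sketch, not Bullfrog's code).
# The landscape is a grid of corner heights; raising a corner pushes up
# any neighbor left more than one step below, yielding smooth slopes.

SIZE = 8
heights = [[0] * SIZE for _ in range(SIZE)]

def raise_corner(x, y, grid):
    """Raise one corner, then recursively lift neighbors left too far below."""
    grid[y][x] += 1
    for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
        if 0 <= nx < SIZE and 0 <= ny < SIZE:
            if grid[y][x] - grid[ny][nx] > 1:
                raise_corner(nx, ny, grid)

def is_flat(x, y, grid):
    """A tile is buildable when all four of its corners share one height."""
    corners = {grid[y][x], grid[y][x + 1], grid[y + 1][x], grid[y + 1][x + 1]}
    return len(corners) == 1

# Two clicks on the same spot: the second forces the neighbors up too,
# leaving a little hill with one-step slopes on every side.
raise_corner(3, 3, heights)
raise_corner(3, 3, heights)
```

In the real game each alteration would also drain the player’s mana, and lowering land is just the mirror image of the same propagation; the point here is only how a single click can reshape a whole neighborhood of terrain.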

Next, Molyneux added warfare to the picture. Now you would be erecting mountains and lakes to protect your people from their enemies, who start out walking about independently on the other side of the world. The ultimate goal of the game, of course, is to use your people to wipe out your enemy’s people before they do the same to you; this is a very Old Testament sort of religious experience. To aid in that goal, Molyneux gradually added lots of other godly powers to your arsenal, more impressive than the mere raising and lowering of land if also far more expensive in terms of precious mana: flash floods, earthquakes, volcanic eruptions, etc. You know, all your standard acts of God, as found in the Bible and insurance claims.

Lego Populous. Bullfrog had so much fun with this implementation of the idea that they seriously discussed trying to turn it into a commercial board game.


Parts of Populous were prototyped on the tabletop. Bullfrog used Lego bricks to represent the landscapes, a handy way of implementing the raising-and-lowering mechanic in a physical space. They went so far as to discuss a license with Lego, only to be told that Lego didn’t support “violent games.” Molyneux admits that the board game, while playable, was very different from the computerized Populous, playing out as a slow-moving, chess-like exercise in strategy. The computer Populous, by contrast, can get as frantic as any action game, especially in the final phase when all the early- and mid-game maneuvering and feinting comes down to the inevitable final genocidal struggle between Good and Evil.


Bullfrog. From left: Glenn Corpes (artist and programmer), Shaun Cooper (artist and tester), Peter Molyneux (designer and programmer), Kevin Donkin (designer and programmer), Les Edgar (office manager), Andy Jones (artist and tester).

Ultimately far more important to the finished product than Bullfrog’s Lego Populous were the countless matches Molyneux played on the computer against Glenn Corpes. Apart from all of its other innovations in helping to invent the god-game and real-time-strategy genres, Populous was also a pioneering effort in online gaming. Multi-player games — the only way to play Populous for many months — took place between two people seated at two separate Amigas, connected together via modem or, if together in the same room as Molyneux and Corpes were, via a cable. Vanishingly few other designers were working in this space at the time, for understandable reasons: even leaving aside the fact that the majority of computer owners didn’t own modems, running a multi-player game in real time over a connection as slow as 1200 baud was no task for the faint-hearted. The fact that it works at all in Populous rather puts the lie to Molyneux’s self-deprecating description of himself as a “rubbish” coder.
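Nothing in the record here documents Bullfrog’s actual wire protocol, but the classic way to make real-time play feasible over a 1200-baud link is deterministic lockstep: each machine transmits only its own player’s inputs every tick, and both run identical simulations, so bandwidth stays tiny no matter how busy the game world gets. A toy sketch of the general technique (in-memory queues stand in for the modem link; none of this is Bullfrog’s actual code):

```python
from collections import deque

def demo_lockstep(ticks=3):
    """Two peers advance identical simulations by exchanging only inputs.

    The 'simulation' here is just a running sum; the point is that both
    sides apply both inputs in the same fixed order every tick, so their
    states never diverge despite never exchanging the state itself.
    """
    a_to_b, b_to_a = deque(), deque()        # stand-ins for the modem link
    advance = lambda s, ia, ib: s + ia + ib  # any deterministic step function
    state_a = state_b = 0
    for tick in range(1, ticks + 1):
        in_a, in_b = tick, tick * 10         # each side's local input
        a_to_b.append(in_a)                  # "transmit": a few bytes per tick
        b_to_a.append(in_b)
        state_a = advance(state_a, in_a, b_to_a.popleft())
        state_b = advance(state_b, a_to_b.popleft(), in_b)
    return state_a, state_b
```

Because only inputs cross the wire, the cost per tick is a handful of bytes regardless of how many little worshippers are milling about on screen — which is what made a 1200-baud connection survivable at all.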

You draw your people toward different parts of the map by placing the Papal Magnet. The first one to touch it becomes the leader. There are very few words in the game, which only made it that much easier for Electronic Arts to localize and popularize across Europe. Everything is instead done using the initially incomprehensible suite of icons you see near the bottom of the screen. Populous does become intuitive in time, but it’s not without a learning curve.

Development of Populous fell into a comfortable pattern. Molyneux and Corpes would play together for several hours every evening, then nip off to the pub to talk about their experiences. Next day, they’d tweak the game, then they’d go at it again. It’s here that we come to the beating heart of Molyneux’s description of Populous as a game evolved rather than designed. Almost everything in the finished game beyond the basic concept was added in response to Molyneux and Corpes’s daily wars. For instance, Molyneux initially added knights, super-powered individuals who can rampage through enemy territory and cause a great deal of havoc in a very short period of time, to prevent their games from devolving into endless stalemates. “A game could get to the point where both players had massive populations,” he says, “and there was just no way to win.” With knights, the stronger player “could go and massacre the other side and end the game at a stroke.”

A constant theme of all the tweaking was to make a more viscerally exciting game that played more quickly. For commercial as well as artistic reasons — Amiga owners weren’t particularly noted for their patience with slow-paced, cerebral games — this was considered a priority. Over the course of development, the length of the typical game Molyneux played with Corpes shrank from several hours to well under one.

Give them time, and your people will turn their primitive huts into castles.

Even tweaked to play quickly and violently, Populous was quite a departure from the tried-and-true Amiga fare of shoot-em-ups, platformers, and action-adventures. The unenviable task of trying to sell the thing to a publisher was given to Les Edgar. After visiting about a dozen publishers, he convinced Electronic Arts to take a chance on it. Bullfrog promised EA a finished Populous in time for Christmas 1988. By the time that deadline arrived, however, it was still an online multiplayer-only game, a prospect EA knew to be commercially untenable. Molyneux and his colleagues thus spent the next few months creating Populous‘s single-player “Conquest Mode.”

In addition to the green and pleasant land of the early levels, there are also worlds of snow and ice, desert worlds, and even worlds of fire and lava to conquer.

Perilously close to being an afterthought to the multi-player experience though it was, Conquest Mode would be the side of the game that the vast majority of its eventual players would come to know best if not exclusively. Rather than design a bunch of scenarios by hand, Bullfrog wrote an algorithm to procedurally generate 500 different “worlds” for play against a computer opponent whose artificial intelligence also had to be created from scratch during this period. This method of content creation, used most famously by Ian Bell and David Braben in Elite, was something of a specialty and signpost of British game designers, who, plagued by hardware limitations far more stringent than their counterparts in the United States, often used it as a way to minimize the space their games consumed in memory and on disk. Most recently, Geoff Crammond’s hit game The Sentinel, published by Firebird, had used a similar scheme. Glenn Corpes believes it may have been an EA executive named Joss Ellis who first suggested it to Bullfrog.

Populous‘s implementation is fairly typical of the form. Each of the 500 worlds except the first is protected by a password that is, like everything else, itself procedurally generated. When you win at a given level, you’re given the password to a higher, harder level; whether and how many levels you get to skip is determined by how resounding a victory you’ve just managed. It’s a clever scheme, packing a hell of a lot of potential gameplay onto a single floppy disk and even making an effort to avoid boring the good player — and all without forcing Bullfrog to deal with the complications of actually storing any state whatsoever onto disk.
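Bullfrog’s actual algorithm isn’t spelled out anywhere in the record, but the general shape of such a scheme is easy to sketch: both a world and its password are derived deterministically from the level number, so nothing ever needs to be written to disk. A hypothetical illustration in Python (the function names, hash choice, and letter-mapping are all invented for the example, not Bullfrog’s):

```python
import hashlib

def level_password(level: int, salt: str = "POPULOUS") -> str:
    """Derive a deterministic password for a level number.

    Hypothetical scheme: hash the level number with a fixed salt, then
    map the digest onto uppercase letters. The same level always yields
    the same password, so none need to be stored anywhere.
    """
    digest = hashlib.sha1(f"{salt}:{level}".encode()).hexdigest()
    return "".join(chr(ord("A") + int(c, 16) % 26) for c in digest[:8])

def next_level(current: int, margin: float) -> int:
    """Advance further up the 500-level ladder after a bigger win.

    margin is a victory margin in 0..1; a resounding win skips more
    levels, echoing the game's reward for decisive play.
    """
    return min(500, current + 1 + int(margin * 4))
```

The same trick works for the worlds themselves: seed a terrain generator with the level number and the map regenerates identically every time it’s requested, which is how 500 playable worlds fit on one floppy.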

It inevitably all comes down to a frantic final free-for-all between your people and those of your enemy.

Given their previous failures, Bullfrog understandably wasn’t the most confident group when a well-known British games journalist named Bob Wade, who had already played a pre-release version of the game, came by for a visit. For hours, Molyneux remained too insecure to actually ask Wade the all-important question of what he thought of the game. At last, after Wade had joined the gang for “God knows how many” pints at their local, Molyneux worked up the courage to pop the question. Wade replied that it was the best game he’d ever played, and he couldn’t wait to get back to it — prompting Molyneux to think he must have made some sort of mistake, and that under no circumstances should he be allowed to play another minute of it in case his opinion should change. It was Wade and the magazine he was writing for at the time, ACE (Advanced Computer Entertainment), who coined the term “god game” in the glowing review that followed, the first trickle of a deluge of praise from the gaming press in Britain and, soon enough, much of the world.

Bullfrog’s first royalty check for Populous was for a modest £13,000. Their next was for £250,000, prompting a naive Les Edgar to call Electronic Arts about it, sure it was a mistake. It was no mistake; Populous alone reportedly accounted for one-third of EA’s revenue during its first year on the market. That Bullfrog wasn’t getting even bigger checks was a sign only of the extremely unfavorable deal they’d signed with EA from their position of weakness. Populous finally and definitively ended the now 30-year-old Peter Molyneux’s long run of obscurity and failure at everything he attempted. In his words, he went overnight from “urinating in the sink” and “owing more money than I could ever imagine paying back” to “an incredible life” in games. Port after port came out for the next couple of years, each of them becoming a bestseller on its platform. Populous was selected to become one of the launch titles for the Super Nintendo console in Japan, spawning a full-blown fad there that came to encompass comic books, tee-shirts, collectibles, and even a symphony concert. When they visited Japan for the first time on a promotional tour, Molyneux and Les Edgar were treated like… well, appropriately enough, like gods. Populous sold 3 million copies in all according to some reports, an almost inconceivable figure for a game during this period.

Amidst all its other achievements, Populous was also something of a pioneer in the realm of e-sports. The One magazine and Electronic Arts hosted a tournament to find the best player in Britain.

While a relatively small percentage of Populous players played online, those who did became pioneers of sorts in their own right. Some bulletin-board systems set up matchmaking services to pair up players looking for a game, any time, day or night; the resulting connections sometimes spanned national borders or even oceans. The matchmakers were aided greatly by Bullfrog’s forward-thinking decision to make all versions of Populous compatible with one another in terms of online play. In making it so quick and easy to find an online opponent, these services prefigured the modern world of Internet-enabled online gaming. Molyneux pronounced them “pretty amazing,” and at the time they really were. In 1992, he spoke excitedly of a recent trip to Japan, where he’d seen a town “with 10,000 homes all linked together. You can play games with anybody in the place. It’s enormous, really enormous, and it’s growing.” If only he’d known what online gaming would grow into in the next decade or two…

A youngster named Andrew Reader wound up winning the tournament, only to get trounced in an exhibition match by the master, Peter Molyneux himself. There was talk of televising a follow-up tournament on Sky TV, but it doesn’t appear to have happened.

The original Amiga version of Populous had been released all but simultaneously with the Amiga version of SimCity. Press and public alike immediately linked the two games together; AmigaWorld magazine, for instance, went so far as to review them jointly in a single article. Both Will Wright of SimCity fame and Peter Molyneux were repeatedly asked in interviews whether they’d played the other’s game. Wright was polite but, one senses, a little uninterested in Populous, saying he “liked the idea of playing God and having a population follow you,” but “sort of wish they’d gone for a slightly more educational angle.” Molyneux was much more enthusiastic about his American counterpart’s work, repeatedly floating a scheme to somehow link the two games together in more literal fashion for online play. He claimed at one point that Maxis (developers of SimCity) and his own Bullfrog had agreed on a liaison “to go backwards and forwards” between their two companies to work on linking their games. The liaison, he claimed, had “the Populous landscape moving to and from SimCity,” and a finished product would be out sometime in 1992. Like quite a number of the more unbelievable schemes Molyneux has floated over the years, it never happened.

The idea of a linkage between SimCity and Populous, whether taking place online or in the minds of press and public, can seem on the face of it an exceedingly strange one today. How would the online linkage actually work anyway? Would the little Medieval warriors from Populous suddenly start attacking SimCity‘s peaceful modern utopias? Or would Wright’s Sims plop themselves down in the middle of Molyneux’s apocalyptic battles and start building stadiums and power plants? These were very different games: Wright’s a noncompetitive, peaceful exercise in urban planning with strong overtones of edutainment; Molyneux’s a zero-sum game of genocidal warfare that aspired to nothing beyond entertainment. Knowing as we do today the future paths of these two designers — i.e., ever further in the directions laid down by these their first significant works — only heightens the seeming dichotomy.

That said, there actually were and are good reasons to think of SimCity and Populous as two sides of the same coin. For us today, the list includes first of all the reasons of simple historical concordance. Each marks the coming-out party of one of the most important game designers of all time, occurring within bare weeks of one another.

But of course the long-term importance of these two designers to their field wasn’t yet evident in 1989; obviously players were responding to something else in associating their games with one another. Once you stripped away their very different surface trappings and personalities, the very similar set of innovations at the heart of each was laid bare. AmigaWorld said it very well in that joint review: “The real joy of these programs is the interlocking relationships. Sure, you’re a creator, but even more a facilitator, influencer, and stage-setter for little computer people who act on your wishes in their own time and fashion.” It’s no coincidence that, just as Peter Molyneux was partly inspired by an ant hill to create Populous, one of Will Wright’s projects of the near future would be the virtual ant farm SimAnt. In creating the first two god games, the two were indeed implementing a very similar core idea, albeit each in his own very different way.

Joel Billings of SSI, the king of American strategy games, had founded his company back in 1979 with the explicit goal of making computerized versions of the board games he loved. SimCity and Populous can be seen as the point when computer strategy games transcended that traditional approach. The real-time nature of these games makes them impossible to conceive of as anything other than computer-based works, while their emergent complexity makes them objects of endless fascination for their designers as much as or even more than for their players.

In winning so many awards and entrancing so many players for so long, SimCity and Populous undoubtedly benefited hugely from their sheer novelty. Their flaws stand out more clearly today. With its low-resolution graphics and without the aid of modern niceties like tool tips and graphical overlays, SimCity struggles to find ways to communicate vital information about what your city is really doing and why, making the game into something of an unsatisfying black box unless and until you devote a lot of time and effort to understanding what affects what. Populous has many of the same interface frustrations, along with other problems that feel still more fundamental and intractable, especially if you, like the vast majority of players back in its day, experience it through its single-player Conquest Mode. Clever as they are, the procedurally generated levels combined with the fairly rudimentary artificial intelligence of your computer opponent introduce a lot of infelicities. Eventually you begin to realize that one level is pretty much the same as any other; you just need to execute the same set of strategies and tactics more efficiently to have success at the higher levels.

Both Will Wright and Peter Molyneux are firm adherents to the experimental, boundary-pushing school of game design — an approach that yields innovative games but not necessarily holistically good games every time out. And indeed, throughout his long career each of them has produced at least as many misses as hits, even if we dismiss the complaints of curmudgeons like me and lump SimCity and Populous into the category of the hits. Both designers have often fallen into the trap, if trap it be, of making games that are more interesting for creators and commentators than they are fun for actual players. And certainly both have, like all of us, their own blind spots: in relying so heavily on scientific literature to inform his games, Wright has often produced end results with something of the feel of a textbook, while Molyneux has often lacked the discipline and gravitas to fully deliver on his most grandiose schemes.

But you know what? It really doesn’t matter. We need our innovative experimentalists to blaze new trails, just as we need our more sober, holistically-minded designers to exploit the terrain they discover. SimCity and Populous would be followed by decades of games that built on the possibilities they revealed — many of which I’d frankly prefer to play today than these two original ground-breakers. But, again, that reality doesn’t mean we should celebrate SimCity and Populous one iota less, for both resoundingly pass the test of historical significance. The world of gaming would be a much poorer place without Will Wright and Peter Molyneux and their first living worlds inside a box.

(Sources: The Official Strategy Guide for Populous and Populous II by Laurence Scotford; Master Populous: Blueprints for World Power by Clayton Walnum; Amazing Computing of October 1989; Next Generation of November 1998; PC Review of July 1992; The One of April 1989, September 1989, and May 1991; Retro Gamer 44; AmigaWorld of December 1987, June 1989, and November 1989; The Games Machine of November 1988; ACE of April 1989; the bonus content to the film From Bedrooms to Billions. Archived online sources include features on Peter Molyneux and Bullfrog for Wired Online, GameSpot, and Edge Online. Finally, Molyneux’s postmortem on Populous at the 2011 Game Developers Conference.

Populous is available for purchase from GOG.com.)

 
