The Ratings Game, Part 3: Dueling Standards

When Sega, Nintendo, and the Software Publishers Association (SPA) announced just before the Senate hearing of December 9, 1993, that they had agreed in principle to create a standardized rating system for videogames, the timing alone marked it as an obvious ploy to deflect some of the heat that was bound to come their way later that day. At the same time, though, it was also more than a ploy: it was in fact the culmination of an effort that had been underway in some quarters of the industry for months already, one which had begun well before the good Senators Lieberman and Kohl discovered the horrors of videogame violence and sex. As Bill White of Sega was at pains to point out throughout the hearing, Sega had been seriously engaged with the question of a rating system for quite some time, and had managed to secure promises of support from a considerable portion of the industry. But the one entity that had absolutely rejected the notion was the very one whose buy-in was most essential for any overarching initiative of this sort: Nintendo. “Howard [Lincoln] was not going to be part of any group created by Sega,” laughs Dr. Arthur Pober, one of the experts whom Sega consulted.

So, Sega decided to go it alone. Again as described by Bill White at the hearing, they rolled out a thoroughly worked-out rating system for any and all games on their platforms just in time for Mortal Kombat in September of 1993. It divided games into three categories: GA for general audiences, MA-13 for those age thirteen or older, and MA-17 for those age seventeen or older. An independent board of experts was drafted to assign each new game its rating without interference from Sega’s corporate headquarters; its chairman was the aforementioned Arthur Pober, a distinguished educational psychologist with decades of research experience about the role of media in children’s lives on his CV. Under his stewardship, Mortal Kombat wound up with an MA-13 rating; Night Trap, which had already been in stores for the better part of a year by that point, was retroactively assigned a rating of MA-17.

Although one might certainly quibble that these ratings reflected the American media establishment’s terror of sex and relatively blasé attitude toward violence, Sega’s rating system bore all the outward signs of being a good-faith exercise. At the very least it was, as White repeatedly stated at the hearing, a good first step, one that was taken before any of the real controversy even began.

The second step was of course Nintendo’s grudging acquiescence to the concept of a universal rating system on the day of the hearing — a capitulation whose significance should not be underestimated in light of the company’s usual attitude toward intra-industry cooperation, which might be aptly summarized as “our way or the highway.” And the third step came less than a month later, at the 1994 Winter Consumer Electronics Show, which in accordance with long tradition took place over the first week of the new year in Las Vegas.

Anyone wandering the floor at this latest edition of CES would have seen a digital-games industry that was more fiercely competitive than ever. Sega, celebrating a recent report that gave them for the first time a slight edge over Nintendo in overall market share, had several attention-grabbing new products on offer, including the latest of their hugely popular Sonic the Hedgehog games; the Activator, an early attempt at a virtual-reality controller; the CDX, a portable CD player that could also be used as a game console; and, most presciently of all, a partnership with AT&T to bring online multiplayer gaming, including voice communication, to the Genesis. Meanwhile Nintendo gave the first hints about what would see the light of day some 30 months later as the Nintendo 64. And other companies were still trying to muscle their way into the bifurcated milieu of the living-room consoles. Among them were Atari, looking for a second shot at videogame glory with their Jaguar console; Philips, still flogging the dead horse known as CD-I; and a well-financed new company known as 3DO, with a console that bore the same name. Many traditional makers of business-oriented computers were suddenly trying to reach many of the same consumers, through products like Compaq’s new home-oriented Presario line; even stodgy old WordPerfect was introducing a line of entertainment and educational software. Little spirit of cooperation was in evidence amidst any of this. With “multimedia” the buzzword of the zeitgeist, the World Wide Web looming on the near horizon, and no clarity whatsoever about what direction digital technology in the home was likely to take over the next few years, the competition in the space was as cutthroat as it had ever been.

And yet in a far less glitzy back room of the conference center, all of these folks and more met to discuss the biggest cooperative initiative ever proposed for their industry, prompted by the ultimatum they had so recently been given by Senators Lieberman and Kohl: “Come up with a rating system for yourself, or we’ll do it for you.” The meeting was organized by the SPA, which had the virtue of not being any of the arch-rival console makers, and was thus presumably able to evince a degree of impartiality. “Companies such as 3DO, Atari, Acclaim, id Software, and Apogee already have rating systems,” said Ken Wasch, the longstanding head of the SPA, to open the proceedings. “But a proliferation of rating systems is confusing to retailers and consumers alike. Even before this became an issue in the halls of Congress or in the media, there was a growing belief that we needed a single, easily recognizable system to rate and label our products.”

But the SPA lost control of the meeting almost from the moment Wasch stepped down from the podium. The industry was extremely fortunate that neither Senator Lieberman nor Kohl took said organization up on an invitation to attend in person. One participant remembers the meeting consisting mostly of “people sitting around a table screaming and carrying on.” Cries of “Censorship!” and “Screw ’em! We’ll make the games we want to make!” dominated for long stretches. Many regarded the very notion of a rating system as an unacceptable intrusion by holier-than-thou bureaucrats; they wanted to call what they insisted was the senators’ bluff, to force them to put up actual government legislation — legislation whose constitutionality would be highly questionable — or to shut up about it.

Yet such advocates of the principle of free speech over all other concerns weren’t the sum total of the problem. Even many of those who felt that a rating system was probably necessary were thoroughly unimpressed with the hosts of the meeting, and not much disposed to fall meekly into line behind them.

The hard reality was that the SPA had never been viewed as a terribly effectual organization. Formed in 1984 — i.e., just after the Great Videogame Crash — to be the voice of the computer-software industry, it had occupied itself mostly with anti-piracy campaigns and an annual awards banquet in the years since. The return of a viable console marketplace in the form of the Nintendo Entertainment System and later the Sega Genesis had left it in an odd position. Most of the publishers of computer games who began moving some or all of their output to the consoles were members of the SPA, and through them the SPA itself got pulled into this brave new world. But there were certainly grounds to question whether the organization’s remit really ought to involve the console marketplace at all. Were the likes of Acclaim, the publisher of console-based videogames like Mortal Kombat, truly in the same business as such other SPA members as the business-software titans Microsoft and WordPerfect? Nintendo had always pointedly ignored the SPA; Sega had joined as a gesture of goodwill to their outside publishers who were also members, but hardly regarded it as a major part of their corporate strategy. In addition to being judged slow, bureaucratic, and uncreative, the SPA was regarded by everyone involved with the consoles as being much more invested in computer software of all stripes than in console-based videogames. And with computer games representing at best fifteen percent of the overall digital-games market, that alone struck them as a disqualifier for spearheading an initiative like this one.

Electronic Arts, the largest of all of the American game publishers, was in an interesting position here. Founded in 1983 to publish games exclusively for computers, EA had begun moving onto consoles in a big way at the dawn of the 1990s, scoring hits there with such games as the first installments in the evergreen John Madden Football series. By the beginning of 1994, console games made up over two-thirds of their total business.

A senior vice president at EA by the name of Jack Heistand felt that an industry-wide rating system was “the right thing to do. I really believed in my heart that we needed to communicate to parents what the content was inside games.” Yet he also felt convinced from long experience that the SPA was hopelessly ill-equipped for a project of this magnitude, and the disheartening meeting which the SPA tried to lead at CES only cemented that belief. So, immediately after the meeting was over, he approached EA’s CEO Larry Probst with a proposal: “Let’s get all the [other] CEOs together to form an industry association. I will chair it.” Probst readily agreed.

Jack Heistand

The SPA was not included in this other, secret meeting, even though it convened at that same CES. Its participants rather included a representative from each of the five manufacturers of currently or potentially viable consoles: Sega, Nintendo, Atari, Philips, and 3DO. Rounding out their numbers were two videogame-software publishers: Acclaim Entertainment of Mortal Kombat fame and of course Electronic Arts. With none of the console makers willing to accept one of their rivals as chairman of the new steering committee, they soon voted to bestow the role upon Jack Heistand, just as he had planned it.

Sega, convinced of the worthiness of their own rating system, would have happily brought the entirety of the industry under its broad tent and been done with it, but this Nintendo’s pride would never allow. It became clear as soon as talks began, if it hadn’t been already, that whatever came next would have to be built from scratch. With Senators Lieberman and Kohl breathing down their necks, they would all have to find a way to come together, and they would have to do so quickly. The conspirators agreed upon an audacious timetable indeed: they wanted to have a rating system in place for all games that shipped after October 31, 1994 — just in time, in other words, for the next Christmas buying season. It was a tall order, but they knew that they would be able to force wayward game publishers to comply if they could only get their own house in order, thanks to the fact that all of the console makers in the group employed the walled-garden approach to software: all required licenses to publish on their platforms, meaning they could dictate which games would and would not appear there. They could thus force a rating system to become a ubiquitous reality simply by pledging not to allow any games on their consoles which didn’t include a rating.

On February 3, 1994, Senator Lieberman introduced the “Video Game Rating Act” to the United States Senate, stipulating that an “Interactive Entertainment Rating Commission” should be established, with five members appointed by President Bill Clinton himself; this temporary commission would be tasked with founding a new permanent governmental body to do what the industry had so far not been willing to do for itself. Shortly thereafter, Representative Tom Lantos, a Democrat from California, introduced parallel legislation in the House. Everyone involved made it clear, however, that they would be willing to scrap their legislation if the industry could demonstrate to their satisfaction that it was now addressing the problem itself. Lieberman, Kohl, and Lantos were all pleased when Sega dropped Night Trap from their product line as a sort of gesture of good faith; the controversial game had never been a particularly big seller, and had now become far more trouble than it was worth. (Mortal Kombat, on the other hand, was still posting sales that made it worth the controversy…)

On March 4, 1994, three representatives of the videogame industry appeared before Lieberman, Kohl, and Lantos at a hearing that was billed as a “progress report.” The only participant in the fractious hearing of three months before who returned for this one was Howard Lincoln of Nintendo, who had established something of a rapport with Senator Lieberman on that earlier occasion. Sega kept Bill White, who most definitely had not, well away, sending instead a white-haired senior vice president named Edward Volkwein. But most of the talking was done by the industry’s third representative, Jack Heistand. His overriding goal was to convince the lawmakers that he and his colleagues were moving as rapidly as possible toward a consistent industry-wide rating system, and should be allowed the balance of the year to complete their work before any legislation went forward. He accordingly emphasized over and over that ratings would appear on the boxes of all new videogames released after October 31.

The shift in tone from the one hearing to the next was striking; this one was a much more relaxed, even collegial affair than last time out. Lieberman, Kohl, and Lantos all praised the industry’s efforts so far, and kept the “think of the children!” rhetoric to a minimum in favor of asking practical questions about how the rating system would be implemented. “I don’t need to get into that argument again,” said Senator Lieberman when disagreements over the probability of a linkage between videogame violence and real-world aggression briefly threatened to ruin the good vibe in the room.

“I think you’re doing great,” said Senator Kohl at the end of the hearing. “It’s a wonderful start. I really am very pleased.” Mission accomplished: Heistand had bought himself enough time to either succeed or fail before the heavy hand of government came back on the scene.



Heistand’s remit was rapidly growing into something much more all-encompassing than just a content-rating board. To view his progress was to witness nothing less than an industry waking up to its shared potential and its shared problems. As I’ve already noted, the videogame industry as a whole had long been dissatisfied with its degree of representation in the SPA, as well as with the latter’s overall competence as a trade organization. This, it suddenly realized, was a chance to remedy that. Why not harness the spirit of cooperation that was in the air to create an alternative to the SPA that would focus solely on the needs of videogame makers? Once that was done, this new trade organization could tackle the issue of a rating system as just the first of many missions.

The Interactive Digital Software Association (IDSA) was officially founded in April of 1994. Its initial members included Acclaim, Atari, Capcom, Crystal Dynamics, Electronic Arts, Konami, Nintendo, Philips, Sega, Sony, Viacom, and Virgin, companies whose combined sales made up no less than 60 percent of the whole videogame industry. Its founding chairman was Jack Heistand, and its first assigned task was the creation of an independent Entertainment Software Rating Board (ESRB).

Heistand managed to convince Nintendo and the others to accept the man who had chaired Sega’s ratings board for the same role in the industry-wide system. Arthur Pober had a reputation for being, as Heistand puts it, “very honorable. A man of integrity.” “Arthur was the perfect guy,” says Tom Kalinske, then the president and CEO of Sega of America. “He had good relationships inside of the education world, inside of the child-development world, and knew the proper child psychologists and sociologists. Plus, we knew he could do it — because he had already done it for us!”

Neutral parties like Pober helped to ease some of the tension that inevitably sprang up any time so many fierce competitors were in the room together. Heistand extracted a promise from everyone not to talk publicly about their work here — a necessary measure given that Howard Lincoln and Tom Kalinske normally used each and every occasion that offered itself to advance their own company and disparage their rival. (Witness Lincoln’s performance at the hearing of December 9…)

Over the course of the next several months, the board hammered out a rating system that was more granular and detailed than the one Sega had been using. It divided games into five rather than three categories: “Early Childhood” (EC) for children as young as age three; “Kids to Adults” (K-A) for anyone six years of age or older; “Teen” (T) for those thirteen or older; “Mature” (M) for those seventeen or older; and “Adults Only” (AO) for those eighteen or older. It was not a coincidence that these ratings corresponded fairly closely to the movie industry’s ratings of G, PG, PG-13, R, and NC-17. A team of graphic artists came up with easily recognizable icons for each of the categories — icons which proved so well-designed for their purpose that most of them are still used to this day.

The original slate of ESRB icons. Since 1994, remarkably few changes have been made: the “Kids to Adults” category has been renamed “Everyone,” and a sixth category of games suitable for those ten years and older, known in the rating system’s nomenclature as “Everyone 10+,” has been added.

The ESRB itself was founded as a New York-based non-profit. Each game would be submitted to it in the form of a videotape of 30 to 40 minutes in length, which had to contain the game’s most “extreme” content. The board would then assign the game to one of its teams of three reviewers, all of whom were trained and overseen by the ESRB under the close scrutiny of Arthur Pober. The reviewers were allowed to have no financial or personal ties to the videogame industry, and were hired with an eye to demographic diversity: an example which Heistand gave of an ideal panel consisted of a retired black male elementary-school principal, a 35-year-old white full-time mother of two, and a 22-year-old white male law student. A measure of checks and balances was built into the process: publishers would have the chance to appeal ratings with which they disagreed, and all rated games would have to pass a final audit a week before release to ensure that the videotape which had been submitted had been sufficiently representative of the overall experience. The ESRB aimed to begin accepting videotapes on September 1, 1994, in keeping with the promise that all games released after October 31 would have a rating on the box. Everything was coming together with impressive speed.

But as Heistand prepared to return to Washington to report all of this latest progress on July 29, 1994, there remained one part of the games industry which had not fallen into line. The SPA was not at all pleased by the creation of a competing trade association, nor by having the rug pulled out from under its own rating initiative. And the computer-game makers among its members didn’t face the same compulsion to accept the ESRB’s system, given that they published on open platforms with no gatekeepers.



The relationship between computer games and their console-based brethren had always been more complicated than outsiders such as Senators Lieberman and Kohl were wont to assume. While the degree of crossover between the two had always been considerable, computer gaming had been in many ways a distinct form of media in its own right since the late 1970s. Computer-game makers claimed that their works were more sophisticated forms of entertainment, with more variety in terms of theme and subject matter and, in many cases, more complex and cerebral forms of gameplay on offer. They had watched the resurrection of the console marketplace with as much dismay as joy, being unimpressed by what many of them saw as the dumbed-down “kiddie aesthetic” of Nintendo and the stultifying effect which the consoles’ walled gardens had on creativity; there was a real feeling that the success of Nintendo and its ilk had come at the cost of a more diverse and interesting future for interactive entertainment as a whole. Perhaps most of all, computer-game makers and their older-skewing demographic of players profoundly resented the wider culture’s view of digital games of any stripe as essentially children’s toys, to be regulated in the same way that one regulated Barbie dolls and Hot Wheels cars. These resentments had not disappeared even as many of the larger traditional computer-game publishers, such as EA, had been tempted by the booming market for console-based videogames into making products for those systems as well.

Johnny L. Wilson, the editor-in-chief of Computer Gaming World magazine, voiced in an editorial the objections which many who made or played computer games had to the ESRB:

[The ESRB rating system] has been developed by videogame manufacturers and videogame publishers without significant input by computer-based publishers. The lone exception to this rule is Electronic Arts, which publishes personal-computer titles but nets more than two-thirds of its proceeds from videogame sales. The plan advocated by this group of videogame-oriented companies calls for every game to be viewed by an independent panel prior to release. This independent panel would consist of parents, child psychologists, and educators.

How does this hurt you? This panel is not going to understand that you are a largely adult audience. They are not going to perceive that there is a marketplace of mature gamers. Everything they evaluate will be examined under the rubric, “Is it good for children?” As a result, many of the games covered in Computer Gaming World will be rated as unsuitable for children, and many retailers will refuse to handle these games because they perceive themselves as family-oriented stores and cannot sell unsuitable merchandise.

The fate of Night Trap, an unusually “computer-like” console game, struck people like Wilson as an ominous example of how rating games could lead to censoring them.

Honestly held if debatable opinions like the above, combined perhaps with pettier resentments over the stratospheric sales of console games in comparison to computer games and over the SPA’s own sidelining by the IDSA, led the SPA to reject the ESRB, and to announce the formation of its own ratings board just for computer games. It was to be called the Recreational Software Advisory Council (RSAC), and its founding president was to be Robert Roden, the general counsel and director of business affairs for the computer-game publisher LucasArts. This choice of an industry insider rather than an outside expert like Arthur Pober reflected much of what was questionable about the alternative rating initiative.

Indeed, and although much of the reasoning used to justify a competing standard was cogent enough, the RSAC’s actual plan for its rating process was remarkable mostly for how comprehensively it failed to address the senators’ most frequently stated concerns about any self-imposed rating standard. Instead of asking publishers to submit videotapes of gameplay for review by an independent panel, the RSAC merely provided them with a highly subjective questionnaire to fill out; in effect, it allowed them to “self-rate” their own games. And, in a reflection of computer-game makers’ extreme sensitivity to any insinuation that their creations were just kids’ stuff, the RSAC rejected outright any form of age-based content rating. Age-based rating systems were “patronizing,” claimed the noted RSAC booster Johnny L. Wilson, because “different people of widely disparate ages have different perceptions of what is appropriate.” In lieu of sorting ratings by age groups, the RSAC would use descriptive labels stipulating the amount and type of violence, sex, and profanity, with each being ranked on a scale from zero to four.

The movie industry’s rating system was an obvious counterexample to this idea that age-based classification must necessarily entail the infantilization of art; certainly cinema still enjoyed vastly more cultural cachet than computer games, despite its own longstanding embrace of just such a system. But the computer-game makers were, it would seem, fairly blinded by their own insecurities and resentments.

A representative of the SPA named Mark Traphagen was invited to join Jack Heistand at the hearing of July 29 in order to make the case for the RSAC’s approach to rating computer games. The hearing began in an inauspicious fashion for him. Senator Lieberman, it emerged during opening statements, had discovered id Software’s hyper-violent computer game DOOM in the interim between the last hearing and this one. This occasion thus came to mark the game’s coming-out party on the national stage. For the first but by no means the last time, a politician showed a clip of it in action, then lit into what the audience had just seen.

What you see there is an individual with a successive round of weapons — a handgun, machine gun, chainsaw — just continuing to attack targets. The bloodshed, the gunfire, and the increasingly realistic imagery combine to create a game that I would not want my daughter or any other child to see or to play.

What you have not seen is some of the language that is displayed onscreen when the game is about to be played. “Act like a man!” the player is told. “Slap a few shells into your shotgun and let’s kick some demonic butt! You’ll probably end up in Hell eventually. Shouldn’t you know your way around before you make an extended visit?”

Well, some may say this is funny, but I think it sends the wrong message to our kids. The game’s skill levels include “I’m Too Young To Die” and “Hurt Me Plenty.” That obviously is not the message parents want their kids to hear.

Mark Traphagen received quite a grilling from Lieberman for the patent failings of the RSAC self-rating system. He did the best he could, whilst struggling to educate his interrogators on the differences between computer and console games. He argued that the two were in effect entirely different industries — despite the fact that many software publishers were, as we’ve seen, active in both. This was an interesting stand to take, not least in the way that it effectively ceded the ground of console-based software to the newly instituted IDSA, in the hope that the SPA could hang onto computer games.

Traphagen: Despite popular misconceptions and their admitted similarities to consumers, there are major differences between the personal-computer-software industry and the videogame industry. While personal-computer software and videogame software may be converging toward the compact disc as the preferred storage medium, those of us who develop and publish entertainment software see no signs of a convergence in either product development or marketing.

The personal-computer-software industry is primarily U.S.-based, small to medium in size, entrepreneurial, and highly innovative. Like our plan to rate software, it is based on openness. Its products run on open-platform computers and can be produced by any of thousands of companies of different sizes, without restrictive licensing agreements. There is intense competition between our industry and the videogame industry, marked by the great uncertainty about whether personal computers or some closed platform will prevail in the forthcoming “information superhighway.”

Senator Lieberman: Maybe you should define what a closed platform is in this regard.

Traphagen: A closed platform, Senator, is one in which the ability to create software that will run on that particular equipment is controlled by licensing agreements. In order to create software that will run on those platforms, one has to have the permission and consent of the equipment manufacturer.

Senator Lieberman: And give us an example of that.

Traphagen: A closed platform would be a videogame player.

Senator Lieberman: Such as a Sega or Nintendo?

Traphagen: That is right. In contrast, personal computers are an open platform in which any number of different companies can simply buy a development package at a retailer or a specialty store and then create software that will operate on the computer.

Traphagen explained the unwillingness of computer-game makers to fall under the thumb of the IDSA by comparing them to indie film studios attempting to negotiate the Hollywood machine. Yet he was able to offer little in defense of the RSAC’s chosen method of rating games. He made the dubious claim that creating a videotape for independent evaluation would be too technically burdensome on a small studio, and had even less to offer when asked what advantage accrued to not rating games by suitable age groups: “I do not believe there is an advantage, Senator. There was simply a decision that was taken that the ratings would be as informative as possible, without being judgmental.”

Some five weeks after this hearing, the RSAC would hold a press conference in Dallas, Texas, the home of id Software of DOOM fame. In fact, that game was used to illustrate how the rating system would work. Even some of the more sanguine members of the gaming press were surprised when it received a rating of just three out of four for violence. The difference maker, the RSAC representatives explained, was the fact that DOOM‘s violence wasn’t “gratuitous”; the monsters were trying to kill you, so you had no choice but to kill them. One has to presume that Senators Lieberman and Kohl would not have been impressed, and that Mark Traphagen was profoundly thankful that the press conference occurred after his appearance before them.

Even as it was, the senators’ skepticism toward the RSAC’s rating system at the hearing stood out all the more in contrast to their reception of the ESRB’s plan. The relationship between Senator Lieberman and Jack Heistand had now progressed from the cordial to the downright genial; the two men, now on a first-name basis, even made room for some banter on Heistand’s abortive youthful attempts to become a rock star. The specter of government legislation was never even raised to Heistand. It was, needless to say, a completely different atmosphere from the one of December 9. When the hearing was finished, both sides sent out press notices praising the wisdom and can-do spirit of the other in glowing terms.

But much of the rest of the games industry showed far less good grace. As the summer became the fall and it became clear that game ratings really were happening, the rants began, complete with overheated references to Fahrenheit 451 and all of the other usual suspects. Larry O’Brien, the editor of the new Game Developer magazine, made his position clear in the first line of his editorial: “Rating systems are crap.”

With the entire entertainment industry rolling over whenever Congress calls a hearing, it’s fallen on us to denounce these initiatives for what they are: cynical posturing and electioneering with no substance. Rating systems, whether for movies, television, videogames, or any other form of communication, don’t work, cost money, and impede creativity. Everyone at those hearings, politicians and witnesses alike, knows that. But there’s nothing politicians love more than “standing up for the family” and blaming America’s cultural violence on Hollywood. So the entertainment industry submissively pisses all over itself and proposes “voluntary” systems from the pathetic to the laughable.

Parents should decide. If parents don’t want their kids to play X-COM or see Terminator 2, they should say no and put up with the ensuing argument. They don’t need and shouldn’t get a rating system to supplement their authority. The government has no right to help parents say no at the video store if that governmental interference impedes your right to develop whatever content you feel appropriate.

We all have responsibilities. To create responsibly, to control the viewing and gaming habits of our own children, and to call the government’s ratings initiatives what they are: cynical, ineffective, and wrong-headed.

The libertarian-leaning Wired magazine, that voice of cyber-futurism, published a jeremiad from Rogier Van Bakel that was equally strident.

Violent games such as DOOM, Night Trap, and Mortal Kombat are corrupting the minds and morals of millions of American children. So what do you do? Easy.

You elect people like Herb Kohl and Joe Lieberman to the US Senate. You applaud them when they tell the videogame industry that it’s made up of irrepressible purveyors of gratuitous gore and nefarious nudity. You nod contentedly when the senators give the industry an ultimatum: “Either you start rating and stickering your games real soon, or we, the government, will do it for you.”

You are pleasantly surprised by the industry’s immediate white flag: a rating system that is almost as detailed as the FDA-mandated nutrition information on a can of Campbell’s. You contend that that is, in fact, a perfect analogy: all you want, as a consumer, is honest product labeling. Campbell’s equals Sega equals Kraft equals 3DO.

Finally, you shrug when someone remarks that it may not be a good idea to equate soup with freedom of speech.

All that was needed now was a good conspiracy theory. This Karen Crowther, a spokesperson for makers of shareware computer games, helpfully provided, saying that the government had gotten “hoodwinked by a bunch of foreign billion-dollar corporations (such as Sony, Nintendo, and Sega) out to crush their US competition.”

Robert Peck, a lawyer for the American Civil Liberties Union, flirted with a legal challenge:

This [rating] system is a response to the threat of Senators Lieberman and Kohl that they would enact legislation requiring labels unless the industry did something to preempt them. The game manufacturers are being required to engage in speech that they would otherwise not engage in. These ratings have the government’s fingerprints all over them.

This present labeling system isn’t going to be the end of it. I think some games are going to be negatively affected, sales-wise, and the producers of those games will probably bring a lawsuit. We will then see that this system will be invalidated.

The above bears a distinct whiff of legalistic wishful thinking; none of it came to pass.

While voices like these ranted and raved, Jack Heistand, Arthur Pober, and their associates buckled down soberly to the non-trivial task of putting a rating on all new console-based videogames that holiday season, and succeeded in doing so with an efficiency that one has to admire, regardless of one’s position on the need for such a system. Once the initial shock to the media ecosystem subsided, even some of the naysayers began to see the value in the ESRB’s work.

Under the cover of the rating system, for example, Nintendo felt able to relax many of their strict “family-friendly” content policies. The second “Mortal Monday,” heralding the release of Mortal Kombat II on home consoles, came in September of 1994, before the ESRB’s icons had even started to appear on games. Nevertheless, Nintendo improvised a stopgap badge labeling the game unsuitable for those under the age of seventeen, and felt protected enough by it to allow the full version of the coin-op original on their platform this time, complete with even more blood and gore than its predecessor. It was an early sign that content ratings might, rather than leading game makers to censor themselves, give them a feeling of carte blanche to be more extreme.

By 1997, Game Developer was no longer railing against the very idea of a rating system, but was fretting instead over whether the ESRB’s existing approach was looking hard enough at the ever more lifelike violence made possible by the latest graphics hardware. The magazine worried about unscrupulous publishers submitting videotapes that did not contain their games’ most extreme content, and the ESRB failing to catch on to this as games continued to grow larger and larger: “The ESRB system uses three (count ’em, three) ‘demographically diverse’ people to rate a game. (And I thought television’s Nielsen rating system used a small sample set.) As the stakes go up in the ratings game, the threat of a publisher abusing our rating system grows larger and larger.”

Meanwhile the RSAC strolled along in a more shambolic manner, stickering games here and there, but never getting anything close to the complete buy-in from computer-game publishers that the ESRB received from console publishers. These respective patterns held throughout the five years in which the dueling standards existed.

In the end, in other words, the computer-game people got what they had really wanted all along: a continuing lack of any concerted examination of the content of their works. Some computer games did appear with the ESRB icons on their boxes, others with the RSAC schemas, but plenty more bothered to include no content guidance at all. Satisfied for the time being with the ESRB, Senators Lieberman and Kohl didn’t call any more hearings, allowing the less satisfying RSAC system to slip under the radar along with the distinct minority of digital games to which it was applied, even as computer games like Duke Nukem 3D raised the bar for violence far beyond the standard set by DOOM. The content of computer games wouldn’t suffer serious outside scrutiny again until 1999, the year that a pair of rabid DOOM and Duke Nukem fans shot up their high school in Columbine, Colorado, killing thirteen teachers and students and injuring another twenty-four. But that is a tragedy and a controversy for a much, much later article…

(Sources: the books Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, The Ultimate History of Video Games by Steven L. Kent, and Game Over: How Nintendo Conquered the World by David Sheff; Game Developer of September 1994, December 1994, August/September 1995, September 1997, and January 1998; Computer Gaming World of June 1994, December 1994, May 1996, and July 1999; Electronic Entertainment of November 1994 and January 1995; Mac Addict of January 1996; Sierra’s newsletter InterAction of Spring 1994; Washington Post of July 29 1994; the article “Regulating Violence in Video Games: Virtually Everything” by Alex Wilcox in the Journal of the National Association of Administrative Law Judiciary, Volume 31, Issue 1; the United States Senate Committee on the Judiciary’s publication Rating Video Games: A Parent’s Guide to Games; the 1994 episode of the television show Computer Chronicles entitled “Consumer Electronics Show.” Online sources include Blake J. Harris’s “Oral History of the ESRB” at VentureBeat and C-SPAN’s coverage of the Senate hearings of December 9 1993, March 4 1994, and July 29 1994.)


Origin Sells Out

One day in early June of 1992, a group of executives from Electronic Arts visited Origin Systems’s headquarters in Austin, Texas. If they had come from any other company, the rank and file at Origin might not have paid them much attention. As it was, though, the visit felt a bit like Saddam Hussein dropping in at George Bush’s White House for a fireside chat. For Origin and EA, you see, had a history.

Back in August of 1985, just prior to the release of Ultima IV, the much smaller Origin had signed a contract to piggyback on EA’s distribution network as an affiliated label. Eighteen months later, when EA released an otherwise unmemorable CRPG called Deathlord whose interface hewed a little too closely to that of an Ultima, a livid Richard Garriott attempted to pull Origin out of the agreement early. EA at first seemed prepared to crush Origin utterly in retribution by pulling at the legal seams in the two companies’ contract. Origin, however, found themselves a protector: Brøderbund Software, whose size and clout at the time were comparable to that of EA. At last, EA agreed to allow Origin to go their own way, albeit probably only after the smaller company paid them a modest settlement for breaking the contract. Origin quickly signed a new distribution contract with Brøderbund, which lasted until 1989, by which point they had become big enough in their own right to take over their own distribution.

But Richard Garriott wasn’t one to forgive even a small personal slight easily, much less a full-blown threat to destroy his company. From 1987 on, EA was Public Enemy #1 at Origin, a status which Garriott marked in ways that only seemed to grow pettier as time went on. Garriott built a mausoleum for “Pirt Snikwah” — the name of Trip Hawkins, EA’s founder and chief executive, spelled backward — at his Austin mansion of Britannia Manor. Ultima V's parser treated the phrase “Electronic Arts” like a curse word; Ultima VI included a gang of evil pirates named after some of the more prominent members of EA’s executive staff. Time really did seem to make Garriott more rather than less bitter. Among his relatively few detail-oriented contributions to Ultima VII were a set of infernal inter-dimensional generators whose shapes together formed the EA logo. He also demanded that the two villains who went on a murder spree across Britannia in that game be named Elizabeth and Abraham. Just to drive the point home, the pair worked for a “Destroyer of Worlds” — an inversion of Origin’s longstanding tagline of “We Create Worlds.”

And yet here the destroyers were, just two months after the release of Ultima VII, chatting amiably with their hosts while they gazed upon their surroundings with what seemed to some of Origin’s employees an ominously proprietorial air. Urgent speculation ran up and down the corridors: what the hell was going on? In response to the concerned inquiries of their employees, Origin’s management rushed to say that the two companies were merely discussing “some joint ventures in Sega Genesis development,” even though “they haven’t done a lot of cooperative projects in the past.” That was certainly putting a brave face on half a decade of character assassination!

What was really going on was, as the more astute employees at Origin could all too plainly sense, something far bigger than any mere “joint venture.” The fact was, Origin was in a serious financial bind — not a unique one in their evolving industry, but one which their unique circumstances had made more severe for them than for most others. Everyone in the industry, Origin included, was looking ahead to a very near future when the enormous storage capacity of CD-ROM, combined with improving graphics and sound and exploding numbers of computers in homes, would allow computer games to join television, movies, and music as a staple of mainstream entertainment rather than a niche hobby. Products suitable for this new world order needed to go into development now in order to be on store shelves to greet it when it arrived. These next-generation products with their vastly higher audiovisual standards couldn’t be funded entirely out of the proceeds from current games. They required alternative forms of financing.

For Origin, this issue, which really was well-nigh universal among their peers, was further complicated by the realities of being a relatively small company without a lot of product diversification. A few underwhelming attempts to bring older Ultima games to the Nintendo Entertainment System aside, they had no real presence on videogame consoles, a market which dwarfed that of computer games, and had just two viable product lines even on computers: Ultima and Wing Commander. This lack of diversification left them in a decidedly risky position, where the failure of a single major release in either of those franchises could conceivably bring down the whole company.

The previous year of 1991 had been a year of Wing Commander, when the second mainline title in that franchise, combined with ongoing strong sales of the first game and a series of expansion packs for both of them, had accounted for fully 90 percent of the black ink in Origin’s books. In this year of 1992, it was supposed to have been the other franchise’s turn to carry the company while Wing Commander retooled its technology for the future. But Ultima VII: The Black Gate, while it had been far from an outright commercial failure, had garnered a more muted response than Origin had hoped and planned for, plagued as its launch had been by bugs, high system requirements, and the sheer difficulty of configuring it to run properly under the inscrutable stewardship of MS-DOS.

Even more worrisome than all of the specific issues that dogged this latest Ultima was a more diffuse sort of ennui directed toward it by gamers — a sense that the traditional approach of Ultima in general, with its hundred-hour play time, its huge amounts of text, and its emphasis on scope and player freedom rather than multimedia set-pieces, was falling out of step with the times. Richard Garriott liked to joke that he had spent his whole career making the same game over and over — just making it better and bigger and more sophisticated each time out. It was beginning to seem to some at Origin that that progression might have reached its natural end point. Before EA ever entered the picture, a sense was dawning that Ultima VIII needed to go in another direction entirely — needed to be tighter, flashier, more focused, more in step with the new types of customers who were now beginning to buy computer games. Ultima Underworld, a real-time first-person spinoff of the core series developed by the Boston studio Blue Sky Productions rather than Origin themselves, had already gone a considerable distance in that direction, and upon its near-simultaneous release with Ultima VII had threatened to overshadow its more cerebral big brother completely, garnering more enthusiastic reviews and, eventually, higher sales. Needless to say, had Ultima Underworld not turned into such a success, Origin’s financial position would have been still more critical than it already was. It seemed pretty clear that this was the direction that all of Ultima needed to go.

But making a flashier next-generation Ultima VIII — not to mention the next-generation Wing Commander — would require more money than even Ultima VII and Ultima Underworld together were currently bringing in. And yet, frustratingly, Origin couldn’t seem to drum up much in the way of financing. Their home state of Texas was in the midst of an ugly series of savings-and-loan scandals that had made all of the local banks gun-shy; the country as a whole was going through a mild recession that wasn’t helping; would-be private investors could see all too clearly the risks associated with Origin’s non-diversified business model. As the vaguely disappointing reception for Ultima VII continued to make itself felt, the crisis began to feel increasingly existential. Origin had lots of technical and creative talent and two valuable properties — Wing Commander in particular was arguably still the hottest single name in computer gaming — but had too little capital and a nonexistent credit line. They were, in other words, classic candidates for acquisition.

It seems that the rapprochement between EA and Origin began at the Summer Consumer Electronics Show in Chicago at the very beginning of June of 1992, and, as evidenced by EA’s personal visit to Origin just a week or so later, proceeded rapidly from there. It would be interesting and perhaps a little amusing to learn how the rest of Origin’s management team coaxed Richard Garriott around to the idea of selling out to the company he had spent the last half-decade vilifying. But whatever tack they took, they obviously succeeded. At least a little bit of sugar was added to the bitter pill by the fact that Trip Hawkins, whom Garriott rightly or wrongly regarded as the worst of all the fiends at EA, had recently stepped down from his role in the company’s management to helm a new semi-subsidiary outfit known as 3DO. (“Had Trip still been there, there’s no way we would have gone with EA,” argues one former Origin staffer — but, then again, necessity can almost always make strange bedfellows.)

Likewise, we can only wonder what if anything EA’s negotiators saw fit to say to Origin generally and Garriott specifically about all of the personal attacks couched within the last few Ultima games. I rather suspect they said nothing; if there was one thing the supremely non-sentimental EA of this era had come to understand, it was that it seldom pays to make business personal.

Richard and Robert Garriott flank Stan McKee, Electronic Arts’s chief financial officer, as they toast the consummation of one of the more unexpected acquisitions in gaming history at EA’s headquarters in San Mateo, California.

So, the deal was finalized at EA’s headquarters in San Mateo, California, on September 25, 1992, in the form of a stock exchange worth $35 million. Both parties were polite enough to call it a merger rather than an acquisition, but it was painfully clear which one had the upper hand; EA, who were growing so fast they had just gone through a two-for-one stock split, now had annual revenues of $200 million, while Origin could boast of only $13 million. In a decision whose consequences remain with us to this day, Richard Garriott even agreed to sign over his personal copyrights to the Ultima franchise. In return, he became an EA vice president; his brother Robert, previously the chief executive in Austin, now had to settle for the title of the new EA subsidiary’s creative director.

From EA’s perspective, the deal got them Ultima, a franchise which was perhaps starting to feel a little over-exposed in the wake of a veritable flood of Origin product bearing the name, but one which nevertheless represented EA’s first viable CRPG franchise since the Bard’s Tale trilogy had concluded back in 1988. Much more importantly, though, it got them Wing Commander, in many ways the progenitor of the whole contemporary craze for multimedia “interactive movies”; it was a franchise which seemed immune to over-exposure. (Origin had amply proved this point by releasing two Wing Commander mainline games and four expansion packs in the last two years, plus a “Speech Accessory Pack” for Wing Commander II, all of which had sold very well indeed.)

As you do in these situations, both management teams promised the folks in Austin that nothing much would really change. “The key word is autonomy,” Origin’s executives said in their company’s internal newsletter. “Origin is supposed to operate independently from EA and maintain profitability.” But of course things did — had to — change. There was an inescapable power imbalance here, such that, while Origin’s management had to “consult” with EA when making decisions, their counterparts suffered no such obligation. And of course what might happen if Origin didn’t “maintain profitability” remained unspoken.

Thus most of the old guard at Origin would go on to remember September 25, 1992, as, if not quite the end of the old, freewheeling Origin Systems, at least the beginning of the end. Within six months, resentments against the mother ship’s overbearing ways were already building in such employees as an anonymous letter writer who asked his managers why they were “determined to eradicate the culture that makes Origin such a fun place to work.” Within a year, another was asking even more heatedly, “What happened to being a ‘wholly owned independent subsidiary of EA?’ When did EA start telling Origin what to do and when to do it? I thought Richard said we would remain independent and that EA wouldn’t touch us?!? Did I miss something here?” Eighteen months in, an executive assistant named Michelle Caddel, the very first new employee Origin had hired upon opening their Austin office in 1987, tried to make the best of the changes: “Although some of the warmth at Origin has disappeared with the merger, it still feels like a family.” For now, at any rate.

Perhaps tellingly, the person at Origin who seemed to thrive most under the new arrangement was one of the most widely disliked: Dallas Snell, the hard-driving production manager who was the father of a hundred exhausting crunch times, who tended to regard Origin’s games as commodities quantifiable in floppy disks and megabytes. Already by the time Origin had been an EA subsidiary for a year, he had managed to install himself at a place in the org chart that was for all practical purposes above that of even Richard and Robert Garriott: he was the only person in Austin who was a “direct report” to Bing Gordon, EA’s powerful head of development.

On the other hand, becoming a part of the growing EA empire also brought its share of advantages. The new parent company’s deep pockets meant that Origin could prepare in earnest for that anticipated future when games would sell more copies but would also require more money, time, and manpower to create. Thus almost immediately after closing the deal with EA, Origin closed another one, for a much larger office space which they moved into in January of 1993. Then they set about filling up the place; over the course of the next year, Origin would double in size, going from 200 to 400 employees.

The calm before the storm: the enormous cafeteria at Origin’s new digs awaits the first onslaught of hungry employees. Hopefully someone will scrounge up some tables and chairs before the big moment arrives…

And so the work of game development went on. When EA bought Origin, the latter naturally already had a number of products, large and small, in the pipeline. The first-ever expansion pack for an existing Ultima game — an idea borrowed from Wing Commander — was about to hit stores; Ultima VII: Forge of Virtue would prove a weirdly unambitious addition to a hugely ambitious game, offering only a single dungeon to explore that was more frustrating than fun. Scheduled for release in 1993 were Wing Commander: Academy, a similarly underwhelming re-purposing of Origin’s internal development tools into a public-facing “mission builder,” and Wing Commander: Privateer, which took the core engine and moved it into a free-roaming framework rather than a tightly scripted, heavily story-driven one; it thus became a sort of updated version of the legendary Elite, and, indeed, would succeed surprisingly well on those terms. And then there was also Ultima Underworld II: Labyrinth of Worlds, developed like its predecessor by Blue Sky up in Boston; it would prove a less compelling experience on the whole than Ultima Underworld I, being merely a bigger game rather than a better one, but it would be reasonably well-received by customers eager for more of the same.

Those, then, were the relatively modest projects. Origin’s two most expensive and ambitious games for the coming year consisted of yet one more from the Ultima franchise and one that was connected tangentially to Wing Commander. We’ll look at them a bit more closely, taking them one at a time.

The game which would be released under the long-winded title of Ultima VII Part Two: Serpent Isle had had a complicated gestation. It was conceived as Origin’s latest solution to a problem that had long bedeviled them: that of how to leverage their latest expensive Ultima engine for more than one game without violating the letter of a promise Richard Garriott had made more than a decade before to never use the same engine for two successive mainline Ultima games. Back when Ultima VI was the latest and greatest, Origin had tried reusing its engine in a pair of spinoffs called the Worlds of Ultima, which rather awkwardly shoehorned the player’s character from the main series — the “Avatar” — into plots and settings that otherwise had nothing to do with Richard Garriott’s fantasy world of Britannia. Those two games had drawn from early 20th-century science and adventure fiction rather than Renaissance Faire fantasy, and had actually turned out quite magnificently; they’re among the best games ever to bear the Ultima name in this humble critic’s opinion. But, sadly, they had sold like the proverbial space heaters in the Sahara. It seemed that Arthur Conan Doyle and Edgar Rice Burroughs were a bridge too far for fans raised on J.R.R. Tolkien and Lord British.

So, Origin adjusted their approach when thinking of ways to reuse the even more expensive Ultima VII engine. They conceived two projects. One would be somewhat in the spirit of Worlds of Ultima, but would stick closer to Britannia-style fantasy: called Arthurian Legends, it would draw from, as you might assume, the legends of King Arthur, a fairly natural thematic fit for a series whose creator liked to call himself “Lord British.” The other game, the first to go into production, would be a direct sequel to Ultima VII, following the Avatar as he pursued the Guardian, that “Destroyer of Worlds” from the first game, from Britannia to a new world. This game, then, was Serpent Isle. Originally, it was to have had a pirate theme, all fantastical derring-do on an oceanic world, with a voodoo-like magic system in keeping with Earthly legends of Caribbean piracy.

This piratey Serpent Isle was first assigned to Origin writer Jeff George, but he struggled to find ways to adapt the idea to the reality of the Ultima VII engine’s affordances. Finally, after spinning his wheels for some months, he left the company entirely. Warren Spector, who had become Origin’s resident specialist in Just Getting Things Done, then took over the project and radically revised it, dropping the pirate angle and changing the setting to one that was much more Britannia-like, right down to a set of towns each dedicated to one of a set of abstract virtues. Having thus become a less excitingly original concept but a more practical one from a development perspective, Serpent Isle started to make good progress under Spector’s steady hand. Meanwhile another small team started working up a script for Arthurian Legends, which was planned as the Ultima VII engine’s last hurrah.

Yet the somewhat muted response to the first Ultima VII threw a spanner in the works. Origin’s management team was suddenly second-guessing the entire philosophy on which their company had been built: “Do we still create worlds?” Arthurian Legends was starved of resources amidst this crisis of confidence, and finally cancelled in January of 1993. Writer and designer Sheri Graner Ray, one of only two people left on the project at the end, invests its cancellation with major symbolic importance:

I truly believe that on some level we knew that this was the death knell for Origin. It was the last of the truly grass-roots games in production there… the last one that was conceived, championed, and put into development purely by the actual developers, with no support or input from the executives. It was actually, kinda, the end of an era for the game industry in general, as it was also during this time that we were all adjusting to the very recent EA buyout of Origin.

Brian Martin, one of the last two developers remaining on the Arthurian Legends project, made this odd little memorial to it with the help of his partner Sheri Graner Ray after being informed by management that the project was to be cancelled entirely. Ray herself tells the story: “Before we left that night, Brian laid down in the common area that was right outside our office and I went around his body with masking tape… like a chalk line… we added the outline of a crown and the outline of a sword. We then draped our door in black cloth and put up a sign that said, ‘The King is Dead. Long live the King.’ …. and a very odd thing happened. The next morning when we arrived, there were flowers by the outline. As the day wore on more flowers arrived.. and a candle.. and some coins were put on the eyes… and a poem arrived… it was uncanny. This went on for several days with the altar growing more and more. Finally, we were told we had to take it down, because there was a press junket coming through and they didn’t want the press seeing it.”

Serpent Isle, on the other hand, was too far along by the time the verdict was in on the first Ultima VII to make a cancellation realistic. It would instead go down in the recollection of most hardcore CRPG fans as the last “real” Ultima, the capstone to the process of evolution a young Richard Garriott had set in motion back in 1980 with a primitive BASIC game called Akalabeth. And yet the fact remains that it could have been so, so much better, had it only caught Origin at a less uncertain, more confident time.

Serpent Isle lacks the refreshingly original settings of the two Worlds of Ultima games, as it does the surprisingly fine writing of the first Ultima VII; Raymond Benson, the head writer on the latter project, worked on Serpent Isle only briefly before decamping to join MicroProse Software. In compensation, though, Serpent Isle is arguably a better game than its predecessor through the first 65 percent or so of its immense length. Ultima VII: The Black Gate can at times feel like the world’s most elaborate high-fantasy walking simulator; you really do spend most of your time just walking around and talking to people, an exercise that’s made rewarding only by the superb writing. Serpent Isle, by contrast, is full to bursting with actual things to do: puzzles to solve, dungeons to explore, quests to fulfill. It stretches its engine in all sorts of unexpected and wonderfully hands-on directions. Halfway in, it seems well on its way to being one of the best Ultima games of all, as fine a sendoff as any venerable series could hope for.

In the end, though, its strengths were all undone by Origin’s crisis of faith in the traditional Ultima concept. Determined to get its sales onto the books of what had been a rather lukewarm fiscal year and to wash their hands of the past it now represented, management demanded that it go out on March 25, 1993, the last day of said year. As a result, the last third or so of Serpent Isle is painfully, obviously unfinished. Conversations become threadbare, plot lines are left to dangle, side quests disappear, and bugs start to sprout up everywhere you look. As the fiction becomes a thinner and thinner veneer pasted over the mechanical nuts and bolts of the design, solubility falls by the wayside. By the end, you’re wandering through a maze of obscure plot triggers that have no logical connection with the events they cause, making a walkthrough a virtual necessity. It’s a downright sad thing to have to witness. Had its team only been allowed another three or four months to finish the job, Serpent Isle could have been not only a great final old-school Ultima but one of the best CRPGs of any type that I’ve ever played, a surefire entrant in my personal gaming hall of fame. As it is, though, it’s a bitter failure, arguably the most heartbreaking one of Warren Spector’s storied career.

Unfashionable though such an approach was in 1993, almost all of the Serpent Isle team’s energy went into gameplay and script rather than multimedia assets; the game looks virtually identical to the first Ultima VII. An exception is the frozen northlands which you visit later in the game. Unfortunately, the change in scenery comes about the time that the design slowly begins to fall apart.

And there was to be one final note of cutting irony in all of this: Serpent Isle, which Origin released without a lot of faith in its commercial potential, garnered a surprisingly warm reception among critics and fans alike, and wound up selling almost as well as the first Ultima VII. Indeed, it performed so well that the subject of doing “more games in that vein,” in addition to or even instead of a more streamlined Ultima VIII, was briefly discussed at Origin. As things transpired, though, its success led only to an expansion pack called The Silver Seed before the end of the year; this modest effort became the true swansong for the Ultima VII engine, as well as the whole era of the 100-hour-plus, exploration-focused, free-form single-player CRPG at Origin in general. The very philosophy that had spawned the company, that had been at the core of its identity for the first decade of its existence, was fading into history. Warren Spector would later have this to say in reference to a period during which practical commercial concerns strangled the last shreds of idealism at Origin:

There’s no doubt RPGs were out of favor by the mid-90s. No doubt at all. People didn’t seem to want fantasy stories or post-apocalypse stories anymore. They certainly didn’t want isometric, 100 hour fantasy or post-apocalypse stories, that’s for sure! I couldn’t say why it happened, but it did. Everyone was jumping on the CD craze – it was all cinematic games and high-end-graphics puzzle games… That was a tough time for me – I mean, picture yourself sitting in a meeting with a bunch of execs, trying to convince them to do all sorts of cool games and being told, “Warren, you’re not allowed to say the word ‘story’ any more.” Talk about a slap in the face, a bucket of cold water, a dose of reality.

If you ask me, the reason it all happened was that we assumed our audience wanted 100 hours of play and didn’t care much about graphics. Even high-end RPGs were pretty plain-jane next to things like Myst and even our own Wing Commander series. I think we fell behind our audience in terms of the sophistication they expected and we catered too much to the hardcore fans. That can work when you’re spending hundreds of thousands of dollars – even a few million – but when games start costing many millions, you just can’t make them for a relatively small audience of fans.

If Serpent Isle and its expansion were the last gasps of the Origin Systems that had been, the company’s other huge game of 1993 was every inch a product of the new Origin that had begun to take shape following the worldwide success of the first Wing Commander game. Chris Roberts, the father of Wing Commander, had been working on something called Strike Commander ever since late 1990, leaving Wing Commander II and all of the expansion packs and other spinoffs in the hands of other Origin staffers. The new game took the basic idea of the old — that of an action-oriented vehicular simulator with a strong story, told largely via between-mission dialog scenes — and moved it from the outer space of the far future to an Earth of a very near future, where the international order has broken down and mercenaries battle for control over the planet’s dwindling resources. You take to the skies in an F-16 as one of the mercenaries — one of the good ones, naturally.

Origin and Chris Roberts pulled out all the stops to make Strike Commander an audiovisual showcase; the game’s gestation time of two and a half years, absurdly long by the standards of the early 1990s, was a product of Roberts constantly updating his engine to take advantage of the latest cutting-edge hardware. The old Wing Commander engine was starting to look pretty long in the tooth by the end of 1992, so this new engine, which replaced its predecessor’s scaled sprites with true polygonal 3D graphics, was more than welcome. There’s no point in putting a modest face on it: Strike Commander looked downright spectacular in comparison with any other flight simulator on offer at the time. It was widely expected, both inside and outside of Origin, to become the company’s biggest game ever. In fact, it became the first Origin game to go gold in the United States — 100,000 copies sold to retail — before it had actually shipped there, thanks to the magic of pre-orders. Meanwhile, European pre-orders topped 50,000, an all-time record for EA’s British subsidiary. All in all, more than 1.1 million Strike Commander floppy disks — 30 tons’ worth of plastic, metal, and iron oxide — were duplicated before a single unit was sold. Why not? This game was a sure thing.

The hype around Strike Commander was inescapable for months prior to its release. At the European Computer Trade Show in London, the last big event before the release, Origin put together a mock-up of an airplane hangar. Those lucky people who managed to seize control for a few minutes got to play the game from behind a nose cowl and instrument panel. What Origin didn’t tell you was that the computer hidden away underneath all the window dressing was almost certainly much, much more powerful than the one you had at home.

Alas, pride goeth before a fall. Just a couple of weeks after Strike Commander‘s worldwide release on April 23, 1993, Origin had to admit to themselves in their internal newsletter that sales from retail to actual end users were “slower than expected.” Consumers clearly weren’t as enamored with the change in setting as Origin and just about everyone else in their industry had assumed they would be. Transporting the Wing Commander formula into a reasonably identifiable version of the real world somehow made the story, which hovered as usual in some liminal space between comic book and soap opera, seem rather more than less ludicrous. At the same time, the use of an F-16 in place of a made-up star fighter, combined with the game’s superficial resemblance to the hardcore flight simulators of the day, raised expectations among some players which the game had never really been designed to meet. The editors of Origin’s newsletter complained, a little petulantly, about this group of sim jockeys who were “ready for a cockpit that had every gauge, altimeter, dial, and soft-drink holder in its proper place. This is basically the group which wouldn’t be happy unless you needed the $35 million worth of training the Air Force provides just to get the thing off the ground.” There were advantages, Origin was belatedly learning, to “simulating” a vehicle that had no basis in reality, as there were to fictions similarly divorced from the real world. In hitting so much closer to home, Strike Commander lost a lot of what had made Wing Commander so appealing.

The new game’s other problem was more immediate and practical: almost no one could run the darn thing well enough to actually have the experience Chris Roberts had intended it to be. Ever since Origin had abandoned the Apple II to make MS-DOS their primary development platform at the end of the 1980s, they’d had a reputation for pushing the latest hardware to its limit. This game, though, was something else entirely, even by their standards. The box’s claim that it would run on an 80386 was a polite fiction at best; in reality, you needed an 80486, and one of the fastest ones at that — running at least at 50 MHz or, better yet, 66 MHz — if you wished to see anything like the silky-smooth visuals that Origin had been showing off so proudly at recent trade shows. Even Origin had to admit in their newsletter that customers had been “stunned” by the hardware Strike Commander craved. Pushed along by the kid-in-a-candy-store enthusiasm of Chris Roberts, who never had a passing fancy he didn’t want to rush right out and implement, they had badly overshot the current state of computing hardware.

Of course, said state was always evolving; it was on this fact that Origin now had to pin whatever diminished hopes they still had for Strike Commander. The talk of the hardware industry at the time was Intel’s new fifth-generation microprocessor, which abandoned the “x86” nomenclature in favor of the snazzy new focus-tested name of Pentium, another sign of how personal computers were continuing their steady march from being tools of businesspeople and obsessions of nerdy hobbyists into mainstream consumer-electronics products. Origin struck a promotional deal with Compaq Computers in nearby Houston, who, following what had become something of a tradition for them, were about to release the first mass-market desktop computer to be built around this latest Intel marvel. Compaq placed the showpiece that was Strike Commander-on-a-Pentium front and center at the big PC Expo corporate trade show that summer of 1993, causing quite a stir at an event that usually scoffed at games. “The fuse has only been lit,” went Origin’s cautiously optimistic new company line on Strike Commander, “and it looks to be a long and steady burn.”

But time would prove this optimism as well to be somewhat misplaced: one of those flashy new Compaq Pentium machines cost $7000 in its most minimalist configuration that summer. By the time prices had come down enough to make a Pentium affordable for gamers without an absurd amount of disposable income, other games with even more impressive audiovisuals would be available for showing off their hardware. Near the end of the year, Origin released an expansion pack for Strike Commander that had long been in the development pipeline, but that would be that: there would be no Strike Commander II. Chris Roberts turned his attention instead to Wing Commander III, which would raise the bar on development budget and multimedia ambition to truly unprecedented heights, not only for Origin but for their industry at large. After all, Wing Commander: Academy and Privateer, both of which had had a fraction of the development budget of Strike Commander but wound up selling just as well, proved that there was still a loyal, bankable audience out there for the core series.

Origin had good reason to play it safe now in this respect and others. When the one-year anniversary of the acquisition arrived, the accountants had to reveal to EA that their new subsidiary had done no more than break even so far. By most standards, it hadn’t been a terrible year at all: Ultima Underworld II, Serpent Isle, Wing Commander: Academy, and Wing Commander: Privateer had all more or less made money, and even Strike Commander wasn’t yet so badly underwater that all hope was lost on that front. But on the other hand, none of these games had turned into a breakout hit in the fashion of the first two Wing Commander games, even as the new facilities, new employees, and new titles going into development had cost plenty. EA was already beginning to voice some skepticism about some of Origin’s recent decisions. The crew in Austin really, really needed a home run rather than more base hits if they hoped to maintain their status in the industry and get back into their overlord’s good graces. Clearly 1994, which would feature a new mainline entry in both of Origin’s core properties for the first time since Ultima VI had dropped and Wing Commander mania had begun back in 1990, would be a pivotal year. Origin’s future was riding now on Ultima VIII and Wing Commander III.

(Sources: the book Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland; Origin’s internal newsletter Point of Origin from March 13 1992, June 19 1992, July 31 1992, September 25 1992, October 23 1992, November 6 1992, December 4 1992, December 18 1992, January 29 1993, February 12 1993, February 26 1993, March 26 1993, April 9 1993, April 23 1993, May 7 1993, May 21 1993, June 18 1993, July 2 1993, August 27 1993, September 10 1993, October 13 1993, October 22 1993, November 8 1993, and December 1993; Questbusters of April 1986 and July 1987; Computer Gaming World of October 1992 and August 1993. Online sources include “The Conquest of Origin” at The Escapist, “The Stars His Destination: Chris Roberts from Origin to Star Citizen” at US Gamer, Sheri Graner Ray’s blog entry “20 Years and Counting — Origin Systems,” and an interview with Warren Spector at RPG Codex.

All of the Origin games mentioned in this article are available for digital purchase at GOG.com.)


Posted by on September 6, 2019 in Digital Antiquaria, Interactive Fiction


The Designer’s Designer

Dan Bunten delivers the keynote at the 1990 Game Developers Conference.

Dan Bunten and his little company Ozark Softscape could look back on a tremendous 1984 as that year came to an end. Seven Cities of Gold had been a huge success, Electronic Arts’s biggest game of the year, doing much to keep the struggling publisher out of bankruptcy court by selling well over 100,000 copies. Bunten himself had become one of the most sought-after interviewees in the industry. Everyone who got the chance to speak with him seemed to agree that Seven Cities of Gold was only the beginning, that he was destined for even greater success.

As it turned out, though, 1984 would be the high-water mark for Bunten, at least in terms of that grubbiest but most implacable metric of success in games: quantity of units shifted. The years that followed would be frustrating as often as they would be inspiring, as Bunten pursued a vision that seemed at odds with every trend in the industry, all the while trying to thread the needle between artistic fulfillment and commercial considerations.


In the wake of Seven Cities of Gold‘s success, EA badly wanted a follow-up with a similar theme, so much so that they offered Bunten a personal bonus of $5000 to make it Ozark’s next project. The result was Heart of Africa, a game which at first glance looks like precisely the sequel EA was asking for but that actually plays quite differently. Instead of exploring the Americas as Hernán Cortés during the early 1500s, it has you exploring Africa as an intrepid Victorian adventurer (“Dr. Livingstone, I presume?”). In keeping with the changed time and location, your goal isn’t to conquer the land for your country — Africa had, for better or for worse, already been thoroughly partitioned among the European nations by 1890, the year in which the game takes place — but simply to discover and to map. In the best tradition of Victorian adventure novels like King Solomon’s Mines, your ultimate goal is to find the tomb of a mythical Egyptian pharaoh. Bunten later admitted that the differences from Heart of Africa‘s predecessor weren’t so much a product of original design intent as improvisation after he had bumbled into an historical context that just wouldn’t work as a more faithful sequel.

Indeed, Bunten in later years dismissed Heart of Africa, his most adventure-like game ever and his last ever that was single-player only, as nothing more than “a game done to please EA”: “I honestly didn’t want to do the project.” Its biggest problem stems from the fact that its environment is randomly generated each time you start a new game, itself an attempt to remedy the most obvious failing of adventure games as a commercial proposition: their lack of replayability. Yet the random maps can never live up to what a hand-crafted map, designed for challenge and dramatic effect, might have been; the “story” in Heart of Africa is all too clearly just a bunch of shifting interchangeable parts. Bunten later acknowledged that “the attempt to make a replayable adventure game made for a shallow product (which seems true in every other case designers have tried it as well). I guess that if elements are such that they can be randomly shifted then they [aren’t] substantive enough to make for a compelling game. So, even though I don’t like linear games, they seem necessary to have the depth a good story needs.”

Heart of Africa did quite well for EA upon its release in 1985 — well enough, in fact, to become Bunten’s third most successful game of all time. Yet the whole experience left a bad taste in his mouth. He came away from the project determined to return to the guiding vision behind his first game for EA, the commercially unsuccessful but absolutely brilliant M.U.L.E.: a vision of computer games that people played together rather than alone. In the future, he would continue to compromise at times on the style and subject matter of his games in order to sell them to his publishers, but he would never again back away from his one great principle. All of his games henceforward would be multiplayer — first, foremost, and in one case exclusively. In fact, that one case would be his very next game.

The success of his previous two games having opened something of a window of opportunity with EA, Bunten charged ahead on what he would later describe as his single “most experimental game.” Robot Rascals is a multiplayer scavenger hunt in which two physical decks of cards are integral to the game. Each player controls a robot, and must use it to collect the four items shown on the cards in her hand and return with them to home base in order to win. The game lives on the razor’s edge of pure chaos, the product both of random events generated by the computer and of a second deck of cards — the “specials” — which among other things can force players to draw new item cards, trash their old cards, or trade cards among one another; thus everyone’s goals are shifting almost constantly. As always in a Dan Bunten game, there are lots of thoughtful features here, from ways to handicap the game for players of different ages or skill levels to three selectable levels of overall complexity. He designed it to be “a game that anyone could play” rather than one limited to “special-interest groups like role-playing people or history buffs.” It can be a lot of fun, even if it’s not quite on the level of M.U.L.E. (then again, what is, right?). But this latest bid to make computer games acceptable family entertainment wound up selling hardly at all upon its release in 1986, ending Bunten’s two-game commercial hot streak.

By this point in Bunten’s career, changes in his personal life were beginning to have a major impact on the games he made. In 1985, while still working on Heart of Africa, he had divorced his second wife and married his third, with all the painful complications such disruptions entail when one is leaving children behind with the former spouse. In 1986, he and his new wife moved from Little Rock, Arkansas, to Hattiesburg, Mississippi, so she could complete a PhD. This event marked the effective end of Ozark Softscape as anything but a euphemism for Dan Bunten himself and whatever programmers and artists he happened to contract work out to. The happy little communal house/office where Dan and Bill Bunten, Jim Rushing, and Alan Watson had created games, with a neighborhood full of eager testers constantly streaming through the living room, was no more; only Watson continued to work on Bunten’s games from Robot Rascals on, and then more as just another hired programmer than a valued design voice. Even after moving back to Little Rock in 1988, Bunten would never be able to recapture the communal alchemy of 1982 to 1985.

Coupled with these changes were other, still more ominous ones in Dan Bunten himself. Those who knew him during these years generally refer only vaguely to his “problems,” and this discretion of course does them credit; I too have no desire to psychoanalyze the man. What does seem clear, however, is that he was growing increasingly unhappy as time wore on. He became more demanding of his colleagues, difficult enough to work with that many of them decided it just wasn’t worth it, even as he became more erratic in his own habits, perhaps due to an alcohol intake that struck many as alarming.

Yet Bunten was nothing if not an enigmatic personality. At the same time that close friends were worrying about his moodiness and his drinking, he could show up someplace like the Computer Game Developers Conference and electrify the attendees with his energy and ideas. Certainly his eyes could still light up when he talked about the games he was making and wanted to make. The worrisome questions were how much longer he would be allowed to make those games in light of their often meager sales, and, even more pressingly, why his eyes didn’t seem to light up about much else in his life anymore.

But, to return to the firmer ground of the actual games he was continuing to make: Modem Wars, his next one, marked the beginning of a new chapter in his tireless quest to get people playing computer games together. “We’ve failed at gathering people around the computer,” Bunten said before starting work on it. “We’re going to have to connect them out of the back by connecting their computers to each other.” He would make, in other words, a game played by two people on two separate computers, connected via modem.

Modem Wars was known as Sport of War until just prior to its release by EA in 1988, and in many ways that was a better title. Its premise is a new version of Bunten’s favorite sport of football, played not by individual athletes but by infantry, artillery, and even aircraft, if you can imagine such a thing. One might call it a mashup between two of his early designs for SSI: the strategic football simulator Computer Quarterback and the proto-real-time-strategy game Cytron Masters.

It’s the latter aspect that makes Modem Wars especially visionary. The game was nothing less than an online real-time-strategy death match years before the world had heard of such a thing. While a rudimentary artificial intelligence was provided for single-player play, it was made clear by the game’s very title that this was strictly a tool for learning to play rather than the real point of the endeavor. Daniel Hockman’s review of Modem Wars for Computer Gaming World ironically describes the qualities of online real-time strategy as a potential “problem” and “marketing weakness” — the very same qualities which a later generation would take as the genre’s main attractions:

A sizable number of gamers are not used to thinking in real-time situations. They can spend hours ordering tens of thousands of men into mortal combat, but they wimp out when they have to think under fire. They want to play chess instead of speed chess. They want to analyze instead of act. As the enemy drones zero in on their comcen, they throw up their hands in frustration when it’s knocked out before they can extract themselves from the maelstrom of fire that has engulfed them.

Whether because gamers really were daunted by this need to think on their feet or, more likely, because of the relative dearth of fast modems and stable online connections in 1988, Modem Wars became another crushing commercial disappointment for Bunten. EA declared themselves “hesitant” to keep pursuing this direction in the wake of the game’s failure. Rather than causing Bunten to turn away from multiplayer gaming, this loss of faith caused him to turn away from EA.

In the summer of 1989, MicroProse Software announced that they had signed a five-year agreement with Bunten, giving them first rights to all of the games he made during that period. The great hidden driver behind the agreement was MicroProse’s own star designer Sid Meier, who had never hidden his enormous admiration for Bunten’s work. Bunten doubtless hoped that a new, more supportive publisher would mark the beginning of a new, more commercially successful era in his career. And in the beginning at least, such optimism would, for once, prove well-founded.

Known at first simply as War!, then as War Room, and finally as Command H.Q., Bunten’s first game for MicroProse was aptly described by its designer as being akin to an abstract, casual board game of military strategy, like Risk or Axis & Allies. The big wrinkle was that this beer-and-pretzels game was to be played in real time rather than turns. But, perhaps in response to complaints about his previous game like those voiced by Daniel Hockman above, the pace is generally far less frenetic this time around. Not only can the player select an overall speed, but the program itself actually takes charge to speed up the action when not much is happening and slow it down when things heat up. Although a computer opponent is provided, the designer’s real focus was once more on modem-to-modem play.

But, whatever its designer’s preferences, MicroProse notably de-emphasized the multiplayer component in their advertising upon Command H.Q.‘s release in 1990, and this, combined with a more credible artificial intelligence for the computer opponent, gave it more appeal to the traditional wargame crowd than Modem Wars had demonstrated. Ditto a fair measure of evangelizing done by Computer Gaming World, with whom Bunten had always had a warm relationship, having even authored a regular column there for a few years in the mid-1980s. The magazine’s lengthy review concluded by saying, “This is the game we’ve all been waiting for”; they went on to publish two more lengthy articles on Command H.Q. strategy, and made it their “Wargame of the Year” for 1990. For all these reasons, Command H.Q. sold considerably better than had Bunten’s last couple of games; one report places its total sales at around 75,000 units, enough to make it his second most successful game ever.

With that to buoy his spirits, Bunten made big plans for his next game, Global Conquest. “Think of it as Command H.Q. meets Seven Cities of Gold meets M.U.L.E.,” he said. Drawing heavily from Command H.Q. in particular, as well as the old grand-strategy classic Empire, he aimed to make a globe-spanning strategy game where economics would be as important as military maneuvers. He put together a large and vocal group of play testers on CompuServe, and tried to incorporate as many of their suggestions as possible, via a huge options panel that allowed players to customize virtually every aspect of the game, from the rules themselves to the geography and topography of the planet they were fighting over, all the way down to the look of the icons representing the individual units. This time, up to four humans could play against one another in a variety of ways: they could all play together by taking turns on one computer, or they could each play on their own computer via a local-area network, or four players could share two computers that were connected via modem. The game was turn-based, but with an interesting twist designed to eliminate analysis paralysis: when the first player mashed the “next turn” button, everyone else had just twenty seconds to finish up their own turns before the execution phase began.

In later years, Dan Bunten himself had little good to say about what would turn out to be his last boxed game. In fact, he called it his absolute “worst game” of all the ones he had made. While play-testing in general is a wonderful thing, and every designer should do as much of it as possible, a designer also needs to keep his own vision for what kind of game he wants to make at the forefront. In the face of prominent-in-their-own-right, opinionated testers like Computer Gaming World‘s longtime wargame scribe Alan Emrich, Bunten failed to do this, and wound up creating not so much a single coherent strategy game as a sort of strategy-game construction set that baffled more than it delighted. “This game was a hodgepodge rather than an integration,” he admitted several years later. “It was just the opposite of the KISS doctrine. It was a kitchen-sink design. It had everything. Build your own game by struggling through several options menus.” He acknowledged as well that the mounting unhappiness in his personal life, which had now led to a divorce from his third wife, was making it harder and harder to do good work.

Released in 1992, Global Conquest under-performed commercially as well. In addition to the game’s intrinsic failings, it didn’t help matters that MicroProse had just five months prior released Sid Meier’s Civilization, another exercise in turn-based grand strategy on a global scale, also heavily influenced by Empire, that managed to be far more thematically and texturally ambitious while remaining more focused and playable as a game — albeit without the multiplayer element that was so important to Bunten.

But of course, there’s more to a game than whether it’s played by one person or more than one, and it strikes me as reasonable to question whether Bunten was beginning to lose his way as a designer in other respects even as he stuck so obstinately to his multiplayer guns. Setting aside their individual strengths and failings, the final three boxed games of Bunten’s career, with their focus on “wars” and “command” and “conquest,” can feel a little disheartening when compared to what came before. Games like M.U.L.E., Robot Rascals, and to some extent even Seven Cities of Gold and Heart of Africa had a different, friendlier, more welcoming personality. This last, more militaristic trio feels like a compromise, the product of a Dan Bunten who said that, if he couldn’t bring multiplayer gaming to the masses, he would settle for the grognard crowd, indulging their love for guns and tanks and bombs. So be it. Now, though, he was about to give that same crowd the shock of their lives.

In November of 1992, just months after completing the supremely masculine wargame Global Conquest, Dan Bunten had sexual-reassignment surgery, becoming the woman Danielle “Dani” Bunten Berry. (For continuity’s sake, I’ll generally continue to refer to her by the shorthand of “Bunten” rather than “Berry” for the remainder of this article.) It’s not for us to speculate about the personal trauma that must have accompanied such a momentous decision. What we can and should take note of, however, is that it was an unbelievably brave decision. For all that we still have a long way to go today when it comes to giving transsexuals the rights and respect they deserve, the early 1990s were a far less enlightened time than even our own on this issue. And it wasn’t as if Bunten could take comfort in the anything-goes anonymity of a New York City or San Francisco.  Dan Bunten had lived, and as Dani Bunten now continued to live, in the intensely conservative small-town atmosphere of Little Rock, Arkansas. Many of those closest to her disowned her, including her mother and her ex-wives, making it heartbreakingly difficult for her to maintain a relationship with her children. She had remained in Little Rock all these years, at no small cost to her career prospects, largely because of these ties of blood, which she had believed to be indissoluble. This rejection, then, must have felt like the bitterest of betrayals.

Dan Bunten with his beverage of choice.

The games industry as well, with its big-breasted damsels in distress and its machine-gun-toting male heroes, wasn’t exactly notable for its enlightened attitudes toward sex and gender. Many of Bunten’s old friends and colleagues would see her for the first time after her surgery and convalescence at the Game Developers Conference scheduled for April of 1993, and they looked forward to that event with almost as much trepidation as Bunten herself must have felt. It was all just so very unexpected. To whatever extent they had carried around a mental image of a man who would choose to become a woman, Dan Bunten didn’t fit the profile at all. He had been the games industry’s own Ozark Mountains boy, a true son of the South, always ready with his “folksy mountain humor” (read, “dirty jokes”). His rangy frame stood six feet two inches tall. He loved nothing more than a rough-and-tumble game of back-lot football, unless it be beer and poker afterward. As his three ex-wives and three children attested, he had certainly seemed to like women, but no one had ever imagined that he liked them enough to want to be one. What were they supposed to say to him — er, to her — now?

They needn’t have worried. Dani Bunten handled her coming-out party with the same low-key grace and humor she would display for the rest of her life as a woman. She said that she had made the switch to do her part to redress the gender imbalance inside the industry, and to help improve the aesthetics of game designers to match the improving aesthetics of their games. The tension dissipated, and soon everyone got into the spirit of the thing. A straw poll named Dani Bunten the game designer most likely to appear on the Oprah Winfrey Show. A designer named Gordon Walton had a typical experience: “I was put off when she made the change to become Dani, until the minute I spoke to her. It was clear to me she was much happier as Dani, and if anything an even more incredible person.” Another GDC regular remembered the “unhappy man” from the 1992 event, “sitting on the hallway floor drinking and smoking,” and contrasted him with the “happy woman” he now saw.

No one with any interest in the inner workings of those strangest of creatures, their fellow humans, could fail to be fascinated by Bunten’s dispatches from both sides of the gender divide. “Aren’t there things you’ve always wanted to know about women but were afraid to ask?” she said. “Well, now’s your chance!”

I had to learn a lot to actually “count” as a woman! I had to learn how to walk, speak, dress as a woman. Those little things which are necessary so that other people don’t [feel] alienated. There’s a little summary someone gave me to make clear what being a woman means: as a woman you have to sing when you speak, dance when you walk, and you have to open your heart… I know how stereotypical that sounds, but it is true! Speech for a man is something completely different: the melody of speech is fast, monotone, and decreases at the end of a sentence. Sometimes, this still happens to me, and people are always irritated. Female speech is a little bit like song – we have a lot more melody and different speech patterns. Walking is really a bit like dancing: slower and connected, with a lot of subtle movements. I enjoyed it at once.

She had few filters when talking about the nitty-gritty details:

One of the saddest changes I had to deal with after my operation was the fact that I couldn’t aim anymore when urinating. Boys — I have two little sons and a daughter — simply love to aim.

Bunten said that, in keeping with her new identity, she didn’t feel much desire to design any more wargames; this led to the end of her arrangement with MicroProse. By way of compensation, Electronic Arts that year released a nicely done “commemorative edition” of Seven Cities of Gold, complete with dramatically upgraded graphics and sound to suit the times. Bunten had little to nothing to do with the project, but it sold fairly well, and perhaps helped to remind her of her roots.

In the same spirit, Bunten’s first real project after her transformation became a new version of M.U.L.E. EA’s founder Trip Hawkins had always named that game as one of his all-time favorites, and had frequently stated how disappointed he was that it had never gotten the attention it deserved. Now, Hawkins had left his day-to-day management role at EA to run 3DO, a spin-off company peddling a multimedia set-top box for the living room. Hawkins thought M.U.L.E. would be perfect for the platform, and recruited Bunten to make it happen. It was a dream project; showing excellent taste, she still regarded M.U.L.E. as the best thing she had ever done. But the dream quickly began to sour.

3DO first requested that, instead of taking turns managing their properties on the map, players all be allowed to do so simultaneously. Bunten somewhat reluctantly agreed. And then:

As soon as I added the simultaneity, it instantly put into their heads, “Why can’t we shoot at each other?” And I said, “No guns.” And they said, “What about bombs? Can we drop a bomb in front of you? It won’t hurt you. It will be a cartoon thing, it will just slow you down.” And I said, “You don’t get it. It’s changing the whole notion of how this thing works!”

[3DO is] staking its future on the idea of a new generation of hardware and therefore, you’d assume, a new generation of software, but they said, “No, our market is still 18 to 35, male. We need something with action, something with intensity.” Chrome and sizzle. Ugh.

In the end, Bunten walked out, disappointed enough that she seriously considered getting out of games altogether, going so far as to apply for jobs as the industrial engineer Dan Bunten had once been before his first personal computer came along.

Instead she found a role with a new company called Mpath as a design and strategy consultant. The goal of that venture was to bring multiplayer gaming to the new frontier of the World Wide Web, and its founders included her fellow game designer Brian Moriarty, of Infocom and LucasArts fame. She also studied the elusive concept of “games for girls” in association with a think tank set up by Microsoft co-founder Paul Allen; some of her proposals would later come to market as the products of Purple Moon, Brenda Laurel’s short-lived but important publisher of games for girls aged 8 to 14.

Offers to do conventional boxed games as sole designer, however, weren’t forthcoming; how much that was down to lingering personal prejudices against her for her changed sex and how much to the fact that the games she wanted to make just weren’t considered commercially viable must always be open for debate. Refusing as usual to be a victim, Bunten said that her “priorities had shifted” since her change anyway: “I don’t identify myself with the job as strongly as before.” Deciding that, for her, heaven was other people after a life spent programming computers, she devoured anthropology texts and riffed on Carl Jung’s theories of a collective unconscious. “Literature, anthropology, and even dance,” she noted, “have a good deal more to teach designers about human drives and abilities than the technologists of either end of California, who know silicon and celluloid but not much else.” So, she bided her time as a designer, waiting for a more inclusive ludic future to arrive. At the 1997 GDC, she described a prescient vision of “small creative shops” freed from the inherent conservatism of the “distribution trap” by the magic of the Internet.

That future would indeed come to pass — but, sadly, not in time for Dani Bunten Berry to see it. Shortly after delivering that speech, she went to see her doctor about a persistent cough, whereupon she was diagnosed with an advanced case of lung cancer. In one of those cruel ironies which always seem to dog the lives of us poor mortals, she had finally kicked a lifelong habit of heavy smoking just a few months before.

She appeared in public for the last time in May of 1998. The occasion was, once again, the Game Developers Conference, where she had always shone so. She struggled audibly for breath as she gave the last presentation of her life, entitled “Do Online Games Still Suck?,” but her passion carried her through. At the end of the conference, at a special ceremony held aboard the Queen Mary in Long Beach Harbor, she was presented with the first ever GDC Lifetime Achievement Award. The master of ceremonies for that evening was her friend and colleague Brian Moriarty, who knew, like everyone else in attendance, that the end was near. He closed his heartfelt tribute thus:

It is no exaggeration to characterize tonight’s honoree as the world’s foremost authority on multiplayer computer games. Nobody has worked harder to demonstrate how technology can be used to realize one of the noblest of human endeavors: bringing people together. Historians of electronic gaming will find in these eleven boxes the prototypes of the defining art form of the 21st century.

As one of those historians, I can only heartily concur with his assessment.

It would be nice to say that Dani Bunten passed peacefully to her rest. But, as anyone with any experience with lung cancer will recognize, that just isn’t how the disease works. Throughout her life, she had done nothing the easy way, and her death — ugly, painful, and slow — was no exception. On the brighter side, she did reconcile to some extent with her mother and other family members and friends who had rejected her. The end came on July 3, 1998. Rather incredibly in light of the prodigious, multifaceted life she had lived, she was just 49 years old.

It’s a life which resists pigeonholing or sloganeering. Bunten herself explicitly rejected the role of transgender advocate, inside or outside of the games industry. Near the end of her life, she expressed regret for her decision to change her physical sex, saying she could have found ways to live in a more gender-fluid way without taking such a drastic step. Whether this was a reasoned evaluation or a product of the pain and trauma of terminal illness must remain, like so much else about her, an enigma.

What is clear, however, is that Bunten, through the grace and humor with which she handled her transition and through her refusal to go away and hide thereafter as some might have wished, taught others in the games industry who were struggling with similar issues of identity that a new gender need not mean a decisive break with every aspect of one’s past — that a prior life in games could continue to be a life in games even with a different pronoun attached. She did this in a quieter way than the speechifying some might have wished for from her, but, nevertheless, do it she did. Jessica Mulligan, who transitioned from male to female a few years after her, remembers meeting Bunten shortly before her own sexual-reassignment surgery, hoping to hear some “profound words on The Transition”: “While I was looking for spiritual guidance, she was telling me where to shop for shoes. Talk about keeping someone honest! Every change in our personal lives is profound to us. You still have to pay attention to the nuts and bolts or the change is meaningless.”

Danielle Bunten Berry does her makeup.

For some, of course — even for some with generally good intentions — Danielle Bunten Berry’s transgenderism will always be the defining aspect of her life, her career in games a mere footnote to that other part of her story. But that’s not how she would have wanted it. She regarded her games as her greatest legacy after her children, and would doubtless want to be remembered as a game designer above all else.

Back in 1989, after Modem Wars had failed in the marketplace, Electronic Arts decided that the lack of “a network of people to play” was a big reason for its failure. The great what-if question pertaining to Bunten’s career is what she might have done in partnership with an online network like CompuServe, which could have provided stable connectivity along with an eager group of players and all the matchmaking and social intrigue anyone could ask for. She finally began to explore this direction late in her life, through her work with Mpath. But what might have happened if she had made the right connections — forgive the pun! — earlier? We can only speculate.

As it is, though, it’s true that, in terms of units shifted and profits generated, there have been far more impressive careers. She suffered the curse of any pioneer who gets too far out in front of the culture. All of her eleven games combined probably sold no more than 400,000 copies at the outside, a figure some prominent designers’ new games can easily better in their first week today. Certainly her commercial disappointments far outnumber her successes. But then, sales aren’t the only metric by which to measure success.

Dani Bunten, one might say, is the designer’s designer. Greg Costikyan once told what happened when he offered to introduce Warren Spector — one of those designers who can sell more games in a week than Bunten did in a lifetime — to her back in the day: “He regretfully refused; he had loved M.U.L.E. so much he was afraid he wouldn’t know what to say. He would sound like a blithering fanboy and be embarrassed.” Chris Crawford calls the same title simply “the best computer-game design of all time.” Brenda Laurel dedicated Purple Moon’s output to Bunten. Sid Meier was so taken with Seven Cities of Gold that Pirates!, Railroad Tycoon, and Civilization, his trilogy of masterpieces, can all be described as extensions in one way or another of what Bunten first wrought. And Seven Cities of Gold was only Meier’s second favorite Bunten game: he loved M.U.L.E. so much that he was afraid to even try to improve on it.

Ironically, the very multiplayer affordances that Bunten so steadfastly refused to give up on, much to the detriment of her income, continue to make it difficult for her games to be seen at their best today. M.U.L.E. can be played as its designer really intended it only on an Atari 8-bit computer — real or emulated — with four vintage joysticks plugged in and four players holding onto them in a single living room; that is, needless to say, not a trivial thing to arrange in this day and age. Likewise, the need to have the exceedingly rare physical cards to hand has made it impossible for most people to even try out Robot Rascals today. (It took me months to track down a pricey German edition on eBay.) And Bunten’s final run of boxed games, reliant on ancient modem hookups as they are, are even more difficult to play with others today than they were in their own time.

Dani Bunten didn’t have an easy life, internally or externally. She remained always an enigma — the life of the party who goes home alone, the proverbial stranger among her best friends. One person who knew her after she became a woman claimed she still had a “shadowed, slightly haunted look, even when she was smiling.” Given the complicated emotions that are still stirred up in so many of us by transgenderism, that may have been projection. On the other hand, though, it may have been perception. Even Bunten’s childhood had been haunted by the specter of familial discord and possibly abuse, to such an extent that she refused to talk much about it. But she did once tell Greg Costikyan that she grew up loving games mainly because it was only when playing them that her family wasn’t “totally dysfunctional.”

I think that for Dani Bunten games were most of all a means of communication, a way of punching through that bubble of ego and identity that isolates all of us to one degree or another, and that perhaps isolated her more so than most. Thus her guiding vision became, as Sid Meier puts it, “the family gathered around the computer.” After all, it’s a small step to go from communicating to connecting, from connecting to loving. She openly stated that she had made Robot Rascals for her own family most of all: “They’ve never played my games. I think they found them too esoteric or complex. I wanted something that I could enjoy with them, that they’d all be able to relate to.” The tragedy for her — perhaps a key to the essential sadness many felt at Bunten’s core, whether she was living as a man or a woman — is that reality never quite lived up to that Norman Rockwell dream of the happy family gathered around a computer; her daughter, the duly appointed caretaker of her legacy, still calls M.U.L.E. “boring and tedious” today. But the dream remains, and her games have given those of us privileged to discover them great joy and comfort in the midst of lives that have admittedly — hopefully! — been far easier than that of their creator. And so I’ll close, in predictable but unavoidable fashion, with Danielle Bunten Berry’s most famous quote — a quote predictable precisely because it so perfectly sums up her career: “No one on their death bed ever said, ‘I wish I had spent more time alone with my computer!'” Words to live by, my fellow gamers. Words to live by.

Danielle Bunten Berry, 1949-1998.

(Sources: Compute! of March 1989, December 1989, April 1990, January 1992, and December 1993; Questbusters of May 1986; Commodore Power Play of June/July 1986; Commodore Magazine of July 1987, October 1988, and June 1989; Ahoy! of March 1987; Computer Gaming World of January/February 1987, May 1988, February 1989, February 1990, December 1990, February 1991, March 1991, May 1991, April 1992, June 1992, August 1992, June 1993, August 1993, July 1994, September 1995, and October 1998; Family Computing of January 1987; Compute!’s Gazette of August 1989; The One of April 1991; Game Players PC Entertainment of September 1992; Game Developer of February/March 1995, July 1998, September 1998, and October 1998; Electronic Arts’s newsletter Farther of Winter 1986; Power Play of January 1995; Arkansas Times of February 8 2012. Online sources include the archived contents of the old World of Mule site, the archived contents of a Danielle Bunten Berry tribute site, the Salon article “Get Behind the M.U.L.E.”, and Bunten’s interview at Halcyon Days.)

Posted by on November 16, 2018 in Digital Antiquaria, Interactive Fiction


The Lost Files of Sherlock Holmes

In 1989, Trip Hawkins reluctantly decided to shift Electronic Arts’s strategic focus from home computers to videogame consoles, thereby to “reach millions of customers.” That decision was reaching fruition by 1992. For the first time that year, EA’s console games outsold those they published for personal computers. The whole image of the company was changing, leaving behind the last vestiges of the high-toned “software artists” era of old in favor of something less intellectual and more visceral — something aimed at the mass market rather than a quirky elite.

Still, corporate cultures don’t change overnight, and the EA of 1992 continued to release some computer games which were more in keeping with their image of the 1980s than that of this new decade. One of the most interesting and rewarding of these aberrations — call them the product of corporate inertia — was a game called The Lost Files of Sherlock Holmes, whose origin story doesn’t exactly lead one to expect a work of brilliance but which is in fact one of the finest, most faithful interpretations of the legendary detective in the deerstalker cap ever to make its way onto a monitor screen.

The initial impetus for Lost Files was provided by an EA producer named Christopher Erhardt. After studying film and psychology at university, Erhardt joined the games industry in 1987, when he came to Infocom to become the in-house producer for their latter-day lineup of graphical games from outside developers, such as Quarterstaff, BattleTech: The Crescent Hawk’s Inception, and Arthur: The Quest for Excalibur. When Infocom was shuttered in 1989, he moved on to EA in the same role, helming a number of the early Sega Genesis games that did so much to establish the company’s new identity. His success on that front gave him a fair amount of pull, and so he pitched a pet idea of his: for a sort of computerized board game that would star Sherlock Holmes along with a rotating cast of suspects, crimes, and motives, similar to the old 221B Baker Street board game as well as a classic computer game from Accolade called Killed Until Dead. It turned out that EA’s management weren’t yet totally closed to the idea of computer games that were, as Erhardt would later put it, “unusual and not aimed at the mass market” — as long, that is, as they could be done fairly inexpensively.

Mythos Software. On the top row are David Wood, Elinor Mavor, and Scott Mavor. On the bottom row are James Ferguson and John Dunn.

In order to meet the latter condition, Erhardt enlisted a tiny Tempe, Arizona, company known as Mythos Software — not to be confused with the contemporaneous British strategy-games developer Mythos Games. This Mythos was being run by one James Ferguson, its fresh-out-of-university founder, from the basement of his parents’ house. He was trying to break into the wider world of software development that lay outside the bounds of the strictly local contracts he had fulfilled so far; his inexperience and eagerness ensured that Mythos would work cheap. And in addition to cut-rate pricing, Ferguson had another secret weapon to deploy: an artist named Scott Mavor who had a very special way with pixel graphics, a technique that EA’s in-house employees would later come to refer to as “the Mavor glow.” The highly motivated Mythos, working to Erhardt’s specifications, created a demo in less than two weeks that was impressive enough to win the project a tentative green light.

Eric Lindstrom and R.J. Berg.

Another EA employee, a technical writer named Eric Lindstrom, saw the demo and suggested turning what had been planned as a computerized board game into a more narratively ambitious point-and-click adventure game. When Erhardt proved receptive to the suggestion, Lindstrom put together the outline of a story, “The Mystery of the Serrated Scalpel.” He told Erhardt that he knew the perfect person to bring the story to life: one of his colleagues among EA’s manual writers, a passionate Sherlock Holmes aficionado — he claimed to have read Arthur Conan Doyle’s complete canon of Holmes stories “two or three times” — named R.J. Berg.

The project’s footing inside EA was constantly uncertain. Christopher Erhardt says he “felt like I was playing the Princess Bride, and the dread pirate Roberts was coming. It was always, ‘Yep – we may cancel it.'” But in the end the team was allowed to complete their point-and-click mystery, despite it being so utterly out of step with EA’s current strategic focus, and it was quietly released in the fall of 1992.

I find the critical dialog that followed, both in the immediate wake of Lost Files‘s release and many years later in Internet circles, to be unusually interesting. In particular, I’d like to quote at some length from Computer Gaming World‘s original review, which was written by Charles Ardai, one of the boldest and most thoughtful — and most entertaining — game reviewers of the time; this I say even as I find myself disagreeing with his conclusions far more often than not. His review opens thus:

If there is any character who has appeared in more computer games than Nintendo’s plump little goldmine, Mario, it has to be Sherlock Holmes. There have been almost a dozen Holmes-inspired games over the years, one of the best being Sherlock Holmes Consulting Detective, which is currently available in two different CD-ROM editions from ICOM. Other valiant attempts have included Imagic’s Sherlock Holmes in Another Bow, in which Holmes took a sea voyage with Gertrude Stein, Picasso, Thomas Edison, and Houdini, among others; and Infocom’s deadly serious Sherlock: Riddle of the Crown Jewels.

The difference between Holmes and Mario games, however, is that new Mario games are always coming out because the old ones sold like gangbusters, while new Sherlock Holmes games come out in spite of the fact that their predecessors sold like space heaters in the Sahara. It is noteworthy that, until ICOM, no company had ever released more than one Sherlock Holmes game, while all the Mario games come from the same source. It is also worth noting that the Holmes curse is not limited to games: the last few Holmes movies, such as Without a Clue and Young Sherlock Holmes, were not exactly box-office blockbusters.

The paradox of Sherlock Holmes can be stated so: while not that many people actually like the original Sherlock Holmes stories, everyone seems to think that everyone else adores them. Like Tarzan and Hawkeye, Holmes is a literary icon, universally known and much-beloved as a character in the abstract — not, however, as part of any single work. Finding someone who has actually read and enjoyed the writing of Edgar Rice Burroughs, James Fenimore Cooper, or Arthur Conan Doyle requires the patience of Diogenes. Most people know the character from television and the movies, at best; at worst, from reviews of television shows and movies they never bothered to see.

So, why do new Holmes adaptations surface with such regularity? Because the character is already famous and the material is in the public domain (thereby mitigating the requisite licensing fees associated with famous characters of more recent vintage. Batman or Indiana Jones, for instance.) Another answer is that Sherlock Holmes is seen as bridging the gap between entertainment and literature. Game companies presumably hope to cash in on the recognition factor and have some of the character’s ponderous respectability rub off on their product. They also figure that they can’t go wrong basing their games on a body of work that has endured for almost a century.

Unfortunately for them, they are wrong. There are only so many copies of a game that one can sell to members of the Baker Street Irregulars (the world’s largest and best-known Sherlock Holmes fan club), and a vogue for Victoriana has never really caught on among the rest of the game-buying population. The result is that, while Holmes games have been good, bad, and indifferent, their success has been uniformly mediocre.

This delightfully cynical opening gambit is so elegantly put together that one almost hesitates to puncture its cogency with facts. Sadly, though, puncture we must. While there were certainly Sherlock Holmes games released prior to Lost Files that flopped, there’s no evidence to suggest that this was the fault of the famous detective with his name on the box, and plenty of evidence to the contrary: that his name could, under the right circumstances, deliver at least a modest sales boost. In addition to the Sherlock Holmes Consulting Detective CD-ROM productions, a counter-example to Ardai’s thesis that’s so huge even he has to acknowledge it — the first volume of that series sold over 1 million units — there’s also the Melbourne House text adventure Sherlock; that game, the hotly anticipated followup to the bestselling-text-adventure-of-all-time The Hobbit, likely sold well over 100,000 units in its own right in the much smaller market of the Europe of 1984. Even Infocom’s Riddle of the Crown Jewels, while by no means a smash hit, sold significantly better than usual for an Infocom game in the sunset of the company’s text-only era. (Nor would I describe that game as “deadly serious” — I could go with “respectful” at most — but that’s perhaps picking nits.)

Still, setting aside those inconvenient details, it’s worth considering this broader question of just why there have been so many Sherlock Holmes games over the years. Certainly the character doesn’t have the same immediate appeal with the traditional gaming demographic as heavyweight properties like Star Wars and Star Trek, Frodo Baggins and Indiana Jones — or, for that matter, the born-in-a-videogame Super Mario. The reason for Sherlock’s ubiquity in the face of his more limited appeal is, of course, crystal clear, as Ardai recognizes: he’s in the public domain, meaning anyone who wishes to can make a Sherlock Holmes game at any time without paying anyone. [1] There have been occasional questions about the extent to which Sherlock Holmes and his supporting cast truly are outside all bounds of copyright, usually predicated on the fact that the final dozen stories were published in the 1920s, the beginning of the modern copyright era, and thus remain protected. R.J. Berg remembers giving “two copies of the game and a really trivial amount of money” to Arthur Conan Doyle’s aged granddaughter, just to head off any trouble on that front. When a sequel to Lost Files of Sherlock Holmes was published in 1996, no permission whatsoever was sought or demanded.

If you’re going to do Sherlock Holmes, you just have to get the fog right.

As such, Holmes occupies a nearly unique position in our culture. He’s one of the last great fictional icons, historically speaking, who’s so blessedly free of intellectual-property restrictions. Absolutely everyone, whether they’ve ever read a story or seen a film featuring him or not, knows him. The only characters with a remotely similar degree of recognizability who postdate him are Dracula, the Wizard of Oz, and Peter Pan — and neither of the latter two at least presents writers with quite the same temptation to tell new story after story after story.

As is noted in Lost Files‘s manual, Sherlock Holmes has become such an indelible part of our cultural memory that when we see him we experience a sort of vicarious nostalgia for a London none of us ever knew: “Gas lamps, the sound of horses’ hooves, steam locomotives, and romantic street cries. And then there is the atmosphere of that cozy room in Baker Street: Holmes in his armchair before a roaring coal fire, legs stretched out before him, listening with Dr. Watson to yet another bizarre story.” One might say that Sherlock Holmes gets the chronological balance just right, managing to feel both comfortably, nostalgically traditional and yet also relevant and relatable. In contrast to the Victorian scenery around him, his point of view as a character feels essentially modern, applicable to modern modes of storytelling. I’m not sure that any other fictional character combines this quality to quite the same extent with a freedom from copyright lawyers. These factors have fostered an entire creative subculture of Sherlockia which spans the landscape of modern media, dwarfing Arthur Conan Doyle’s canonical four novels and 56 short stories by multiple orders of magnitude.

The relative modernity of Sherlock Holmes is especially important in the context of interactive adaptations. The player of any narrative-driven game needs a frame of reference — needs to understand what’s expected of her in the role she’s expected to play. Thankfully, the divide between Sherlock Holmes and the likes of C.S.I. is a matter of technology rather than philosophy; Sherlock too solves crimes through rationality, combining physical evidence, eyewitness and suspect interviews, and logical deduction to reach a conclusion. Other legendary characters don’t share our modern mindset; it’s much more difficult for the player to step into the role of an ancient Greek hero who solves problems by sacrificing to the gods or an Arthurian knight who views every event as a crucible of personal honor. (Anyone doubtful of Sherlock Holmes’s efficacy in commercial computer games should have a look at the dire commercial history of Arthurian games.)

With so much material to make sense of, post-Doyle adventures of Sherlock Holmes get sorted on the basis of various criteria. One of these is revisionism versus faithfulness. While some adaptations go so far as to transport Sherlock and his cronies hook, line, and sinker into our own times, others make a virtue out of hewing steadfastly to the character and setting described by Arthur Conan Doyle. This spirit of Sherlockian fundamentalism, if you will, is just one more facet of our long cultural dialog around the detective, usually manifesting as a reactionary return to the roots when other recent interpretations are judged to have wandered too far afield.

No matter how much the Sherlockian fundamentalists kick and scream, however, the fact remains that the Sherlock Holmes of the popular imagination has long since become a pastiche of interpretations reflecting changing social mores and cultural priorities. That’s fair enough in itself — it’s much of the reason why Doyle’s timeless sleuth remains so timeless — but it does make it all too easy to lose sight of Holmes and Watson as originally conceived in the stories. Just to cite the most obvious example: Holmes’s famous deerstalker cap is never mentioned in the text of the tales, and only appeared on a few occasions in the illustrations that originally accompanied them. The deerstalker became such an iconic part of the character only after it was sported by the actor Basil Rathbone as an item of daily wear — an odd choice for the urban Holmes, given that it was, as the name would imply, a piece of hunting apparel normally worn by sporting gentlemen in the countryside — in a long series of films, beginning with The Hound of the Baskervilles in 1939.

Although Lost Files doesn’t go so far as to forgo the deerstalker — there are, after all, limits to these things — it does generally try to take its cue from the original stories rather than the patchwork of interpretations that followed them. Berg:

I definitely aimed for Holmesian authenticity. I’d like to think that, if he were alive, Doyle would like the game. After all, the characters of Holmes and Watson have been manipulated quite a bit by the various media they’ve appeared in, especially the films. For example, the Watson of Lost Files is definitely Doyle’s Watson, competent and intelligent, rather than the bumbling character portrayed in many of the movies. I also wanted to retain Holmes’s peculiar personality. He’s really not that likable a character; he’s arrogant, a misogynist, and extremely smug.

This spirit of authenticity extends to the game’s portrayal of Victorian London. There are, I’ve always thought, two tiers when it comes to realistic portrayals of real places in fiction. Authors on the second tier have done a whole lot of earnest research into their subject, and they’re very eager to show it all to you, filling your head with explicit descriptions of things which a person who actually lived in that setting would never think twice about, so ingrained are they in daily existence. Authors on the top tier, by contrast, have seemingly absorbed the setting through their pores, and write stories that effortlessly evoke it without beating you over the head with all the book research they did to reach this point of imaginative mastery.

Indeed, Sherlock. Leaving the cozy environs of 221B Baker Street.

Lost Files largely meets the latter bar as it sends you around to the jewelers and tobacconists, theaters and pubs, opulent mansions and squalid tenements of fin-de-siècle London. The details are there for when you need them or decide to go looking for them; just try mousing around the interior of 221B Baker Street. (“A typical sitting-room chair. The sheen of its wine-red velveteen covering shows that it is well-used. A dark purple silk dressing gown with a rolled collar is carelessly crumpled on the seat and the antimacassar requires changing.”) More impressive, though, is the way that the game just breathes its setting in that subtle way that can only be achieved by a writer with both a lighter touch and countless hours of immersion in the period at his command. For example, Berg spent time reading Charles Dickens as well as Arthur Conan Doyle in order to capture the subtle rhythms of Victorian English in his conversations. This version of Holmes’s London isn’t the frozen-in-amber museum exhibit it sometimes comes off as in other works of Sherlockia. “We wanted a dirty game,” says Eric Lindstrom. “We wanted people to feel that people were burning coal, that they could see who was walking in the streets. Just as it was in London at the time.”

There is, however, one important exception to the game’s rule of faithfulness to the original stories: Lost Files presents a mystery that the reader can actually solve. In light of the place Holmes holds in our cultural memory as the ultimate detective, one of the great ironies of Doyle’s stories is that they really aren’t very good mysteries at all by the standard of later mystery fiction — a standard which holds a good mystery to be an implicit contest between writer and reader, in which the reader is presented with all the clues and challenged to solve the case before the writer’s detective does so. Doyle’s stories cheat egregiously by this standard, hiding vital evidence from the reader, and often predicating a case’s solution on a chain of conjecture that’s nowhere near as ironclad as the great detective presents it to be. Eric Lindstrom:

The [original] stories do not work the way we are used to today. They are not whodunnits; whodunnits only became popular later. Readers have virtually no way of finding out who the culprit is. Sometimes the offender does not even appear in the plot. These are adventure stories narrated from the perspective of Dr. Watson.

For obvious reasons, Lost Files can’t get away with being faithful to this aspect of the Sherlock Holmes tradition. And so the mystery it presents is straight out of Arthur Conan Doyle — except that it plays fair. Notably, you play as Holmes himself, not, as in the original stories, as Watson. Thus you know what Holmes knows, and the game can’t pull the dirty trick on you, even if it wanted to, of hiding information until the big reveal at the end. Many other works of Sherlockia — even the otherwise traditionalist ones — adopt the same approach, responding to our post-nineteenth-century perception of what a good mystery story should be.

And make no mistake: “The Case of the Serrated Scalpel” is a very good mystery indeed. I hesitate to spoil your pleasure in it by saying too much, and so will only state that what begins as the apparently random murder of an actress in an alley behind the Regency Theatre — perhaps by Jack the Ripper, leaving Whitechapel and trying his hand in the posher environs of Mayfair? — keeps expanding in scope, encompassing more deaths and more and more Powerful People with Secrets to Keep. As I played, I was excited every time I made a breakthrough. Even better, I felt like a detective, to perhaps a greater extent than in any computer game I’ve ever played. Among games in general, I can only compare the feeling of solving this mystery to that of tackling some of the more satisfying cases in the Sherlock Holmes Consulting Detective tabletop game.

Part of the reason the mystery comes together so well is just down to good adventure-game design principles, of the sort which eluded so many other contemporary practitioners of the genre. Berg:

The idea was to produce a game that was different from existing adventures, which I frankly felt were often tedious. We wanted to eliminate the elements that tend to detract from the reality of the experience — things like having to die in order to learn some crucial information, constantly having to re-cover the same territory, and the tendency to simply pick up and use every object you encounter. We wanted to give players a deeper experience.

So, there are none of the dreaded adventure-game dead ends in Lost Files. More interestingly, the design does, as Berg alludes above, mostly eschew the typical use-unlikely-object-in-unlikely-place model of gameplay. Tellingly, the few places where it fails to do so are the weakest parts of the game.

As I’ve noted before, the classic approach to the adventure game, as a series of physical puzzles to solve, can be hugely entertaining, but it almost inevitably pushes a game toward comedy, often in spite of its designers’ best intentions. Most of us have played alleged interactive mysteries that leave you forever messing about with slider puzzles and trivial practical problems of the sort that any real detective would solve in five minutes, just by calling for backup. In Infocom’s Sherlock: Riddle of the Crown Jewels, for example, you learn that a stolen ruby is hidden in the eye of the statue of Lord Nelson on top of Nelson’s Column, and then get to spend the next little while trying to get a pigeon to fetch it for you instead of, you know, just telling Inspector Lestrade to send out a work crew. Lost Files does its level best to resist the siren call of the trivial puzzle, and, with only occasional exceptions, it succeeds. Thereby is the game freed to become one of the best interactive invocations of a classic mystery story ever. You spend your time collecting and examining physical evidence, interviewing suspects, and piecing together the crime’s logic, not solving arbitrary road-block puzzles. Lost Files is one of the few ostensibly serious adventure games of its era which manages to maintain the appropriate gravitas throughout, without any jarring breaks in tone.

This isn’t to say that it’s po-faced or entirely without humorous notes; the writing is a consistent delight, filled with incisive descriptions and flashes of dry wit, subtle in all the ways most computer-game writing is not. Consider, for example, this description of a fussy jeweler: “The proprietor is a stern-looking woman, cordial more through effort than personality. She frequently stares at the cleaning girl who tidies the floor, to make sure she is still hard at work.” Yes, this character is a type more than a personality — but how deftly is that type conveyed! In two sentences, we come to know this woman. I’d go so far as to call R.J. Berg’s prose on the whole better than that of the rather stolid Arthur Conan Doyle, who tended to bloviate on a bit too much in that all too typical Victorian style.

The fine writing lends the game a rare quality that seems doubly incongruous when one considers the time in which it was created, when multimedia was all the rage and everyone was rushing to embrace voice acting and “interactive movies.” Ditto the company which published it, who were pushing aggressively toward the forefront of the new mass-media-oriented approach to games. In spite of all that, darned if Lost Files doesn’t feel downright literary — thoughtful, measured, intelligent, a game to take in slowly over a cup of tea. Further enhancing the effect is its most unique single technical feature: everything you do in the game is meticulously recorded in an in-game journal kept by the indefatigable Dr. Watson. The journal will run into the hundreds of onscreen “pages” by the time you’re all done. It reads surprisingly well too; it’s not inconceivable that you might print it out — the handy option to print it or save it to a file is provided — and give it to someone else to read with pleasure. That’s a high standard indeed, one which vanishingly few games could meet. But I think that The Lost Files of Sherlock Holmes just about manages it.

Having given so much praise to Lindstrom and Berg’s design and writing, I have to give due credit as well to Mythos Software’s efforts to bring it all to life. The interface of Lost Files is thoroughly refined and pleasant to work with, a remarkable achievement considering that this was the first point-and-click graphic adventure to be made by everyone involved. An optional but extremely handy hotspot finder minimizes the burden of pixel hunting, and the interface is full of other thoughtful touches, like a default action that is attached to each object; this saves you more often than not from having to make two clicks to carry out an action.

Finally, praise must be given to Scott Mavor’s[2] “Mavor glow” graphics as well. To minimize the jagged edges typical of pictures drawn in the fairly low resolution of 256-color VGA graphics, Mavor avoided sharp shifts in color from pixel to pixel. Instead he blended his edges together gradually, creating a lovely, painterly effect that does indeed almost seem to glow. Scott’s mother Elinor Mavor, who worked with him to finish up the art in the latter stages of the project:

Working with just 256 colors, Scott showed me how he created graduating palettes of each one, which allowed him to do what he called “getting rid of the dots” in each scene. To further mute the pixels, he kept the colors on the darker side, which also enhanced the Victorian mood.

Weaving the illusion of continuous-tone artwork with all those little “dots” made us buggy-eyed after a long day’s work. One night, I woke up, went into the bathroom, turned on the light, and the world just pixilated in front of me. Scary imprints on my retinas had followed me away from the computer monitor, rendering my vision as a pointillistic painting à la George Seurat.

While the graphics of its contemporaries pop out at you with bright, bold colors, the palette of Lost Files of Sherlock Holmes smacks more of the “brown sauce” of the old masters — murky, mysterious, not initially jaw-dropping but totally in keeping with the mood of the script. As you, playing the diligent detective, begin to scan them carefully, the pictures reveal more and more details of the sort that are all too easy to overlook at a quick glance. It makes for an unusually mature aesthetic statement, and a look that can be mistaken for that of no other game.

Backstage at the opera.

Given all its strengths, I find it surprising that Lost Files has gotten more than its share of critical flak over the years. I have a theory as to why that should be, but before I get to that I’ll let one of the naysayers share his point of view. Even after admitting that the story is “a ripping yarn,” the graphics atmospheric, the period details correct, and the writing very good, Charles Ardai concludes his review thusly:

Don’t get me wrong: the dialogue is well-written, the choices are entertaining, and in most cases the actions the game requires the player to perform are very interesting. The story is good and the game is a pleasure to watch. Yet, that is what one does — watch.

This game wants, more than anything in the world, to be a Sherlock Holmes movie. Though it would be a very good one if it were, it is not. Therefore, it is deeply and resoundingly unsatisfying. The plot unfolds quite well, with plenty of twists, but the player has no more control over it than he would if he were reading a novel. The player is, at best, like an actor in a play. Unfortunately, said actor has not been given a copy of the script. He has to hit his marks and say his lines by figuring out the cues given by the other characters and reading his lines off the computer equivalent of cue cards.

If this is what one wants — a fine Sherlock Holmes pastiche played out on the computer screen, with the player nominally putting the lead character through his paces — fine. “The Case of the Serrated Scalpel” delivers all that one could hope for in that vein. If one wants a game — an interactive experience in which one’s decisions have an effect on what happens — this piece of software is likely to disappoint.

The excellent German podcast Stay Forever criticized the game along similar — albeit milder — lines in 2012. And in his mostly glowing 2018 review of the game for The Adventure Gamer joint-blogging project, Joe Pranevich likewise noted a certain distancing effect, which he said made him feel not so much like he was playing Sherlock Holmes and solving a mystery as watching Sherlock do the solving. The mystery, he further notes — correctly — can for the most part be solved by brute force by the patient but obtuse player, simply by picking every single conversation option when talking to every single character and showing each of them every single object you’ve collected.

At the extreme, criticisms like these would seem to encroach on the territory staked out by the noted adventure-game-hater Chris Crawford, who insists that the entire genre is a lie because it cannot offer the player the ability to do anything she wants whenever she wants. I generally find such complaints to be a colossal bore, premised on a misunderstanding of what people who enjoy adventure games find most enjoyable about them in the first place. But I do find it intriguing that these sorts of complaints keep turning up so often in the case of this specific game, and that they’re sometimes voiced even by critics generally friendly to the genre. My theory is that the mystery of Lost Files may be just a little bit too good: it’s just enticing enough, and just satisfying enough to slowly uncover, that it falls into an uncanny valley between playing along as Sherlock Holmes and actually being Sherlock Holmes.

But of course, playing any form of interactive fiction must be an imaginative act on the part of the player, who must be willing to embrace the story being offered and look past the jagged edges of interactivity. Certainly Lost Files is no less interactive than most adventure games, and it offers rich rewards that few can match if you’re willing to not brute-force your way through it, to think about and really engage with its mystery. It truly is a game to luxuriate in and savor like a good novel. In that spirit, I have one final theory to offer you: I think this particular graphic adventure may be especially appealing to fans of textual interactive fiction. Given its care for the written word and the slow-build craftsmanship of its plotting, it reminds me more of a classic Infocom game than most of the other, flashier graphic adventures that jostled with it for space on store shelves in the early 1990s.

Which brings me in my usual roundabout fashion to the final surprising twist in this very surprising game’s history. After its release by a highly skeptical EA, its sales were underwhelming, just as everyone had been telling Christopher Erhardt they would be all along. But then, over a period of months and years, the game just kept on selling at the same slow but steady clip. It seemed that computer-owning Sherlock Holmes aficionados weren’t the types to rush out and buy games when they were hot. Yet said aficionados apparently did exist, and they apparently found the notion of a Sherlock Holmes adventure game intriguing when they finally got around to it. (Somehow this scenario fits in with every stereotype I carry around in my head about the typical Sherlock Holmes fan.) Lost Files’s sales eventually topped the magical 100,000-unit mark that separated a hit from an also-ran in the computer-games industry of the early- and mid-1990s.

It wasn’t a very good idea, but they did it anyway. R.J. Berg on a sound stage with an actress, filming for the 3DO version of Lost Files of Sherlock Holmes. Pictures like this were in all the games magazines of the 1990s. Somehow such pictures — not to mention the games that resulted from them — seem far more dated than Pong these days.

Lost Files of Sherlock Holmes may not have challenged the likes of John Madden Football in the sales sweepstakes, but it did make EA money, and some inside the company did notice. In 1994, they released a version for the 3DO multimedia console. For the sake of trendiness, this version added voice acting and inserted filmed footage of actors into the conversation scenes, replacing the lovely hand-drawn portraits in the original game and doing it no new aesthetic favors in the process. In 1996, with the original still selling tolerably well, most of the old team got back together for a belated sequel — The Lost Files of Sherlock Holmes: Case of the Rose Tattoo — that no one would ever have dreamed they would be making a couple of years before.

But then, almost everything about the story of Lost Files is unlikely, from EA of all companies deciding to make it — or, perhaps better said, deciding to allow it to be made — to a bunch of first-time adventure developers managing to put everything together so much better than many established adventure-game specialists were doing at the time. And how incredibly lucky for everyone involved that such a Sherlock Holmes devotee as R.J. Berg should have been kicking around writing manuals for EA, just waiting for an opportunity like this one to show his chops. I’ve written about four Sherlock Holmes games now in the course of this long-running history of computer gaming — yet another measure of the character’s cultural ubiquity! — and this one nudges out Riddle of the Crown Jewels to become the best one yet. It just goes to show that, no matter how much one attempts to systematize the process, much of the art and craft of making games comes down to happy accidents.

(Sources: Compute! of April 1993 and June 1993; Computer Gaming World of February 1993; Questbusters of September 1988 and December 1992; Electronic Games of February 1993. Online sources include Elinor Mavor’s remembrances of the making of Lost Files of Sherlock Holmes, the comprehensive Game Nostalgia page on the game, the Stay Forever podcast episode devoted to the game, Joe Pranevich’s playthrough for The Adventure Gamer, the archived version of the old Mythos Software homepage, and Jason Scott’s “Infocom Cabinet” of vintage documents.

Feel free to download Lost Files of Sherlock Holmes from right here, in a format designed to be as easy as possible to get running under your platform’s version of DOSBox or using ScummVM.)

Footnotes

1 There have been occasional questions about the extent to which Sherlock Holmes and his supporting cast truly are outside all bounds of copyright, usually predicated on the fact that the final dozen stories were published in the 1920s, the beginning of the modern copyright era, and thus remain protected. R.J. Berg remembers giving “two copies of the game and a really trivial amount of money” to Arthur Conan Doyle’s aged granddaughter, just to head off any trouble on that front. When a sequel to Lost Files of Sherlock Holmes was published in 1996, no permission whatsoever was sought or demanded.
2 Scott Mavor died of cancer in 2008.
 
 


Whither the Software Artist? (or, How Trip Hawkins Learned to Stop Worrying and Love the Consoles)

One of the places we ran the “Can a computer make you cry?” [advertisement] was in Scientific American. Scientific American readers weren’t even playing videogames. Why the hell are you wasting any of this really expensive advertising? You’re competing with BMW for that ad.

— Trip Hawkins (EA Employee #1)

Consumers were looking for a brand signal for quality. They didn’t lionize the game makers as these creators to fawn over. They thought of the game makers almost as collaborators in their experience. So apostatizing didn’t make sense to the consumers.

— Bing Gordon (EA Employee #7)

In the ’80s that was an interesting experiment, that whole trying-to-make-them-into-rock-stars kind of thing. It was certainly a nice way to recruit top talent. But the reality is that computer programmers and artists and designers are not rock stars. It may have worked for the developers, but I don’t think it had any impact on consumers.

— Stewart Bonn (EA Employee #19)

One of the stories that gamers most love to tell each other is that of Electronic Arts’s fall from grace. If you’re sufficiently interested in gaming history to be reading this blog, you almost certainly know the story in the broad strokes: how Trip Hawkins founded EA in 1982 as a haven for “software artists” doing cutting-edge work; how he put said artists front and center in rock-star-like poses in a series of iconic advertisements, the most famous of which asked whether a computer could make you cry; how he described EA on the back of every stylish “album cover” not as a company but as “a collection of electronic artists who share a common goal to fulfill the potential of personal computing”; and how all the idealism somehow dissipated to give us the EA of today, a shambling behemoth that crushes more clever competitors under its sheer weight as it churns out sequel after sequel, retread after retread. The exact point where EA became the personification of everything retrograde and corporate in gaming varies with the teller; perhaps the closest thing to a popular consensus is the rise of John Madden Football and EA Sports in the early 1990s, when the last vestiges of software artistry in the company’s advertisements were replaced by jocks shouting, “It’s in the game!” Regardless of the specifics, though, everyone agrees that It All Went Horribly Wrong at some point. The story of EA has become gamers’ version of a Biblical tragedy: “For what shall it profit a man, if he shall gain the whole world, and lose his own soul?”

Of course, as soon as one starts pulling out Bible quotes, it profits to ask whether one has gone too far. And, indeed, the story of EA is often over-dramatized and over-simplified. Questions of authenticity and creativity are always fraught; to imagine that anyone is really in the arts just for the art strikes me as hopelessly naive. The EA of the early 1980s wasn’t founded by artists but rather by businessmen, backed by venture capitalists with goals of their own that had little to do with “fulfilling the potential of personal computing.” Thus, when the software-artists angle turned out not to work so well, it didn’t take them long to pivot. This, then, is the history of that pivot, and how it led to the EA we know today.


Advertising is all about image making — about making others see you in the light in which you wish to be seen. Without realizing that they were doing anything of the sort, EA’s earliest marketers cemented an image into the historical imagination at the same time that they failed in their more practical task of crafting a message that resonated with the hoped-for customers of their own time. The very same early EA advertising campaign which speaks so eloquently to so many today actually missed the mark entirely in its own day, utterly failing to set the public imagination afire with this idea of programmers and game designers as rock stars. When Trip Hawkins sent Bill Budge — the programmer of his who most naturally resembled a rock star — on an autograph-signing tour of software stores and shopping malls, it didn’t lead to any outbreak of Budgomania. “Nobody would ever show up,” remembers Budge today, still wincing at the embarrassment of sitting behind a deserted autograph booth.

Nor were customers flocking into stores to buy the games EA’s rock stars had created. Sales remained far below initial projections during the eighteen months following EA’s official launch in June of 1983, and the company skated on the razor’s edge of bankruptcy on multiple occasions. While their first year yielded the substantial hits Pinball Construction Set, Archon, and One-on-One, 1984 could boast only one comparable success story, Seven Cities of Gold. Granted, four hits in two years was more than plenty of other publishers managed, but EA had been capitalized under the expectation that their games would open up whole new demographics for entertainment software. “The idea was to make games for 28-year-olds when everybody else was making games for 13-year-olds,” says Bing Gordon, Trip Hawkins’s old university roommate and right-hand man at EA. When those 28-year-olds failed to materialize, EA was left in the lurch.

For better or for worse, One-on-One is the spiritual forefather of the unstoppable EA Sports lineup of today.

The most important architect of EA’s post-launch retrenchment was arguably neither Trip Hawkins nor Bing Gordon, but rather Larry Probst, who left the free-falling Activision to join EA as vice president for sales in 1984. Probst, who had worked at the dry-goods giants Johnson & Johnson and Clorox before joining Activision, had no particular attachment to the idea of software artists. He rather looked at the business of selling games much as he had that of selling toilet paper and bleach. He asked himself how EA could best make money in the market that existed rather than some fanciful new one they hoped to create. Steve Peterson, a product manager at EA, remembers that others “would still talk about how we were trying to create new forms of entertainment and break new boundaries.” But Probst, and increasingly Trip Hawkins as well, had the less high-minded goal of “going public and being a billion-dollar company.”

Probst had the key insight that distribution, more so than software artists or perhaps even product quality in the abstract, was the key to success in an industry that, following a major downturn in home computing in general in 1984, was only continuing to get more competitive. EA therefore spurned the existing distribution channels, which were nearly monopolized by SoftSel, the great behind-the-scenes power in the software industry to which everyone else was kowtowing; SoftSel’s head, Robert Leff, was the most important person in software that no one outside the industry had ever heard of. Instead of using SoftSel, EA set up their own distribution network piece by painful piece, beginning by cold-calling the individual stores and offering cut-rate deals in order to tempt them into risking the wrath of Leff and ordering from another source.

Then, once a reasonable distribution network was in place, EA leveraged the hell out of it by setting up a program of so-called “Affiliated Labels” — other publishers who would pay EA instead of a conventional distributor like SoftSel to get their products onto store shelves. It was a well-nigh revolutionary idea in game publishing, attractive to smaller publishers because EA was ready and able to help out with a whole range of the logistical difficulties they were always facing, from packaging and disk duplication to advertising campaigns. For EA, meanwhile, the Affiliated Labels yielded huge financial rewards and placed them in the driver’s seat of much of the industry, with the power of life and death over many of their smaller ostensible competitors.

Unsurprisingly, Activision, the only other publisher with comparable distributional clout, soon copied the idea, setting up a similar program of their own. But even as they did so, EA, seemingly always one step ahead, was becoming the first American publisher to send games — both their own and those of others — directly to Europe without going through a European intermediary like Britain’s U.S. Gold label.

There was always something a bit contrived, in that indelible Silicon Valley way, about how EA chose to present themselves to the world. Here we have Bing Gordon, head of technology Greg Riker, and producer Joe Ybarra indulging in some of the creative play which, an accompanying article is at pains to tell us, was constantly going on around the office.

Larry Probst’s strategy of distribution über alles worked a treat, yielding explosive growth that more than made up for the company’s early struggles. In 1986, EA became the biggest computer-game publisher in the United States and the world, with annual revenues of $30 million. Their own games were doing well, but were assuming a very different character from the “simple, hot, and deep” ideal of the launch — a phrase Trip Hawkins had once loved to apply to games that were less stereotypically nerdy than the norm, that he imagined would be suitable for busy young adults with a finger on the pulse of hip pop culture. Now, having failed to attract that new demographic, EA adjusted their product line to appeal to those who were already buying computer games. A case in point was The Bard’s Tale, EA’s biggest hit of 1985, a hardcore CRPG that might take a hundred hours or more to complete — fodder for 13-year-olds with long summer vacations to fill rather than 28-year-olds with jobs and busy social calendars.

If “simple, hot, and deep” and programmers as rock stars had been two of the three pillars of EA’s launch philosophy, the last was the one written into Hawkins’s original mission statement as “stay with floppy-disk-based computers only.” Said statement had been written, we should remember, just as the first great videogame fad, fueled by the Atari VCS, was passing its peak and beginning the long plunge into what would go down in history as the Great Videogame Crash of 1983. At the time, it certainly wasn’t only the new EA who believed that the toy-like videogame consoles were the past, and that more sophisticated personal computers, running more sophisticated games, were the future. “I think that computer games are fundamentally different from videogames,” said Hawkins on the Computer Chronicles television show. “It becomes a question of program size, when you want to know how good a program can I have, how much can I do with it, and how long will it take before I’m bored with it.” This third pillar of EA’s strategy would take a bit longer to fall than the others, but fall it would.

The origins of EA’s loss of faith in the home computer in general as the ultimate winner of the interactive-entertainment platform wars can ironically be traced to their decision to wholeheartedly endorse one computer in particular. In October of 1984, Greg Riker, EA’s director of technology, got the chance to evaluate a prototype of Commodore’s upcoming Amiga. His verdict upon witnessing this first truly multimedia personal computer, with its superlative graphics and sound, was that this was the machine that could change everything, and that EA simply had to get involved with it as quickly as possible. He convinced Trip Hawkins of his point of view, and Hawkins managed to secure Amiga Prototype Number 12 for the company within weeks. In the months that followed, EA worked to advance the Amiga with if anything even more enthusiasm than Commodore themselves: developing libraries and programming frameworks which they shared with their outside developers; writing tools internally, including what would become the Amiga’s killer app, Deluxe Paint; documenting the Interchange File Format, a set of standard specifications for sharing pictures, sounds, animations, and music across applications. All of these things and more would remain a part of the Amiga platform’s basic software ecosystem throughout its existence.

When the Amiga finally started shipping late in 1985, EA actually made a far better public case for the machine than Commodore, taking out a splashy editorial-style advertisement just inside the cover of the premiere issue of the new AmigaWorld magazine. It showed the eight Amiga games EA would soon release and explained “why Electronic Arts is committed to the Amiga,” the latter headline appearing above a photograph of Trip Hawkins with his arm proprietorially draped over the Amiga on his desk.

Trip Hawkins with an Amiga

But it all turned into an immense disappointment. Initially, Commodore priced the Amiga wrong and marketed it worse, and even after they corrected some of their worst mistakes it perpetually under-performed in the American marketplace. For Hawkins and EA, the whole episode planted the first seeds of doubt as to whether home computers — which at the end of the day still were computers, requiring a degree of knowledge to operate and associated in the minds of most people more with work than pleasure — could really be the future of interactive entertainment as a mass-media enterprise. If a computer as magnificent as the Amiga couldn’t conquer the world, what would it take?

Perhaps it would take a piece of true consumer electronics, made by a company used to selling televisions and stereos to customers who expected to be able to just turn the things on and enjoy them — a company like, say, Philips, who were working on a new multimedia set-top box for the living room that they called CD-I. The name arose from the fact that it used the magical new technology of CD-ROM for storage, something EA had been begging Commodore to bring to the Amiga to no avail. EA embraced CD-I with the same enthusiasm they had recently shown for the Amiga, placing Greg Riker in personal charge of creating tools and techniques for programming it, working more as partners in CD-I’s development with Philips than as a mere third-party publisher.

Once again, however, it all came to nought. CD-I turned into one of the most notorious slow-motion fiascos in the history of the games industry, missing its originally planned release date in the fall of 1987 and then remaining vaporware for years on end. In early 1989, EA finally ran out of patience, mothballing all work on the platform unless and until it became a viable product; Greg Riker left the company to go work for Microsoft on their own CD-ROM research.

CD-I had cost EA a lot of money to no tangible result whatsoever, but it does reveal that the idea of gaming on something other than a conventional computer was no longer anathema to them. In fact, the year in which EA gave up on CD-I would prove the most pivotal of their entire history. We should therefore pause here to examine their position in 1989 in a bit more detail.

Despite the frustrating failure of the Amiga and CD-I to open a new golden age of interactive entertainment, EA wasn’t doing badly at all. Following years of steady growth, annual revenue had now reached $63 million, up 27 percent from 1988. EA was actively distributing about 100 titles under their own imprint, and 250 more under the imprint of the various Affiliated Labels, who had become absolutely key to their business model, accounting for some 45 percent of their total revenues. About 80 percent of their revenues still came from the United States, with 15 percent coming from Europe — where EA had set up a semi-independent subsidiary, the Langley, England-based EA Europe, in 1987 — and the remainder from the rest of the world. The company was extremely diversified. They were producing software for ten different computing platforms worldwide, had released 40 separate titles that had earned them at least $1 million each, and had no single title that accounted for more than 6 percent of their total revenues.

What we have here, then, is a very healthy business indeed, with multiple revenue streams and cash in the bank. The games they released were sometimes good, sometimes bad, sometimes mediocre; EA’s quality standards weren’t notably better or worse than the rest of their industry. “We tried to create a brand that fell somewhere between Honda and Mercedes,” admits Bing Gordon, “but a lot of the time we shipped Chevy.” Truth be told, even in the earliest days the rhetoric surrounding EA’s software artists had been a little overblown; many of the games their rock stars came up with were far less innovative than the advertising that accompanied them. The genius of Larry Probst had been to explicitly recognize that success or failure as a games publisher had as much to do with other factors as it did with the actual games you released.

For all their success, though, no one at EA was feeling particularly satisfied with their position. On the contrary: 1989 would go down in EA’s history as the year of “crisis.” As successful as they had become selling home-computer software, they remained big fish in a rather small pond, a situation out of keeping with the sense of overweening ambition that had been a part of the company’s DNA since its founding. In 1989, about 4 million computers were being used to play games on a regular or semi-regular basis in American homes, enough to fuel a computer-game industry worth an estimated $230 million per year. EA alone owned more than 25 percent of that market, more than any competitor. But there was another, related market in which they had no presence at all: that of the videogame consoles, which had returned from the dead to haunt them even as they were consolidating their position as the biggest force in computer games. The country was in the grip of Nintendo mania. About 22 million Nintendo Entertainment Systems were already in American homes — a figure accounting for 24 percent of all American households — and cartridge-based videogames were selling to the tune of $1.6 billion per year.

Unlike many of their peers, EA hadn’t yet suffered all that badly under the Nintendo onslaught, largely because they had already diversified away from the Commodore 64, the low-end 8-bit computer which had been the largest gaming platform in the world just a couple of years before, and which the NES was now in the process of annihilating. But still, the future of the computer-games industry in general felt suddenly in doubt in a way that it hadn’t since at least the great home-computer downturn of 1984. A sizable coalition inside EA, including Larry Probst and most of the board of directors, pushed Trip Hawkins hard to get EA’s games onto the consoles. Fearing a coup, he finally came around. “We had to go into the [console-based] videogame business, and that meant the world of mass-market,” Hawkins remembers. “There were millions of customers we were going to reach.”

But through which door should they make their entrance? Accustomed to running roughshod over his Affiliated Labels, Hawkins wasn’t excited about the prospect of entering Nintendo’s walled garden, where the shoe would be on the other foot, thanks to that company’s infamously draconian rules for its licensees. Nintendo’s standard contract demanded that they receive the first $12 from every game a licensee sold, required every game to go through an exhaustive review process before publication, and placed strict limits on how many games a licensee was allowed to publish per year and how many units they were allowed to manufacture of each one. For EA, accustomed to being the baddest hombre in the Wild West that was the computer-game marketplace, this was well-nigh intolerable. Bing Gordon insists even today that, thanks to all of the fees and restrictions, no one other than Nintendo was doing much more than breaking even on the NES during this, the period that would go down in history as the platform’s golden age.

So, EA decided instead to back a dark horse: the much more modern Sega Genesis, which hadn’t even been released yet in North America. It was built around the same 16-bit Motorola 68000 CPU found in computers like the Commodore Amiga and Apple Macintosh, with audiovisual capabilities not all that far removed from the Amiga’s own. The Genesis would give designers and programmers who were used to the affordances of full-fledged computers a far less limiting platform than the NES to work with, and it offered the opportunity to get in on the ground floor of a brand-new market, as opposed to the saturated NES platform. The only problem was that Sega’s licensing fees were comparable to those of Nintendo, even though they could only offer their licensees access to a much more uncertain pool of customers.

Determined to play hardball, Hawkins had a team of engineers reverse-engineer the Genesis, sufficient to let them write games for it with or without Sega’s official development kit. Then he met with Sega again, telling them that, if they refused to adjust their licensing terms, he would release games on the console without their blessing, forcing them to initiate an ugly court battle of the sort that was currently raging between Nintendo and Atari if they wished to bring him to heel. That, he was gambling, was expense and publicity of a sort which Sega simply couldn’t afford. And Sega evidently agreed with his assessment; they accepted a royalty rate half that being demanded by Nintendo. By this roundabout method, EA became the first major American publisher to support the new console, and from that point forward the two companies became, as Hawkins puts it, “good partners.”

EA initially invested $2.5 million in ten games for the Genesis, some of them original to the console, some ports of their more popular computer games. They started shipping the first of them in June of 1990, ten months after the Genesis itself had first gone on sale in the United States. This first slate of EA Genesis titles arrived in a marketplace that was still starving for quality games, just as Hawkins had envisioned it would be. Among them was the game destined to become the face of the new, mass-market-oriented EA: John Madden Football, a more action-oriented re-imagining of a 1988 computer game of the same name.

John Madden Football debuted as a rather cerebral, tactics-heavy computer game in 1988, just another in an EA tradition of famous-athlete-endorsed sports games stretching back to 1983’s (Dr. J and Larry Bird Go) One-on-One. No one in 1988 could have imagined what it would come to mean in the years to come for either its publisher or its spokesman/mascot, both of whom would ride it to iconic heights in American pop culture.

The Sega Genesis marked the third time EA had taken a leap of faith on a new platform. It was the first time, however, that their faith paid off. About 25 percent of the games EA sold in 1990 were for the Genesis. And when the console really started to take off in 1991, fueled not least by their own games, EA was there to reap the rewards. In that year, four of the ten best-selling Genesis games were published by EA. At the peak of their dominance, EA alone was publishing about 35 percent of all the games sold for the Genesis. Absent the boost their games gave it early on, it’s highly questionable whether the Genesis would have succeeded at all in the United States.

In the beginning, few of EA’s outside developers had been terribly excited about writing for the consoles. One of them remembers Hawkins “reading us the riot act” just to get them onboard. Indeed, Hawkins claims today that about 15 percent of EA’s internal employees were so unhappy with the new direction that they quit. Certainly his latest rhetoric could hardly have been more different from that of 1983:

I knew we had to let go of our attachment to machines that the public did not want to buy, and support the hardware that the public would embrace. I made this argument on the grounds of delivering customer satisfaction, and how quality is in the eye of the beholder. If the customer buys a Genesis, we want to give him the best we can for the machine he bought and not resent the consumer for not buying a $1000 computer.

By this point, Hawkins had finally bitten the bullet and done a deal with Nintendo, who, in the face of multiple government investigations and lawsuits over their business practices, were becoming somewhat more generous with both their competitors and licensees. When games like Skate or Die, a port of a Commodore 64 hit that just happened to be perfect for the Nintendo and Sega demographics as well, started to sell in serious numbers on the consoles, Hawkins’s developers’ aversion started to fade in the face of all that filthy lucre. Soon the developers of Skate or Die were happily plunging into a sequel which would be a console exclusive.

Even the much-dreaded oversight role played by Nintendo, in which they reviewed every game before allowing it to be published, proved less onerous than expected. When Will Harvey, the designer of an action-adventure called The Immortal, finally steeled himself to look at Nintendo’s critique thereof, he was happily surprised to find the list of “suggestions” to be very helpful on the whole, demonstrating real sensitivity to the effect he was trying to achieve. Even Bing Gordon, who had been highly skeptical of getting into bed with Nintendo, had to admit in the end that “the rating system is fair. On a scale from zero to a hundred, where zero meant the system was totally manipulated for Nintendo’s self-interest and a hundred meant that it was absolutely democratic, they’d probably get a ninety. I’ve seen a little bit of self-interest, but this is America, the land of self-interest.”

Although EA cut their Nintendo teeth on the NES, it was on the long-awaited follow-up console, 1991’s Super Nintendo, that they really began to thrive. That machine boasted capabilities similar to those of the Sega Genesis, meaning EA already had games ready to port over, along with developers with considerable expertise in writing for a more advanced species of console. Just in time for the Christmas of 1991, EA released a new version of John Madden Football, titled John Madden Football ’92, simultaneously on the Super Nintendo and the Genesis. The sequel had been created, according to the recollections of several EA executives, against the advice of market researchers and retailers: “All you’re going to do is obsolete our old game.” But Trip Hawkins remembered how much, as a kid, he had loved the Strat-O-Matic Football board game, for which a new set of player and team cards was issued every year just before the beginning of football season, ensuring that you could always recreate in the board game the very same season you were watching every Sunday on television. So, he ignored the objections of the researchers and the retailers, and John Madden Football ’92 became an enormous hit, by far the biggest EA had yet enjoyed on any platform — thus inaugurating, for better or for worse, the tradition of annual versions of gaming’s most evergreen franchise. Like clockwork, we’ve gotten a new Madden every single year since, a span of time that numbers a quarter-century and change as of this writing.

All of this had a transformative effect on EA’s bottom line, bringing on their biggest growth spurt yet. Revenues increased from $78 million in 1990 to $113 million in 1991; then they jumped to $175 million in 1992, accompanied by a two-for-one stock split that was necessary to keep the share price, which had been at $10 just a few years before, from exceeding $50. In that year, six of the fifteen most popular console games, across all platforms, were published by EA. Their Sega Genesis games alone generated $77 million, 18 percent more than the entirety of the company’s product portfolio had managed in 1989. This was also the first year that EA’s console games in the aggregate outsold their offerings for computers. They were leaving no doubt now as to where their primary loyalty lay: “The 16-bit consoles are far better for games than PCs. The Genesis is a very sophisticated machine…” The disparity between the two sides of the company’s business would only continue to get more pronounced, as EA’s sales jumped by an extraordinary 70 percent — to $298 million — in 1993, a spurt fueled entirely by console-game sales.

But, despite all their success on the consoles, EA — and especially their founder, Trip Hawkins — continued to chafe under the restrictions of the walled-garden model of software distribution. Accordingly, Hawkins put together a group inside EA to research the potential for a CD-ROM-based multimedia set-top box of their own, one that would be used for more than just playing games — sort of a CD-I done right. “The Japanese videogame companies,” he said, “are too shortsighted to see where this is going.” In contrast to their walled gardens, his box would be as open as possible. Rather than a single new hardware product, it would be a set of hardware specifications and an operating system which manufacturers could license, which would hopefully result in a situation similar to the MS-DOS marketplace, where lots of companies competed and innovated within the bounds of an established standard. The marketplace for games and applications alike on the new machine would be far less restricted than the console norm, with a more laissez-faire attitude to content and a royalty fee of just $3 per unit sold.

In 1991, EA spun off the venture under the name of 3DO. Hawkins turned most of his day-to-day responsibilities at EA over to Larry Probst in order to take personal charge of his new baby, which took tangible form for the first time with the release of the Panasonic “Real 3DO Player” in late 1993. It and other implementations of the 3DO technology managed to sell 500,000 units worldwide — 200,000 of them in North America — by January of 1995. Yet those numbers were still a pittance next to those of the dedicated game consoles, and the story of 3DO became one of constant flirtations with success that never quite led to that elusive breakthrough moment. As 3DO struggled, Hawkins’s relations with his old company worsened. He believed they had gone back on promises to support his new venture wholeheartedly; “I didn’t feel like I was leaving EA, but it turned out that way,” he says today with lingering bitterness. The long, frustrating saga of 3DO wouldn’t finally straggle to a bankruptcy until 2003.

EA, meanwhile, was flying ever higher absent their founder. Under Larry Probst — always the most hard-nosed and sober-minded of the executive staff, the person most laser-focused on the actual business of selling videogames — EA cemented their reputation as the conservative, risk-averse giant of their industry. This new EA was seemingly the polar opposite of the company that had once asked with almost painful earnestness if a computer could make you cry. And yet, paradoxically, it was a place still inhabited by a surprising number of the people who had come up with that message. Most prominent among them was Bing Gordon, who notes cryptically today only that “people’s ideals get tested in the face of love or money.” Part of the problem — assuming one judges EA’s current less-than-boldly-innovative lineup of franchises to be a problem — may be a simple buildup of creative cruft that has resulted from being in business for so long. Every franchise that debuts in inspiration and innovation, then goes on to join John Madden Football on the list of EA perennials, sucks away some of the bandwidth that might otherwise have been devoted to the next big innovator.

In the summer of 1987, when EA was still straddling the line between their old personality and their new, Trip Hawkins wrote the following lines in their official newsletter — lines which evince the keenly felt tension between art and commerce that, for so many in the years since, has become the defining aspect of EA’s corporate history:

Unfortunately, simply being creative doesn’t always mean you’ll be wildly successful. Van Gogh sold only one painting during his lifetime. Lots of people would still rather go see Porky’s Revenge IV, ignoring well-produced movies like Amadeus or Chariots of Fire. As a result, film producers take fewer risks, and we get less variety, and pretty soon the Porky’s and Rambo clones are all you can find on a Friday night. Software developers have the same problem. (To this day, all of us M.U.L.E. fans wonder why the entire world hasn’t fallen in love with our favorite game.)

The only way to solve the problem is to do it together. On our end, we’ll keep innovating, researching, experimenting with new ways to use this new medium; on your end, you can support our efforts by taking an occasional risk, by buying something new and different… maybe Robot Rascals, or Make Your Own Murder Party.

You may be very pleasantly surprised — and you’ll help our software artists live to innovate another day.

Did EA go the direction they did because of gamers’ collective failure to support their most innovative, experimental work? Does it even matter if so? The more pragmatic among us might note that the EA of today is delivering games that millions upon millions of people clearly want to play, and where’s the harm in that?

Still, as we look upon this industry that has so steadfastly refused to grow up in so many ways, there remain always those pictures of EA’s first generation of software artists — pictures that, yes, are a little pretentious and a lot contrived, but that nevertheless beckon us to pursue higher ideals. They’ve taken on an identity of their own now, quite apart from the history of the company that once splashed them across the pages of glossy lifestyle magazines. Long may they continue to inspire.

(Sources: the book Gamers at Work: Stories Behind the Games People Play by Morgan Ramsay and Game Over: How Nintendo Conquered the World by David Sheff; Harvard Business School’s case study “Electronic Arts in 1995”; ACE of April 1990; Amazing Computing of July 1992; Computer Gaming World of March 1988, October 1988, and June 1989; MicroTimes of April 1986; The One of November 1988; Electronic Arts’s newsletter Farther from Summer 1987; AmigaWorld premiere issue; materials relating to the Software Publishers Association included in the Brøderbund archive at the Strong Museum of Play; the episode of the Computer Chronicles television series entitled “Computer Games.” Online sources include “We See Farther — A History of Electronic Arts” at Gamasutra, “How Electronic Arts Lost Its Soul” at Polygon, and Funding Universe‘s history of Electronic Arts.)
