
Lode Runner

It’s always been a bit of a balancing act to decide which games I write about in detail here — a matter of weighing my level of personal interest in each candidate against its historical importance. In the early years of this project especially, when I still saw it as focusing almost exclusively on narrative-oriented games, I passed over some worthy candidates because I considered them somewhat out of scope. And now, needless to say, I regret some of those omissions.

One of the games that’s been made most conspicuous by its absence here is Lode Runner, Doug Smith’s seminal action-puzzle platformer from 1983. “Iconic” is a painfully overused adjective today, but, if any game truly can be called an icon of its era, it’s this one. So, I decided to take the release of Lode Runner: The Legend Returns, a 1994 remake/re-imagining that does fit neatly into our current position in the historical chronology, as an opportunity to have a belated look back at the original.


Doug Smith

In late 1981, Doug Smith was studying architecture and numerical analysis at the University of Washington in Seattle. Meanwhile he had a part-time job in one of the university’s computer labs, where he met two other students named James Bratsanos and Tracy Steinbeck, who were tinkering with a game they called Kong, a not-so-thinly-veiled reference to the arcade hit Donkey Kong. Bratsanos had first created Kong the previous year on one of his high school’s Commodore PET microcomputers, and the two were now in the process of porting it to one of the university’s DEC VAX-11/780 minicomputers. Smith soon joined the effort. When their fellow students started to show some interest in what they were doing, they made the game publicly available.

In Kong, you guided a little man through a single-screen labyrinth of tunnels linked by ladders, implemented entirely using monochrome textual characters; your man was a dollar sign, your enemies paragraph symbols. Armed only with a pickaxe that was more tool than weapon, you had to steal all of the gold that was lying around the place, whilst avoiding or delaying the guards who protected it, generally by digging pits into which they could fall. The group hid their game from the university’s administrators by embedding it into an otherwise broken graphing program. “‘Graph’ would prompt the user for a function,” remembers a fellow student named Rick LaMont, “then crap out unless the secret password was entered to play Kong.”

With its captive audience of playtesters in the form of the students who hung around the computer labs, the game grew organically as the weeks passed. Soon students were coming by only to play Kong; LaMont claims that “a ‘show process’ command would often report 80 percent of the users running ‘graph.’” Eager players began to queue up behind the university’s computer terminals, and Kong became a fixture of campus life, the University of Washington’s equivalent of what Zork had once been at MIT. Along the way, it gradually evolved from an arcade game into something that required as much thought as reflexes; the levels just kept getting more and more complex.

According to Smith, it was his eight-year-old nephew who convinced him to port the game to the Apple II; having visited the computer lab once or twice and seen it in action there, the little boy was decidedly eager for a version he could play at home. “After he bugged me enough,” said Smith in a 1999 interview, “one weekend I rewrote it for the Apple II, basically in three days.” This first microcomputer version was a copy of the DEC VAX version right down to its monochrome ASCII graphics. Smith made just one big change: he renamed the game Miner to avoid legal entanglements. After paying James Bratsanos $1500 for the rights to the game, he submitted it to Brøderbund Software, only to get a terse rejection letter back: “Thank you for submitting your game concept. Unfortunately, it does not fit with our product line.”

But, seeing how popular the game continued to be at the university, Smith decided to take another stab at it. He borrowed enough money to buy a color monitor and joystick for his Apple II, and programmed a second, much-improved version with color bitmap graphics and controls that took advantage of one of the Apple’s unique affordances: its joysticks had two buttons rather than the standard one, which in this case allowed the player’s avatar to drill to the left or right of himself without the player ever having to reach for the keyboard. In late 1982, Smith sent this new version to four different publishers, among them Brøderbund and Sierra. All of them knew as soon as they saw this latest version of the game that they wanted it for themselves. John Williams, the little brother of Sierra founder Ken Williams, and the company’s chief financial officer from the tender age of twenty, later claimed that he “almost lost his job” because he spent so much time playing the game Smith sent to them. But Smith wouldn’t end up publishing his game through Sierra. Instead he wound up entrusting it to Brøderbund after all.

Founded and run as a family business by a personable former lawyer and real-estate developer named Doug Carlston, Brøderbund would consistently demonstrate an uncanny talent for identifying exactly the software product that Middle America was looking for at any given moment, securing it for themselves, and then delivering it to their customers in the most appealing possible way. (At the risk of sounding unkind, I might note at this juncture that, whereas Ken Williams loved to talk about the mainstreaming of games and other software, the Carlston family talked less but proved more adept at the practical work of doing so.) In the years to come, this talent would result in a quantity of truly iconic Brøderbund titles out of all proportion to the relatively modest number of products which the company released in total: titles like Karateka, Carmen Sandiego, Bank Street Writer, The Print Shop, Prince of Persia, SimCity, Myst. But before any of them came Doug Smith’s game.

Brøderbund offered Smith a $10,000 advance and a very generous 23-percent royalty. And they also promised to get behind his game with the kind of concerted, professional marketing push that was still a rarity in the industry of that era. Showing a remarkable degree of restraint for his age as well as faith in his game’s potential, Smith signed with Brøderbund rather than accept another publisher’s offer of $100,000 outright, with no royalty to follow. He would be amply rewarded for his foresight.

For example, it was Brøderbund’s savvy marketers who gave Miner its final name. Well aware of the existence of another, superficially similar platform game called Miner 2049er, they proposed the alternate title of Lode Runner, as in “running after the mother lode.” Soon after choosing this new name, which held fast to the idea that the player was some sort of miner, they devised a more detailed fictional context for the whole affair that abandoned that notion entirely. It involved the evil Bungeling Empire, the antagonist of their 1982 hit Choplifter!:

You are a galactic commando deep in enemy territory. Power-hungry leaders of the repressive Bungeling Empire have stolen a fortune in gold from the people by means of excessive fast-food taxes. Your task? To infiltrate each of 150 different treasure rooms, evade the deadly Bungeling guards, and recover every chest of Bungeling booty.

In the spirit of this narrative, the hero’s pickaxe became a laser drill.

Still, none of this background would be remembered by anyone who actually played the game. Instead the supposed Bungeling guards would become popularly known as “mad monks,” which their pudgy low-resolution shapes rather resembled. Doubtless plenty of imaginative young gamers made up new narratives of their own to fit the bizarre image of greedy monks chasing an intrepid adventurer up and down a maze of scaffolding dotted with gold.

Lode Runner on the Apple II.

Smith dropped out of university at the end of 1982, and worked closely with Brøderbund over the course of six months or so to polish his game in a concerted, methodical way, something that was seldom done at this early date. They helped him to tweak each of the 150 levels — some designed by Smith himself, some by the kids who lived around Smith’s family home, whom he paid out of his own pocket on a per-level basis — to a state of near-perfection, and arranged them all so that they steadily progressed in difficulty as you played through them one after another. And then Brøderbund encouraged Smith to polish up his level editor and include that as well.

Lode Runner got a rapturous reception upon its release in June of 1983, quickly becoming the best-selling product Brøderbund had ever released to that point; Smith was soon collecting more than $70,000 per month in royalties. If anything, its reputation among students of game design has become even more hallowed today. It stands out from its peers of 1983 like a young Glenn Gould in a beginner’s piano course.

That said, Lode Runner is not quite the sui generis game which its more enraptured devotees are sometimes tempted into claiming it to be. When James Bratsanos first created what would eventually become Lode Runner on the Commodore PET, he was according to his own testimony working from a friend’s description of an arcade game: “He didn’t explain it well, and I took creative liberties and assumed I understood what he meant. So for certain elements I completely misinterpreted it.” Bratsanos, an acknowledged non-gamer, may later have come to believe that the game his friend had been describing was Donkey Kong, and assumed that the major differences between that game and his stemmed from his youthful “misinterpretation” of his friend’s description of the former. But the chronology here doesn’t pass muster: Donkey Kong was first released in the summer of 1981, while Bratsanos is sure that he started working on his game, which originally went under the rather unpleasant name of Suicide, in 1980. Suicide became Kong only after Donkey Kong had been released and become an arcade sensation, and Bratsanos had started at the University of Washington the following fall.

So, what was it that his friend actually described to him back in 1980? The best candidate is Space Panic, a largely forgotten Japanese stand-up arcade game from that year which would seem to be the first-ever example of the evergreen genre that would become known as the platformer. Not only did Space Panic have you running and climbing your way through a vertical labyrinth, but it also allowed you to dig holes in it to trap your enemies, just like Suicide, Kong, and finally Lode Runner. Space Panic was not a commercial success, perhaps because it asked for too much too soon from an audience still enthused with simpler fare like Space Invaders; it was reported that the average session with it lasted all of 30 seconds. But it does appear that it entranced one anonymous teenage boy enough that he told his buddy James Bratsanos all about it. And from that random conversation — from that butterfly flapping its wings, one might say — eventually stemmed one of the biggest games of the 1980s.

Space Panic, the 1980 stand-up arcade game at the root of the Lode Runner family tree.

But if it isn’t quite an immaculate creation, Lode Runner is a brilliant one, a classic lesson in the way that fiendish complexity can arise out of deceptive simplicity in game design. It offers just six verbs — move left, right, up, or down; dig left or right — combined with an only slightly longer list of nouns — platforms of diggable brick or impenetrable metal, ladders, trap doors, overhead poles for shimmying, monks, treasures. And yet from this disarmingly short list of ingredients arises a well-nigh infinite buffet of devious possibility.

Although Lode Runner does retain some vestiges of its arcade inspirations in the form of a score and limited lives, it’s as much a puzzle or even a strategy game as an action game at heart. (Your lives are essentially meaningless in the end; you can save your progress at any point.) Playing each level entails first experimenting and dying — dying a lot — until you can devise a thoroughgoing plan for how to tackle it. Then, it’s just a matter of executing the plan perfectly; this is where the action elements come into play. The levels in Lode Runner are dynamic enough that getting through them doesn’t require stumbling across a single rote, set-piece solution envisioned by the designer; there’s space here for player creativity, space for variation, space for quick thinking that gets you out of an unanticipated jam — or that fails to do so just when you believe you’re on the brink of victory.

The levels build upon one another, each one training you for what’s still to come as it forces you to think about your limited menu of verbs and nouns in new ways. This sort of progressive design was not a hallmark of most computer games of 1983, and thus serves to make Lode Runner stand out all the more. The world would arguably have to wait until the release of DMA Design’s Lemmings in 1991 to play another action-puzzler that was its equal in terms of design.

Just as in Lemmings, every single detail of Lode Runner’s implementation becomes relevant as the levels become more complex, from the timing of events in the environment to the rudimentary but completely predictable artificial intelligence of the monks. Consider: the pits you drill are automatically filled in again after ten seconds, while monks climb out of pits into which they’ve fallen in just a few seconds. But what would happen if you could time things so that a pit is filled in while a monk is still inside it? The monk would get buried there permanently, that’s what, giving you a precious reprieve before the replacement who is spawned at the very top of the screen makes his way down to you once again. By the time you reach level 30 or so, you’ll be actively using the monks as your helpmates, taking advantage of the fact that they too like to pick up gold — for there’s now gold in places which you can’t reach, meaning you must depend on them to be your delivery men. Once one of them has what you need, you just need to make him fall into a pit, then walk on his head to steal the booty. Easy peasy, right? If you think so, don’t worry: there are still 120 levels to go, each one more insidiously intricate than the last.
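
The buried-monk trick boils down to a race between two timers. Here’s a toy sketch of the logic in Python — my own illustration, not anything from Smith’s code; the ten-second refill comes from the text above, while the exact climb-out delay is a guess:

```python
# A toy model of the pit mechanic described above -- my sketch, not
# Smith's code. A dug pit refills after ten seconds (per the text);
# a trapped monk climbs back out after a few seconds (the exact delay
# here is a hypothetical stand-in).

PIT_REFILL_SECS = 10.0   # from the text
MONK_CLIMB_SECS = 3.0    # assumed; "just a few seconds"

def monk_is_buried(pit_dug_at: float, monk_fell_in_at: float) -> bool:
    """The monk is buried for good if the pit refills before he escapes."""
    refill_time = pit_dug_at + PIT_REFILL_SECS
    escape_time = monk_fell_in_at + MONK_CLIMB_SECS
    return refill_time < escape_time

# Dig early, lure the monk in late: a pit dug at t=0 refills at t=10,
# but a monk who falls in at t=8 can't escape until t=11. Buried.
print(monk_is_buried(pit_dug_at=0.0, monk_fell_in_at=8.0))   # True
# Trap him immediately after digging: he's out at t=3, long before t=10.
print(monk_is_buried(pit_dug_at=0.0, monk_fell_in_at=0.0))   # False
```

Dig the pit well before luring a monk into it and the refill beats his escape; dig it at the last moment and he scrambles out.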

And then, when you’re done with all 150 levels, there’s still the level editor. Even by the standards of today, the original Apple II Lode Runner provides a lot of content. By the standards of 1983, its generosity was mind-boggling.

A phenomenal game by any standard, Lode Runner became a phenomenon of another sort in the months after its release. Doug Smith, a private, retiring fellow who loathed the spotlight, nevertheless became a household name among hardcore gamers, joining the likes of Bill Budge, Richard Garriott, and Nasir Gebelli as one of the last of the Apple II scene’s auteur-programmer stars. At a time when a major hit was a game that sold 50,000 copies, his game sold in the hundreds of thousands on the Apple II and in ports to the Commodore 64, the IBM PC, and virtually every other commercially viable computer platform under the sun. First it was the Apple II game of 1983; then it became the game of the year on each of the other platforms, in whatever year its port happened to arrive, collecting award after award almost by default. And then there was Japan.

Lode Runner appeared on the Macintosh soon after that machine’s release in 1984. Although the construction set was a natural fit for that machine’s GUI, the actual game proved less satisfying. “What used to be a struggle strictly between the commando and the Bungeling guards is now also a battle between you and the [mouse] pointer,” wrote Macworld magazine. Such complaints would become something of a theme: Apple II purists insist to this day that no Lode Runner has ever played quite as well as the one that Doug Smith personally programmed for their favorite platform.

One of Doug Carlston’s smartest moves in the early days of Brøderbund was to forge links with the burgeoning software and gaming scene in Japan. He was particularly chummy with Yuji Kudo, the founder of Hudson Soft, Japan’s biggest software publisher of all. (A model-train enthusiast extraordinaire, Kudo took his company’s very un-Japanese name from his favorite type of steam locomotive.) The two men already had a deal in place to bring Lode Runner to Japan even before it was released in the United States. During the summer of 1983, it became one of the first ten games to be made available for the Nintendo Famicom — the videogame console that would later conquer the world as the Nintendo Entertainment System.

Like Wizardry before it and Populous after it, Lode Runner turned into that rarest of birds, a Western videogame which the Japanese embraced with all the fannish obsessiveness of which they’re capable — which is, to be clear, a lot of obsessiveness indeed. Before there was Super Mario Bros. to drive sales of Nintendo consoles all over the world, there was Lode Runner to get the ball rolling in Japan itself. Sales of the game in Japan alone topped 1 million in the first eighteen months, prompting one journalist to declare Lode Runner Japan’s new “national pastime.”

The country’s Lode Runner mania reached its peak in the summer of 1985, when Hudson Soft, Brøderbund, and Sony joined forces to sponsor a national competition in the game. Of the 3700 players between the ages of nine and fourteen who entered the competition, 50 became finalists, invited to come to Tokyo and play the game on what was at that time the largest video screen in the world, 86 feet in width. A slightly uncomfortable-looking Doug Smith, coaxed into the spotlight by Brøderbund’s marketers, presided over the affair and even agreed to join the competition. (He didn’t last very long.) “I like the people of Japan,” he said. “There’s an honesty among the people that is so refreshing — they would never think of pirating computer games, for instance.” (A more likely explanation for Lode Runner‘s high sales in Japan than the people’s innate honesty was, of course, the fact that piracy on the cartridge-based Famicom was a possibility for only the most technically adept.)

Lode Runner running on the biggest of all big screens in Japan.

A rare shot of Doug Smith in person, giving prizes to the winners of the Japanese Lode Runner competition.

By decade’s end, Lode Runner’s worldwide sales had topped 2.5 million copies. I can hardly emphasize enough what absurdly high figures these are for a game first sold on the humble Apple II.

When you take Brøderbund’s generous royalty and combine it with sales like this, then reckon in the fact that Lode Runner was essentially a one-man production, you wind up with one very wealthy young game programmer. Still in his early twenties, Doug Smith found himself in the enviable position of never having to work another day in his life. He bought, according to his friend Rick LaMont, “a Porsche 911 Carrera, a Bayliner speedboat, and a house in Issaquah.”

In the face of distractions like these, Doug Smith became one of a number of early Apple II auteurs, such as the aforementioned Bill Budge and Nasir Gebelli, who weren’t able to sustain their creative momentum as lone-wolf developers became teams and the title of game designer slowly separated itself from that of game programmer. He did provide Brøderbund with one Lode Runner sequel of a sort: Championship Lode Runner, with 50 new levels that had mostly been sent to the company by fans and that were (correctly) advertised as picking up in difficulty right where the first game had left off. But its technology and graphics were barely tweaked, and the decision to aim it exclusively at the hardest-core of the hardcore put a natural limit on its appeal.

After that, there followed several years of silence from Smith, off enjoying his riches and pondering the strange course his life had taken, from starving student to wealthy man of leisure in a matter of months. And truly, his is a story that could only have happened at this one brief window in time, when videogames had become popular enough to sell in the millions but could still be made by a single person.

Just as they did with Wizardry, the impatient Japanese soon took Lode Runner into their own hands, making and releasing a string of sequels in their country that would never appear elsewhere. But what ought to have been a natural ongoing franchise remained oddly under-served by Brøderbund in its country of origin; they released only one more under-realized, under-promoted sequel, for the Commodore 64 and Atari 8-bit line only, created by their recently purchased subsidiary Synapse Software without Smith’s involvement. Perhaps they were just too busy turning all those other products into icons of their era.

It wasn’t until 1994, when Brøderbund’s ten-year option expired and all rights to the game and its trademarks reverted to Smith, that anyone attempted a full-fledged revival in the United States. Irony of ironies, the company behind said revival was Sierra, finally getting their chance with a game that had slipped through their fingers a decade before. The project was driven by Jeff Tunnell, the founder of what was now the Sierra subsidiary known as Dynamix, who had just made the classic puzzler The Incredible Machine.

Lode Runner: The Legend Returns was a symbol of everything that was right and wrong with the games industry of the mid-1990s. Dynamix added beautiful hand-painted backgrounds and a stereo soundtrack to the old formula, but in the minds of many the new version just didn’t play as well as the old; it had something to do with the timing, something to do with the unavoidably different feel of a 1990s 32-bit computer game versus the vintage 8-bit variety — and perhaps something to do as well with Tunnell’s decision to add a lot more surface complexity to the elegantly simple mix of the original, including locks and keys, snares, gas traps, bombs, jackhammers, buckets of goo, and even light and darkness. The Legend Returns did reasonably well for Sierra, but never became the phenomenon that the original had been in its home country. And as for Japan… well, it now preferred homegrown platformers that featured a certain Italian plumber. The various revivals since have generally met the same fate: polite interest, decent sales, but no return to the full-blown Lode Runner mania of the 1980s.

Lode Runner: The Legend Returns definitely looks a lot more impressive than the original, which was far from an audiovisual wonder even in its own time. Opinions are at best divided, however, on whether it plays better. One can detect the influence of Lemmings 2: The Tribes in its diverse, ever-shifting collection of obstacles and affordances, but the end result is somehow less compelling.

Smith did return to playing an active role in the games industry in the 1990s, working as the producer of a couple of Nintendo games among other things. He disappeared from view once again after the millennium, occupying himself mostly with the raising of his five children. He died by suicide in 2014 at the age of 53.

(Sources: the book Software People: Inside the Computer Business by Douglas G. Carlston; Retro Gamer 111; Ahoy! of April 1986; A.N.A.L.O.G. of March 1984; Computer Gaming World of January/February 1983, October 1983, and March 1986; Electronic Games of June 1983 and January 1985; inCider of April 1984; InfoWorld of October 31 1984; Macworld of August 1985; MicroTimes of December 1984 and September 1985; Brøderbund News of Fall 1985; InterAction of Fall 1994. Online sources include IGN‘s 1999 interview with Doug Smith, Jeremy Parish’s eulogy to Smith, and a 1991 Usenet reminiscence by Rick LaMont.

Feel free to download the original Lode Runner and its manual for play in the Apple II emulator of your choice.)

 

How Jordan Mechner Made a Different Sort of Interactive Movie (or, The Virtues of Restraint)

One can learn much about the state of computer gaming in any given period by looking to the metaphors its practitioners are embracing. In the early 1980s, when interfaces were entirely textual and graphics crude or nonexistent, text adventures like those of Infocom were heralded as the vanguard of a new interactive literature destined to augment or entirely supersede non-interactive books. That idea peaked with the mid-decade bookware boom, when just about every entertainment-software publisher (and a few traditional book publishers) rushed to sign established authors and books to interactive projects. It then proceeded to collapse just as quickly under the weight of its own self-importance when the games proved less compelling and the public less interested than anticipated.

Prompted by new machines like the Commodore Amiga with their spectacular graphics and sound, the industry reacted to that failure by turning to the movies for media mentorship. This relationship would prove more long-lasting. By the end of the 1980s, companies like Cinemaware and Sierra were looking forward confidently to a blending of Hollywood and Silicon Valley that they believed might just replace the conventional non-interactive movie, not to mention computer games as people had known them to that point. Soon most of the major publishers would be conducting casting calls and hiring sound stages, trying literally to make games out of films. It was an approach fraught with problems — problems that were only slowly and grudgingly acknowledged by these would-be unifiers of Southern and Northern Californian entertainment. Before it ran its course, it spawned lots of really terrible games (and, it must be admitted, against all the odds the occasional good one as well).

Given the game industry’s growing fixation on the movies as the clock wound down on the 1980s, Jordan Mechner would seem the perfect man for the age. Struggling with the blessing or curse of an equally abiding love for both mediums, his professional life had already been marked by constant vacillation between movies and games. Inevitably, his love of film influenced him even when he was making games. But, perhaps because that love was so deep and genuine, he accomplished the blending in a more even-handed, organic way than would most of the multi-CD, multi-gigabyte interactive movies that would soon be cluttering store shelves. Mechner’s most famous game, by contrast, filled just two Apple II disk sides — less than 300 K in total. And yet the cinematic techniques it employs have far more in common with those found in the games of today than do those of its more literal-minded rivals.


 

As a boy growing up in the wealthy hamlet of Chappaqua, New York, Jordan Mechner dreamed of becoming “a writer, animator, or filmmaker.” But those ambitions got modified if not discarded when he discovered computers at his high school. Soon after, he got his hands on his own Apple II for the first time. Honing his chops as a programmer, he started contributing occasional columns on BASIC to Creative Computing magazine at the age of just 14. Yet fun as it was to be the magazine’s youngest contributor, his real reason for learning programming was always to make games. “Games were the only kind of software I knew,” he says. “They were the only kind that I enjoyed. At that time, I didn’t really see any use for a word processor or a spreadsheet.” He fell into the throes of what he describes as an “obsession” to get a game of his own published.

Initially, he did what lots of other game programmers were doing at the time: cloning the big standup-arcade hits for fun and (hopefully) profit. He made a letter-perfect copy of Atari’s Asteroids, changed the titular space rocks to bright bouncing balls in the interest of plausible deniability, and sent the resulting Deathbounce off to Brøderbund for consideration; what with Brøderbund having been largely built on the back of Apple Galaxian, an arcade clone which made no effort whatsoever to conceal its source material, the publisher seemed a very logical choice. But Doug Carlston was now trying to distance his company from such fare for reasons of reputation as well as his fear of Atari’s increasingly aggressive legal threats. Nice guy that he was, he called Mechner personally to explain why Deathbounce wasn’t for Brøderbund. He promised to send Mechner a free copy of Brøderbund’s latest hit, Choplifter, suggesting he think about whether he might be able to apply the programming chops he had demonstrated in Deathbounce to a more original game, as Choplifter‘s creator Dan Gorlin had done. Mechner remembers the conversation as well-nigh life-changing. He had been so immersed in the programming side of making games that the idea of doing an original design had never really occurred to him before: “I didn’t have to copy someone else’s arcade game. I was allowed to design my own!”

Carlston’s phone call came in May of 1982, when Mechner was finishing up his first year at Yale University; undecided about his major, as he was about so much else in his life at the time, he would eventually wind up with a bachelor’s degree in psychology. We’re granted an unusually candid and personal glimpse into his life between 1982 and 1993 thanks to his private journals, which he published (doubtless in a somewhat expurgated form) in 2012. The early years paint a picture of a bright, sensitive young man born into a certain privilege that carries with it the luxury of putting off adulthood for quite some time. He romanticizes chance encounters (“I saw a heartbreakingly beautiful young blonde out of the corner of my eye. She was wearing a blue down vest. As she passed, our eyes met. She smiled at me. As I went out I held the door for her; her fingers grazed mine. Then she was gone.”); frets frequently about cutting classes and generally not being the man he ought to be (“I think Ben is the only person who truly comprehends the depths of how little classwork I do.”); alternates between grand plans accompanied by frenzies of activity and indecision accompanied by long days of utter sloth (“Here’s what I do do: listen to music. Browse in record stores. Read newspapers, magazines, play computer games, stare out the windows. See a lot of movies.”); muses with all the self-obliviousness of youth on whether he would prefer “writing a bestselling novel or directing a blockbusting film,” as if attaining fame and fortune were as simple as deciding on one or the other.

At Yale, film, that other constant of his creative life, came to the fore. He joined every film society he stumbled upon, signed up for every film-studies course in the catalog, and set about “trying to see in four years every film ever made”; Akira Kurosawa’s classic adventure epic Seven Samurai (a major inspiration behind Star Wars among other things) emerged as his favorite of them all. He also discovered an unexpected affinity for silent cinema, which naturally led him to compare that earliest era of film with the current state of computer games, a medium that seemed in a similar state of promising creative infancy. All of this, combined with the example of Choplifter and the karate lessons he was sporadically attending, led to Karateka, the belated fruition of his obsession with getting a game published.

To a surprising degree given his youth and naivete, Mechner consciously designed Karateka as the proverbial Next Big Thing in action games after the first wave of simple quarter munchers, whose market he watched collapse over the two-plus years he spent intermittently working on it. Plenty of fighting games had appeared on the Apple II and other platforms before, some of them very playable; Mechner wasn’t sure he could really improve on their templates when it came to pure game play. What he could do, however, was give his game some of the feel and emotional resonance of cinema. Reasoning that computer games were technically on par with the first decade or two of film in terms of the storytelling tools at his disposal, he mimicked the great silent-film directors in building his story out of the broadest archetypal elements: an unnamed hero must assault a mountain fortress to rescue an abducted princess, fighting through wave after wave of enemies, culminating in a showdown with the villain himself. He energetically cross-cut the interactive fighting sequences with non-interactive scenes of the villain issuing orders to his minions while the princess looks around nervously in her cell — a suspense-building technique from cinema dating back to The Birth of a Nation. He mimicked the horizontal wipes Kurosawa used for transitions in Seven Samurai; mimicked the scrolling textual prologue from Star Wars. When the player lost or won, he printed “THE END” on the screen in lieu of “GAME OVER.” And, indeed, he made it possible, although certainly not easy, to win Karateka and carry the princess off into the sunset. The player was, in other words, playing for bigger stakes than a new high score.

Karateka

The most technically innovative aspect of Karateka — suggested, like much in the game, by Mechner’s very supportive father — involved the actual people on the screen. To make his fighters move as realistically as possible, Mechner made use for the first time in a computer game of an old cartoon-animation technique known as rotoscoping. After shooting some film footage of his karate instructor in action, doing various kicks and punches, Mechner used an ancient Moviola editing machine that had somehow wound up in the basement of the family home to isolate and make prints out of every third frame. He imported the figure at the center of each print into his Apple II by tracing it on a contraption called the VersaWriter. Flipped through in sequence, the resulting sprites appeared to “move” in an unusually fluid and realistic fashion. “When I saw that sketchy little figure walk across the screen,” he wrote in his journal, “looking just like Dennis [his karate instructor], all I could say was ‘ALL RIGHT!’ It was a glorious moment.”

Karateka

Doug Carlston, who clearly saw something special in this earnest kid, was gently encouraging and almost infinitely patient with him. When it looked like Mechner had come up with something potentially great at last, Carlston signed him to a contract and flew him out to California in the summer of 1984 to finish it up with the help of Brøderbund’s in-house staff. Released just a little too late to fully capitalize on the 1984 Christmas rush, Karateka started slowly but gradually turned into a hit, especially once the Commodore 64 port dropped in June of 1985. Once ported to Nintendo for the domestic Japanese market, it proceeded to sell many hundreds of thousands of units, making Jordan Mechner a very flush young man indeed.

So Mechner, about to graduate somehow despite all the assignments missed and classes cut in favor of working on Karateka, seemed poised for a fruitful career making games. Yet he continued to vacillate between his twin obsessions. Even as his game, the most significant accomplishment of his young life and one of which anyone could justly be proud, had entered the homestretch, he had written that “I definitely want my next project to be film-related. Videogames have taken up enough of my time for now.” In the wake of his game’s release, the steady stream of royalties therefrom only made it easier to dabble in film.

Mechner spent much of the year after graduating from university back at home in Chappaqua working on his first screenplay. In between writing dialog and wracking himself with doubt over whether he really wanted to do another game at all, he occasionally turned his attention to the idea of a successor to Karateka. Already during that first summer after Yale, he and Gene Portwood, a Brøderbund executive, dreamed up a scenario for just such a beast: an Arabian Nights-inspired story involving an evil sultan, a kidnapped princess, and a young man — the player, naturally — who must rescue her. Karateka in Middle Eastern clothing though it may have been in terms of plot, that was hardly considered a drawback by Brøderbund, given the success of Mechner’s first game.

Seven frames of animation ready to be photocopied and digitized.


Determined to improve upon the rotoscoping of Karateka, Mechner came up with a plan to film a moving figure and use a digitizer to capture the frames into the computer, rather than tracing the figure using the VersaWriter. He spent $2500 on a high-end VCR and video camera that fall, knowing he would return them before his month’s grace period was out (“I feel so dishonest,” he wrote in his journal). The technique he had in the works may have been an improvement over what he had done for Karateka, but it was still very primitive and hugely labor-intensive. After shooting his video, he would play it back on the VCR, pausing it on each frame he wanted to capture. Then he would take a picture of the screen using an ordinary still camera and get the film developed. The next step was to trace the outline of the figure in the photograph using Magic Marker and fill him in using White-Out. Then he would Xerox the doctored photograph to get a black-and-white version with a very clear silhouette of the figure. Finally, he would digitize the photocopy to import it into his Apple II, and erase everything around the figure by hand on the computer to create a single frame of sprite animation. He would then get to go through this process a few hundred more times to get the prince’s full repertoire of movements down.
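
Mechner’s Magic Marker/White-Out/Xerox chain amounted to a manual thresholding pass: reduce each photograph to a stark silhouette before digitizing. For comparison, here is roughly the same step performed today — a hedged sketch using Python’s Pillow imaging library, with placeholder file names and an arbitrary threshold:

```python
# A modern analogue of that manual silhouette pass: flatten a frame
# photograph into clean black-and-white, the job once done with Magic
# Marker, White-Out, and a Xerox machine. Sketch only; "frame_042.png"
# and the threshold value are placeholders, not anything from 1985.

from PIL import Image

THRESHOLD = 96  # tune per photo; Mechner "tuned" by hand instead

frame = Image.open("frame_042.png").convert("L")   # load as grayscale
silhouette = frame.point(lambda v: 255 if v > THRESHOLD else 0, mode="1")
silhouette.save("sprite_042.png")                  # one frame of animation
```

What took Mechner a darkroom, a photocopier, and an afternoon per batch of frames is now a one-line transform — which only underlines how labor-intensive the 1985 pipeline was.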


On October 20, 1985, Jordan Mechner did his first concrete work on the game that would become Prince of Persia, using his ill-gotten video camera to film his 16-year-old brother David running and jumping through a local parking lot. When he finally got around to buying a primitive black-and-white image digitizer for his trusty Apple II more than six months later, he quickly determined that the footage he’d shot was useless due to poor color separation. Nevertheless, he saw potential magic.

I still think this can work. The key is not to clean up the frames too much. The figure will be tiny and messy and look like crap… but I have faith that, when the frames are run in sequence at 15 fps, it’ll create an illusion of life that’s more amazing than anything that’s ever been seen on an Apple II screen. The little guy will be wiggling and jiggling like a Ralph Bakshi rotoscope job… but he’ll be alive. He’ll be this little shimmering beacon of life in the static Apple-graphics Persian world I’ll build for him to run around in.

For months after that burst of enthusiasm, however, he did little more with the game.

At last in September of 1986, having sent his screenplay off to Hollywood and thus with nothing more to do on that front but wait, Mechner moved out to San Rafael, California, close to Brøderbund’s offices, determined to start in earnest on Prince of Persia. He spent much time over the next few months refining his animation technique, until by Christmas everyone who saw the little running and jumping figure was “bowled over” by him. Yet after that progress again slowed to a crawl, as he struggled to motivate himself to turn his animation demos into an actual game.

And then, on May 4, 1987, came the phone call that would stop the little running prince in his tracks for the better part of a year. A real Hollywood agent called to tell him she “loved” his script for Birthstone, a Spielbergian supernatural comedy/thriller along the lines of Gremlins or The Goonies. Within days of her call, the script was optioned by Larry Turman, a major producer with films like The Graduate on his resume. For months Mechner fielded phone calls from a diverse cast of characters with a diverse cast of suggestions, did endless rewrites, and tried to play the Hollywood game, schmoozing and negotiating and trying not to appear to be the awkward, unworldly kid he still largely was. Only when Birthstone seemed permanently stuck in development hell — “Hollywood’s the only town where you can die of encouragement,” he says wryly, quoting Pauline Kael — did he give up and turn his attention back to games. Mechner notes today that just getting as far as he did with his very first script was a huge achievement and a great start in itself. After all, he was, if not quite hobnobbing with the Hollywood elite, at least getting rejection letters from such people as Michael Apted, Michael Crichton, and Henry Winkler; such people were reading his script. But he had been spoiled by the success of Karateka. If he wrote another screenplay, there was no guarantee it would get even as far as his first had. If he finished Prince of Persia, on the other hand, he knew Brøderbund would publish it.

And so, in 1988, it was back to games, back to Prince of Persia. Inspired by “puzzly” 8-bit action games like Doug Smith’s Lode Runner and Ed Hobbs’s The Castles of Dr. Creep, his second game was shaping up to be more than just a game of combat. Instead his prince would have to make his way through area after area full of tricks, traps, and perilous drops. “What I wanted to do with Prince of Persia,” Mechner says, “was a game which would have that kind of logical, head-scratching, fast-action, Lode Runner-esque puzzles in a level-based game but also have a story and a character that was trying to accomplish a recognizable human goal, like save a princess. I was trying to merge those two things.” Ideally, the game would play like the iconic first ten minutes of Raiders of the Lost Ark, in which Indiana Jones runs and leaps and dodges and sometimes outwits rather than merely outruns a series of traps. For a long while, Mechner planned to make the hero entirely defenseless, as a sort of commentary on the needless ultra-violence found in so many other games. In the end, he didn’t go that far — the allure of sword-fighting, not to mention commercial considerations, proved too strong — but Prince of Persia was nevertheless shaping up to be a far more ambitious, multi-faceted work than Karateka, boasting much more than just improved running and jumping animations.

With just 128 K of memory to work with on the Apple II, Mechner was forced to make Prince of Persia a modular design, relying on a handful of elements which are repeatedly reused and recombined. Take, for instance, the case of the loose floorboards. The first time they appear, they’re a simple trap: you have to jump over a section of the floor to avoid falling into a pit. Later, they appear on the ceiling, as part of the floor above your own; caught in an apparent cul de sac, you have to jump up and bash the ceiling to open an escape route. Still later, they can be used strategically: to kill guards below you by dropping the floorboards on their heads, or to hold down a pressure plate below you that opens a door on the level on which you’re currently standing. It’s a fine example of a constraint in game design turning into a strength. “There’s a certain elegance to taking an element the player is already familiar with,” says Mechner, “and challenging him to think about it in a different way.”
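
To make the idea concrete, here is one way such context-dependent reuse might look in code. This is purely my illustration — not Mechner’s actual implementation, which was written in 6502 assembly — showing a single loose-floorboard element whose one behavior, detaching and falling, produces a trap, an escape route, a weapon, or a door-opener depending on context:

```python
# Purely illustrative (not Mechner's code): one modular element, several
# context-dependent consequences. A loose floorboard does exactly one
# thing -- it detaches and falls -- but what that means depends on what
# triggered it and what lies underneath.

class Guard:
    def __init__(self):
        self.alive = True

    def take_hit(self):
        self.alive = False          # crushed by a falling board

class PressurePlate:
    def __init__(self):
        self.held_down = False      # plates open doors while held down

    def press(self):
        self.held_down = True       # a dropped board works as a weight

class LooseFloor:
    def __init__(self):
        self.attached = True

    def on_stepped_on(self, below=None):
        self.detach(below)          # trap: the floor gives way underfoot

    def on_bumped_from_below(self, below=None):
        self.detach(below)          # escape hatch: bash open the ceiling

    def detach(self, below):
        if not self.attached:
            return
        self.attached = False       # the board falls...
        if isinstance(below, Guard):
            below.take_hit()        # ...onto a guard's head, or
        elif isinstance(below, PressurePlate):
            below.press()           # ...onto a plate, holding it down

# The same element, reused three ways:
guard, plate = Guard(), PressurePlate()
LooseFloor().on_stepped_on()                    # a simple pit trap
LooseFloor().on_stepped_on(below=guard)         # a weapon
LooseFloor().on_bumped_from_below(below=plate)  # a key to a door
print(guard.alive, plate.held_down)             # -> False True
```

One element, four uses: exactly the economy that 128 K of memory forced, and that Mechner turned into a design virtue.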


On July 14, 1989, Mechner shot the final footage for Prince of Persia: the denouement, showing the prince — now played by the game’s project manager at Brøderbund, Brian Ehler — embracing the rescued princess — played by Tina LaDeau, the 18-year-old daughter of another Brøderbund employee, in her prom dress. (“Man, she is a fox,” Mechner wrote in his journal. “Brian couldn’t stop blushing when I had her embrace him.”)

The game shipped for the Apple II on October 6, 1989. And then, despite a very positive review in Computer Gaming World — Charles Ardai called it nothing less than “the Star Wars of its field,” music to the ears of a movie buff like Mechner — it proceeded to sell barely at all: perhaps 500 units a month. It was, everyone at Brøderbund agreed, at least a year too late to hope to sell significant numbers of a game like this on the Apple II, whose only remaining commercial strength was educational software, thanks to the sheer number of the things still installed in American schools. Mechner’s procrastination and vacillation had spoiled this version’s commercial prospects entirely.

Thankfully, the Apple II version wasn’t to be the only one. Brøderbund already had programmers and artists working on ports to MS-DOS and the Amiga, the last two truly viable computer-gaming platforms in North America. Mechner as well turned his attention to the versions for these more advanced machines as soon as the Apple II version was finished. And once again his father pitched in, composing a lovely score for the luxuriously sophisticated sound hardware now at the game’s disposal. “This is going to be the definitive version of Prince of Persia,” Mechner enthused over the MS-DOS version. “With VGA [graphics] and sound card, on a fast machine, it’ll blow the Apple away. It looks like a Disney film. It’s the most beautiful game I’ve ever seen.” Reworked though they were in almost all particulars, at the heart of the new versions lay the same digitized film footage that had made the 8-bit prince run and leap so fluidly.

Prince of Persia

And yet, after it shipped on April 19, 1990, the MS-DOS version also disappointed. Mechner chafed at his publisher’s indifference toward promoting the game; they seemed on the verge of writing it off, noting how the vastly superior MS-DOS version was being regarded as just another port of an old 8-bit game, and thus would likely never be given a fair shake by press or public. True as ever to the bifurcated pattern of his life, he decided to turn back to film. Having tried and failed to get into New York University film school, he resorted to working as a production assistant in movies by way of supporting himself and trying to drum up contacts in the film-making community of New York. Thus the first anniversary of Prince of Persia’s original release on the Apple II found him schlepping crates around New York City. His career as a game developer seemed to be behind him, and truth be told his prospects as a filmmaker didn’t look a whole lot brighter.

The situation began to reverse itself only after the Amiga version was finished — programmed, as it happened, by Dan Gorlin, the very fellow whose Choplifter had first inspired Mechner to look at his own games differently. In Europe, the Amiga’s stronghold, Prince of Persia was free of the baggage which it carried in North America — few in Europe had much idea of what an Apple II even was — and doubtless benefited from a much deeper and richer tradition on European computers of action-adventures and platform puzzlers. It received ebullient reviews and turned into a big hit on European Amigas, and its reputation gradually leaked back across the pond to turn it at last into a hit in its homeland as well. Thus did Prince of Persia become a slow grower of an international sensation — a very unusual phenomenon in the hits-driven world of videogames, where shelf lives are usually short and retailer patience shorter. Soon came the console releases, along with releases for various other European and Japanese domestic computers, sending total sales soaring to over 2 million units.

By the beginning of 1992, Mechner was far removed from his plight of just eighteen months before. He was drowning in royalties, consulting intermittently with Brøderbund on a Prince of Persia 2 — it was understood that his days in the programming trenches were behind him — and living a globetrotting lifestyle, jaunting from Paris to San Rafael to Madrid to New York as whim and business took him. He was also planning his first film, a short documentary to be shot in Cuba, and already beginning to mull over what would turn into his most ambitious and fascinating game production of all, known at this point only as “the train game.”

Prince of Persia, which despite the merits of that eventual “train game” is and will likely always remain Mechner’s signature work, strikes me most of all as a triumph of presentation. The actual game play is punishingly difficult. Each of its twelve levels is essentially an elaborate puzzle that can only be worked out by dying many times, when you’re not getting trapped in one of its far too many dead ends. Even once you think you have it all worked out, you still need to execute every step with perfect precision, no mean feat in itself. Messing up at any point in the process means starting that level over again from the beginning. And, because you only have one hour of real time to rescue the princess, every failure is extremely costly; a perfect playthrough, accomplished with absolute surety and no hesitations, takes about half an hour, leaving precious little margin for error. At least there is a “save” feature that will let you bookmark each level starting with the third, so you don’t have to replay the whole game every time you screw up — which, believe me, you will, hundreds if not thousands of times before you finally rescue the princess. Beating Prince of Persia fair and square is a project for a summer vacation in those long-gone adolescent days when responsibilities were few and distractions fewer. As a busy adult, I find it too repetitive and too reliant on rote patterns, as well as — let’s be honest here — just too demanding on my aging reflexes. In short, the effort-to-reward ratio strikes me as way out of whack. Of course, I’m sure that, given Prince of Persia’s status as a beloved icon of gaming, many of you have a different opinion.

So, let’s turn back to something on which we can hopefully all agree: the brilliance of that aforementioned presentation, which brings to aesthetic maturity many of the techniques Mechner had first begun to experiment with in Karateka. Where Mechner used filmed footage as a tool for achieving fluid, lifelike motion, the games of the years immediately following Prince of Persia would be plastered with jarring chunks of poorly acted, poorly staged “full-motion video.” Such spectacles look far more dated today than the restrained minimalism of Prince of Persia. The industry as a whole would take years to wind up back at the place where Jordan Mechner had started: appropriating some of the language of cinema in the service of telling a story and building drama, without trying to turn games into literal interactive movies. Mechner:

Just as theater is its own thing — with its own conventions, things that it does well, things it does badly — so is film, and so [are] computer games. And there is a way to borrow from one medium to another, and in fact that’s what an all-new medium does when it’s first starting out. Film, when it was new, looked like someone set up a camera front and center and filmed a staged play. Then the things that are specific to film — like the moving camera, close-ups, reaction shots, dissolves — all these kinds of things became part of the language of cinema. It’s the same with computer games. To take a long film sequence and to play that on your TV screen is the bad way to make a game cinematic. The computer game is not a VCR. But if you can borrow from the knowledge that we all carry inside our heads of how cuts work, how reaction shots work, what a low angle means dramatically, what it means when the camera suddenly pulls back… We’ve got this whole collective unconscious of the vocabulary of film, and that’s a tremendously valuable tool to bring into computer gaming.

In a medium that has always struggled to tamp down its instinct toward aesthetic maximalism, Mechner’s games still stand out for their concern with balance and proportion. Mechner again:

Visuals are [a] component where it’s often tempting to compromise. You think, “Well, we could put a menu bar across here, we could put a number in the upper right-hand corner of the screen representing how many potions you’ve drunk,” or something. The easy solution is always to do something that as a side effect is going to make the game look ugly. So I took as one of the ground rules going in that the overall screen layout had to be pleasing, had to be strong and simple. So that somebody who was not playing the game but who walked into the room and saw someone else playing it would be struck by a pleasing composition and could stop to watch for a minute, thinking, “This looks good, this looks as if I’m watching a movie.” It really forces you as a designer to struggle to find the best solution for things like inventory. You can’t take the first solution that suggests itself, you have to try to solve it within the constraints you set yourself.

Mechner’s take on visual aesthetics can be seen as a subversion of Ken Williams’s old “ten-foot rule,” which, as you might remember, stated that every Sierra game ought to be visually arresting enough to make someone say “Wow!” when glimpsing it from ten feet away across a crowded shop. Mechner believed that game visuals ought to be more than just striking; they ought to be aesthetically good by the more refined standards of film and the other, even older visual arts. All that time Mechner spent obsessing over films and film-making, which could all too easily be labeled a complete waste of time, actually allowed him to bring something unique to the table, something that made him different from virtually all of his many contemporaries in the interactive-movie business.

There are various ways to situate Jordan Mechner’s work in general and Prince of Persia in particular within the context of gaming history. It can be read as the last great swan song of the Apple II and, indeed, of the entire era of 8-bit computer gaming, at least in North America. It can be read as yet one more example of Brøderbund’s downright bizarre commercial Midas touch, which continued to yield a staggering number of hits from a decidedly modest roster of new releases (Brøderbund also released SimCity in 1989, thus spawning two of the most iconic franchises in gaming history within bare months of one another). It can be read as the precursor to countless cinematic action-adventures and platformers to come, many of whose designers would acknowledge it as a direct influence. In its elegant simplicity, it can even be read as a fascinating outlier from the high-concept complexity that would come to dominate American computer gaming in the very early 1990s. But the reading that makes me happiest is to simply say that Prince of Persia showed how less can be more.

(Sources: Game Design Theory and Practice by Richard Rouse III; The Making of Karateka and The Making of Prince of Persia by Jordan Mechner; Creative Computing of March 1979, September 1979, and May 1980; Next Generation of May 1998; Computer Gaming World of December 1989; Jordan Mechner’s Prince of Persia postmortem from the 2011 Game Developers Conference; “Jordan Mechner: The Man Who Would Be Prince” from Games™; the Jordan Mechner and Brøderbund archives at the Strong Museum of Play.)

 
 


Generation Nintendo

Nintendo

In the final months of World War II, when the United States was trying to burn out the will of a starving Japan via the most sustained campaign of aerial incendiary bombardment in history, a handful of obvious targets remained strangely untouched. Among those targets was Kyoto: population 1 million plus, founded in the year 794, capital of the nation and home of the Emperor for most of the intervening centuries, home to more national shrines and other historic sites than any other city in Japan, world famous for its silk and cloisonné. If a single city can be said to embody the very soul of the Japanese people, it must be this one.

If the citizens of Kyoto believed that their city was being left untouched by the bombs raining down on the rest of the country out of respect for the special place it occupied in the Japanese psyche, they were partially correct. Yet the motivation behind their seeming good fortune was cold-blooded rather than humanitarian. American Air Force planners were indeed aware of Kyoto’s symbolic importance, but they hardly saw that importance as grounds for sparing the city. Far from it. Kyoto was being reserved as a target for a special new weapon, one which was referred to only obliquely in Air Force internal memoranda as “the gadget.” Today we know the gadget as the atomic bomb. Entirely destroying Kyoto with one bomb would deliver a shock to the rest of Japan unequaled by the destruction of any other possible target: “From the psychological point of view there is the advantage that Kyoto is an intellectual center for Japan and the people there are more apt to appreciate the significance of such a weapon as the gadget.” Kyoto had to be left untouched while the gadget was made ready for service so that mission planners and scientists could properly evaluate the bomb’s effect on an undamaged clean slate of a target.

Hundreds of thousands of Kyoto residents would wind up owing their lives to Henry L. Stimson, a humane man tortured daily by the orders he had to issue as the American Secretary of War; never was there a Secretary of War who hated war more. In response to Stimson’s demand to see the Air Force’s list of planned targets as the gadget neared readiness, General Leslie Groves, head of the Manhattan Project, reluctantly presented it to him, with Kyoto at the top. Stimson was horrified. Citing the proposed destruction of Kyoto as an unforgivable act from which Japan would never recover, Stimson, 77 years old and in poor health, faced down virtually the entire entrenched bureaucracy of the American military to demand that the first atomic bomb to be used in anger be dropped somewhere, anywhere else: “This is one time I’m going to be the final deciding authority. Nobody’s going to tell me what to do on this.” His stubborn stance resulted at last in Kyoto being stricken from the list by grumbling generals who would have been perfectly happy if its destruction really had been a death blow to the culture it symbolized, thank you very much. Of course, in saving hundreds of thousands of Kyoto residents Stimson was also consigning to death hundreds of thousands of others in Hiroshima. Such are the wages of war.

The decision to spare Kyoto had another unintended consequence, one which may seem trivial — even disrespectful — to mention in parallel with such immense tolls in human lives saved and lost, but one which in its own way illustrates the interconnectedness of all things. Hidden away within Kyoto’s blissfully undamaged warren of ancient streets was a little family-owned company called Nintendo, maker of ornate playing cards and other games and collectibles. Absolutely dedicated to the war effort, as all good Japanese were expected to be at the time, they had lately taken to giving their products jingoist themes, such as a backgammon board illustrated with cartoon animals dressed up as soldiers, with Japanese flags flying proudly above them and British and American flags lying crumpled in the dust at their feet.

More than four decades later, Stimson’s determination to spare Kyoto and with it Nintendo boomeranged back on his country in a way that no one could have seen coming. Many contemporary commentators, conditioned by the Reagan Revolution to cast all things in terms of nationalism and patriotism, saw in the arrival of Nintendo on American shores the opening of the latest front in a new war, economic rather than military this time, between the United States and Japan. And this time it seemed that Japan was winning the war handily. They had come for our steel, and we had done nothing. They had come for our auto industry, and we had done nothing. They had come for our televisions and stereos, and we had done nothing. Now they were coming for our videogame consoles. How long would it be until the PC industry, arguably the biggest economic success story of the 1980s, was threatened as well?

Given the subject of this article, I should take a moment to clarify right now that this blog has not been and will never become a history of console-based videogames. This blog is rather a history of computer games, a culture possessed of plenty of interconnections and collisions with the larger, more mainstream culture of the consoles, but one which has nevertheless remained largely its own thing ever since the first popular videogame console and the first three pre-assembled PCs were all launched during the single fecund year of 1977. In addition to reasons of pure personal preference, I justify this focus by noting that a fair number of people are doing great, rigorous history in the realm of videogames, while the realm of computer games has been comparatively neglected.

Still, we can’t really understand the history of computer games without reckoning with those aforementioned interconnections and collisions with the world of the consoles. And one of the biggest and most obvious collisions of all was that crazy time at the tail end of the 1980s when Nintendo arrived to pull the rug out from under a computer-game industry which had spent the last few years convinced that it was destined to become the next great movement in mainstream American entertainment — i.e., destined to hold exactly the position that this Japanese upstart had just swept in and taken over with breathtaking speed. Small wonder that coded allusions to the dark days of World War II, accompanied by thinly veiled (or blatantly unveiled) racism, became the order of the day in many sectors of American culture, industry, and government alike. Meanwhile the bewildered computer-game executives were trying to figure out what the hell had just hit them and what they should do about it. Let’s join them now in asking the first of those questions.

Hiroshi Yamauchi

The history of the company known as Nintendo — the name can be very roughly translated as an admonition to work hard but also to accept that one’s ultimate success is in the hands of greater powers — dates all the way back to 1889, when it was founded by Fusajiro Yamauchi as a maker of intricately painted playing cards, known as “hanafuda” in Japanese. Nintendo managed to survive and grow modestly amid many changes in Japanese life over the course of the next half-century and beyond. The company’s modern history, however, begins in 1949, when Hiroshi Yamauchi, latest scion of the family-owned business, took over as president. Far more ambitious than his forebears, this latest Yamauchi was inspired by the entrepreneurial ferment of the rebuilding postwar Japan to expand Nintendo beyond playing cards and collectibles. The results of his efforts were decidedly mixed in the early years. Among his less successful initiatives were a line of instant-rice meals — a sort of ricey Ramen Noodles before Ramen Noodles were cool — and a chain of “love motels” offering busy executives the convenience of paying for their trysts by the hour. (Ironic as they might seem in light of Nintendo’s later rigorously enforced family-friendly image, at the time the love motels seemed to everyone around him a natural innovation for Yamauchi to have dreamed up; he was a notorious philanderer.) More successful, for a while, was a Nintendo taxi service. Yet even it was hardly a world-beater. Throughout the first two decades of Yamauchi’s lengthy reign he continued to cast restlessly about for the Big One, the idea that would finally take Nintendo to the next level.

In 1969, he took a big step toward finding his company’s life’s purpose when he founded a new division called simply “Toys.” Employing a number of young gadget freaks as inventors, Toys began to churn out a series of strange contraptions straight out of Rube Goldberg, such as the Ultra Hand, a scissor-like reach extender that was more whimsical than practical; the Ultra Machine, an indoor mechanical baseball pitcher; and the Ultra Scope, a periscope for peeking around corners and over fences. (Parents were not terribly fond of this last one in particular.) All were quite successful, opening at last the sustainable new business front for Nintendo that Yamauchi had been dreaming of for so long.

With electronic components getting smaller and cheaper by the year, Nintendo’s innovative toys inevitably began to take on more and more of an electronic character as time wore on. The first big success in the realm of electronic gadgets was something called the Nintendo Beam Gun, which combined a light gun with a set of targets equipped with the appropriate photoelectric sensors; more than 1 million of them were sold. Nintendo built on the Beam Gun’s success with a chain of Laser Clay Ranges — think “clay pigeons” — that spread across Japan during the mid-1970s, re-purposed bowling alleys where patrons could engage in gunfights with cowboys and “homicidal maniacs” projected onto the far wall.

With Atari now going strong in the United States, videogames were a natural next step for Nintendo. They first made a series of Color TV Games, each a home videogame capable of playing a few variants of a single simple game when hooked up to the family television set; they sold at least 2.5 million of them in the late 1970s. The Nintendo Game & Watch, a whole line of handheld gadgets capable of playing a single game each, did even better; Nintendo is estimated to have sold over 40 million of them during the 1980s. Meanwhile they were also moving into the standup arcade; Donkey Kong, released in 1981, became a worldwide smash, introducing the Nintendo name to many in the United States for the first time. The designer of that cute, colorful, relatively non-violent game, a blueprint for the eventual Nintendo aesthetic as a whole, was one Shigeru Miyamoto. He would become not only Nintendo’s own most famous designer and public figure, but the most famous Japanese videogame designer of all time, full stop. The protagonist of Miyamoto’s Donkey Kong, a little leaping Italian plumber named Mario, was also destined for greatness as arguably the most famous videogame character of all time (his only serious rival is likely Pac-Man, another contemporaneous Japanese creation).

All of this success, however, was only laying the groundwork for Nintendo’s masterstroke. Moving on from the single-game units that had so far been Nintendo’s sole output, Yamauchi tasked his engineers with creating a proper videogame console capable of playing many games that could be sold separately in the form of cartridges, just like the Atari VCS. The device they came up with was hardly state of the art even at the time of its debut. It was built around a clone of the venerable old 8-bit MOS 6502, the same chip found in the Atari VCS as well as American home computers like the Apple II and Commodore 64, with those circuits that were protected by patents excised. It offered graphics a little better than the likes of the 64, sound a little worse. The new machine was being readied at seemingly the worst possible time: just as the Great Videogame Crash was underway in the United States, and just as the worldwide conventional wisdom was saying that home computers were the future, videogame consoles a short-lived fad of the past. Yet Nintendo freely, even gleefully, defied the conventional wisdom. The Nintendo Family Computer (“Famicom”) was deliberately designed to be as non-computer-like as possible. Instead it was patterned after Nintendo’s successful toys and gadgets — all bright, garish plastic, with as few switches and plugs as possible, certainly with nothing as complicated as a keyboard or disk drive. It looked like a toy because Nintendo designed it to look like a toy.

The Nintendo Famicom

Yamauchi realized that a successful videogame console was at least as much a question of perception — i.e., of marketing — as it was of technology. In the imploding Atari, he had the one great counterexample he needed, a perfect model of what not to do. Atari’s biggest sin in Yamauchi’s eyes had been to fail to properly lock down the VCS. It had never occurred to them that third parties could start making games for “their” machine, until Activision started doing just that in 1980, to be followed by hundreds more. Not only had all of those third-party cartridges cost Atari hundreds of millions in the games of their own that they didn’t sell and the potential licensing fees that they didn’t collect, they had also gravely damaged the image of their platform: many or most Atari VCS games were just plain bad, and some were in devastatingly terrible taste to boot. The public at large, Yamauchi realized, didn’t parse fine distinctions between a game console and the games it played. He was determined not to lose control of his brand as Atari had lost control of theirs.

For better and for worse, that determination led to Nintendo becoming the first of the great walled gardens in consumer software. The “better” from the standpoint of consumers was a measure of quality control, an assurance that any game they bought for their console would be a pretty good, polished, playable game. And from the standpoint of Yamauchi the “better” was of course that Nintendo got a cut of every single one of those games’ earnings, enough to let him think of the console itself as little more than a loss leader for the real business of making and licensing cartridges: “Forgo the big profits on the hardware because it is really just a tool to sell software. That is where we shall make our money.” The “worse” was far less diversity in theme, content, and mechanics, and a complete void of games willing to actually say almost anything at all about the world, lest they say something that some potential customer somewhere might possibly construe as offensive. The result would be an infantilization of the nascent medium in the eyes of mainstream consumers, an infantilization from which it has arguably never entirely escaped.

Whatever the reservations of curmudgeons like me, however, the walled-garden model of software distribution proved successful even beyond Yamauchi’s wildest dreams. After releasing their new console to Japanese consumers on July 15, 1983, Nintendo sold more than 2.5 million of them in the first eighteen months alone. Sales only increased as the years went by, even as the hardware continued to grow more and more technically obsolete. Consumers didn’t care about that. They cared about all those cute, colorful, addictive games, some produced by an ever-widening circle of outside licensees, others — including many or most of the best and best-remembered — by Nintendo’s own crack in-house development team, with that indefatigable fount of creativity named Shigeru Miyamoto leading the way. Just as Yamauchi had predicted, the real money in the Famicom was in the software that was sold for it.

Minoru Arakawa

With the Famicom a huge success in Japan, there now beckoned that ultimate market for any ambitious up-and-comer: the United States. Yamauchi had already set up a subsidiary there called Nintendo of America back in 1980, under the stewardship of his son-in-law Minoru Arakawa. Concerns about nepotism aside — no matter how big it got, Nintendo would always remain the Yamauchi family business — Arakawa was ideal for the job: an MIT-educated fluent English-speaker who had traveled extensively around the country and grown to understand and love its people and their way of life. Under his stewardship, Nintendo of America did very well in the early years on the back of Donkey Kong and other standup-arcade games.

Yet Nintendo as a whole hesitated for quite some time at the prospect of introducing the Famicom to North America. When Arakawa canvassed toy stores, the hostility he encountered to the very idea of another videogame console was palpable. Atari had damaged or destroyed many a business and many a life on the way down, and few drew much of a distinction between Atari and the videogame market as a whole. According to one executive, “it would be easier to sell Popsicles in the Arctic” than to convince the toy stores to take a flyer on another console.

But Arakawa, working in tandem with two American executive recruits who would become known as “the two Howards” — Howard Lincoln and Howard Phillips — wouldn’t let go of the idea. Responding to focus-group surveys that said the Japanese Famicom was too toy-like and too, well, foreign-looking to succeed in the United States, he got Nintendo’s engineers to redesign the externals to be less bulbous, less garish, and less shiny. He also gave the Famicom a new, less cutesy name: the Nintendo Entertainment System, or NES. The only significant technical update Nintendo made for North America was a new state-of-the-art handshaking system for making sure that every cartridge was a legitimate, licensed Nintendo game; black-market cartridges duplicated by tiny companies who hoped to fly under the radar of Nintendo’s stringent licensing regime had become a real problem on the Famicom. Tellingly, the lockout system was by far the most technically advanced aspect of the NES.
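
The principle behind that handshake is simple enough to sketch, even if Nintendo’s actual implementation, a dedicated chip in both console and cartridge whose inner workings went unpublished for years, was anything but. Purely as a conceptual illustration, with every name and value below being hypothetical rather than anything that ever ran on an NES, imagine console and cartridge each holding the same secret and being required to produce matching pseudorandom values in lockstep, with the console refusing to run a cartridge that falls out of step:

```python
import hashlib
import hmac

SECRET = b"licensed-by-nintendo"  # hypothetical shared secret, not Nintendo's


def challenge(key: bytes, step: int) -> int:
    """Both sides derive the same pseudorandom byte for a given step."""
    return hmac.new(key, step.to_bytes(4, "big"), hashlib.sha256).digest()[0]


class Cartridge:
    def __init__(self, key: bytes):
        self.key = key

    def respond(self, step: int) -> int:
        return challenge(self.key, step)


class Console:
    def __init__(self, key: bytes):
        self.key = key

    def boots(self, cart: Cartridge, rounds: int = 16) -> bool:
        # Challenge the cartridge in lockstep. On any mismatch the real
        # hardware held the CPU in reset; here we simply refuse to boot.
        return all(cart.respond(s) == challenge(self.key, s)
                   for s in range(rounds))


console = Console(SECRET)
print(console.boots(Cartridge(SECRET)))     # True: a licensed cartridge
print(console.boots(Cartridge(b"pirate")))  # False: locked out
```

A bootleg without the secret fails the very first exchange, which is why unlicensed duplication was so much harder on the NES than simple disk copying had ever been on home computers.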

The Nintendo Entertainment System

The new NES made its public debut at last at the Summer Consumer Electronics Show in June of 1985. Few in the home-computer trade press — the videogame trade press didn’t really exist anymore — paid it any real attention. The big news of the show was rather the new Jack Tramiel-led Atari’s 16-bit ST computer. Computer Gaming World was typical, mentioning the NES only as a passing bit of trivia at the end of a long CES feature article: “Nintendo even offered an entirely new game system.” Clearly Arakawa and company had an uphill climb before them.

They deliberately started small. They would sell the NES first in New York City only — chosen because Arakawa considered it the most cynical and challenging place to market a new gadget in the country, and, as the old song says, “if you can make it there you can make it anywhere.” Starting with a warehouse full of the first 100,000 NESs to arrive from Japan and a $50 million war chest, Arakawa and the two Howards personally visited virtually every toy and electronics store in the five boroughs to press the flesh and demonstrate the NES to skeptical managers and proprietors — and (hopefully) to take orders when they were finished. Meanwhile Nintendo blitzed the airwaves with advertising. They managed to sell 50,000 NESs in New York alone that Christmas season — not bad for an unknown gadget in a field that everyone, from the most rarefied pundit to the most ordinary Joe or Jane on the street, considered to be yesterday’s fad.

From that promising start they steadily expanded: first to that other taste-maker capital Los Angeles, then to Chicago, to San Francisco, to Dallas and Houston, and finally nationwide. Sales hit the magic 1 million mark well before the end of 1986. Cheap and cheerful and effortless in its lack of fiddly disk drives and keyboards, the NES was selling by that point as well as the Commodore 64, and far better than any other home computer. In the NES’s second year on the market it eclipsed them all to such an extent as to make continued comparison almost pointless: 3 million NESs were sold during those twelve months alone. And, astonishingly, it was still just getting started. During 1988, 7 million NESs were sold, to go with 33 million cartridges, each of which represented yet more profit for Nintendo. Lifetime NES sales topped 30 million in 1990, by which time one out of every three American homes could boast one of these unassuming gray boxes perched underneath the television. Total NES and Famicom lifetime sales reached a staggering 75 million in 1992; as many Nintendos were by then in the world as all PCs, whether found in homes or businesses or schools, combined. Even the Atari VCS in the heyday of the first videogame fad had never been able to boast of numbers like this.

Because Nintendo had come into the console market when it was universally considered dead, they had been able to reinvent it entirely in their own image. Just as “Atari” had once been a synonym for videogames in general, now “Nintendo” threatened to become the same for a new generation of players. Savvy about branding and marketing in a way that Atari had never quite managed to be, Nintendo felt compelled to actively push against this trend by aggressively protecting and limiting the use of their trademarks; they didn’t want people buying a new “Nintendo” that happened to have the name of Sega, Sony, or 3DO stamped on its case.

Nintendo’s penetration of the North American market could serve (and doubtless has served) as the basis of an MBA course in marketing and brand-building. Starting from the less than nothing of a dead industry replete with consumer ill-will, coming from a foreign nation that was viewed with fear and mistrust by many Americans, Nintendo of America built one of the largest and most insanely loyal customer bases the American economy has ever known. They did it by tying their own brand to brands their target demographic was known to already love, like Pepsi and McDonald’s. They did it by building Nintendo stores within stores in major chains from Macy’s to Toys “R” Us, where kids could browse and play under the benevolent gaze of Mario while their parents shopped. (By 1991, Nintendo alone represented 20 percent of Toys “R” Us’s total revenues, and seven of their ten best-selling single products.) They did it by building a massive mailing list from the warranty cards that their young customers sent in, then using contests and giveaways to make every single one of them feel like a valued member of the new Generation Nintendo. They did it by publishing a glossy magazine, Nintendo Power, full of hints and tips on the latest games and all the latest news on what was coming next from Nintendo (and nothing on what was coming from their competitors). They did it by setting up a hotline of “Nintendo Game Counselors,” hundreds of them working at any one time to answer youngsters’ questions about how to get through this tricky level or kill that monster. They did it by relentlessly data-mining to find out what their customers liked about their games and what they didn’t, and crafting new releases to hit as many players as possible precisely in their sweet spots. They did it by spending up to $5 million on a single 30-second television commercial, four or five times the typical going rate, making the commercials for a new Nintendo game an event in themselves. They did it by making sure that Mario and Zelda and their other iconic characters were everywhere, from television shows to records, from lunch boxes to bed sheets. And they did it by never worrying their customers with the sorts of metrics that the home-computer makers loved: kilobytes and megabytes and colors and resolutions and clock speeds and bit counts. The NES was so thoroughly locked down that it was years before there was any published information available at all on what was really contained within those ubiquitous gray plastic shells.

If it can all sound a little soulless when laid out like that, well, few in business would argue with the end results. Nintendo seemed to be becoming more American than most Americana. “A boy between 8 and 15 without a Nintendo is like a boy without a baseball glove,” wrote Hobby World magazine in 1988. In 1990 a survey found Mario to be more recognizable to American children than that most American of all cartoon icons — Mickey Mouse.

And where did all of this leave the established American computer-game industry? That was a question that plenty in said industry itself were asking with ever-increasing frustration and even desperation. Sales of computer games across all platforms in 1989 totaled about $230 million; sales of Nintendo cartridges, $1.5 billion. It wasn’t supposed to have gone like this. No one in computer games had seen anything like Nintendo coming. They, the computer-game industry, were supposed to have been the next big wave in American home entertainment — a chicken in every pot and a home computer in every living room. Instead this Japanese upstart had stolen their thunder to such an extent as to render their entire industry an afterthought, a veritable non-entity in the eyes of most financial analysts and venture capitalists. Just to add insult to injury, they were being smothered by thoroughly obsolete 8-bit technology when they could offer consumers audiovisual feasts played on Amigas and Atari STs and IBM PS/2s with VGA graphics. A computer-game designer with Electronic Arts saw unnerving parallels between his own industry and another American industry that had been devastated by Japan in the previous decade:

The best companies and the best programmers were making computer games. But the Nintendo player didn’t care about the sophisticated leaps we were making on computers — the frame rate of the images or incredible sound. They just wanted fun. It was like we were making gas guzzlers and the Japanese were making subcompacts.

At street level the situation didn’t look much better. Fred D’Ignazio, a columnist for Compute!’s Gazette, shares a typical story:

My kids and I used to play games on our home computer — games like Epyx’s The Legend of Blacksilver, SSI’s Questron II, EA’s Jordan vs. Bird: One-on-One, Gamestar’s Take Down, Arcadia’s Aaargh!, and, of course gobs and gobs of good educational games.

Then the Nintendo landed, and things haven’t been the same since. The Nintendo runs day and night. (We’re not even allowed to shut off the machine when we go to bed because there’s always a game in progress — and there’s no disk drive to back it up.) Meanwhile, I don’t think our little home computer has been fired up in weeks.

The computer that was most damaged by Nintendo’s invasion of North America was undoubtedly the Commodore 64. It was very cheap in computer terms, but once you added in the cost of the essential disk drive it was nowhere near as cheap as the NES. And it was still a computer, even if a computer that had long been used primarily for playing games. You had to type in arcane commands to get a game started, had to wait for the game to load, often had to shuffle disks in and out of the drive and do a lot more waiting as you actually played. A Compute!’s Gazette reader shares the story of her attempt to introduce her Nintendo-loving eight-year-old nephew to the joys of Commodore 64 gaming:

As he looked through my 64 software to pick out a game, I started to give directions on how to handle the software and disk drive. Before I could finish he said, “I just want to use a cartridge and start playing.” After about fifteen minutes into a game he said, “This is great, but how come it takes so long to start the game again and why do I have to keep turning the disk over and over all the time?” Shortly after, he started complaining that his hand was too small for the joystick. He tried three other joysticks, but he either had the same problem or the joystick didn’t have the dexterity needed to play the game. He then said, “I wish I could use my Nintendo controls on your Commodore.” Soon after, he quit and went right to his Nintendo.

The Commodore 64 was in a very difficult position, squeezed from below by Nintendo and squeezed from above by the Amiga and Atari ST and, most of all, by ever more consumer-friendly MS-DOS-based machines from companies like Tandy, which were beginning to sport hard disks, crisp VGA graphics, sound cards, and mice. There wasn’t much that Commodore’s aged little breadbox could offer in response to a feature set like that. In the battle versus Nintendo for the low end, meanwhile, all of the immense force of playground public opinion was arrayed against the Commodore 64. The 64 was clunky and slow and ugly. It was the machine your big brother used to play games on, the one your parents kept pushing you toward to learn programming or to play educational (blech!) games. The Nintendo was the machine that all your friends played on — the same friends who would look on you as a freak if you tried to get them to play a computer game with you.

If you think that hardcore Commodore 64 users accepted this changing world order peacefully, you don’t have much experience with the fanatic platform loyalties of the 1980s. Their heated opinions on the 64’s Nintendo crisis spilled much ink on the pages of the remaining 64-centric magazines, moving through spasms of denial (“If Nintendo has the ability to keep its users captured, why do my two nephews keep pestering me to let them play the games that I have for my 64?”), advice (“Commodore could bring out some new peripherals like a light gun to play shooting games or a keyboard to make use of the superior sound of the 64”), and justification (“This letter was typed on a 64. Let’s see any Nintendo do that!”). When all else failed, there was always good-old-fashioned name-calling: “The word-processing capability of the 64 is a pointless feature to most Ninnies, since the majority of them don’t seem to be able to read and write anyway. Most of the Ninny chic was built on the fact that a baboon could operate it.”

None of this raging against the dying of the light could make any difference. The Commodore 64 went into an undeniable decline in 1988. That decline became a free fall in 1989, and in 1990 the 64 was effectively declared dead by the American software industry, with virtually every publisher terminating support. The other great 8-bit survivor, the Apple II, hung on a little longer thanks to an entrenched user base in schools and small businesses, but when Apple finally discontinued all production of the line in 1993 the news was greeted by most publishers with a shrug: “I didn’t know those old things were still being made!”

The computer-game publishers’ reactions to Nintendo were complicated, ofttimes uncertain, occasionally downright contradictory. With Nintendo rapidly taking over what used to be the low end of the computer-game market, many publishers felt emboldened to refocus their energies on the still slowly growing higher end, particularly on all those new consumer-oriented clones from Tandy and others. Plenty of publishers, it must be said, weren’t really all that sad to see the 64 go. The platform had always been tricky to develop for, and its parent company was still widely loathed for heaps of very good reasons; everyone in the industry seemed to have at least one Commodore horror story to tell. Many had come to see the 64 during its years of dominance as an albatross holding back ambitions that would have been realizable on the bigger, more powerful platforms. Now they were at last free to pursue those grander schemes.

At the same time, though, the Commodore 64 had been their cash cow for years, and there remained the question of whether and how soon all those bigger machines would make up for its loss. Certainly they failed resoundingly to take up the slack in 1989, a bad year for the computer-game industry and a great one for Nintendo.

As unhappy as the majority of industry old-timers remained with the Nintendo-dominated state of affairs in digital games in general, that $1.5 billion in annual cartridge revenue and massive mainstream penetration was awfully tempting. As early as 1988, it seemed that just about everyone was discussing adapting their computer games to the NES, and a fair number were swallowing their pride to approach Nintendo with hat in hand, asking for a coveted license to make NES games. In addition to the sheer size of the Nintendo market, it also had the advantage that piracy, which many in the computer-game industry continued to believe was costing them at least half of the revenues they would otherwise be enjoying, was nonexistent there thanks to those uncopyable cartridges and the NES’s elaborate lockout system.

Activision,[1] who had enjoyed their greatest success by far in the old glory days of the Atari VCS, jumped onto the Nintendo bandwagon with perhaps the most enthusiasm of all. Activision’s head, the supremely unsentimental Bruce Davis, often sounded as if he would be perfectly happy to abandon computers altogether, to make Activision exclusively a publisher of videogame cartridges again: “If hardware companies are designing a machine for one purpose, they will do a better job than on a multi-function machine.”

But it’s the more unlikely NES converts that provide the best evidence of just how far Nintendo had come and just how much pressure the traditional computer-game industry was feeling. The NES began to get quite a number of ports of computer-game fare that no one would ever have imagined trying to put on a machine like this just a year or two earlier. Origin, for instance, put out NES versions of Ultima III and Ultima IV, and Lucasfilm Games ported Maniac Mansion. (See Douglas Crockford’s “The Expurgation of Maniac Mansion” for a description of the hoops publishers like Lucasfilm had to jump through to meet Nintendo’s stringent content restrictions.) Even SSI, whose traditional stock-in-trade of turn-based, cerebral, complicated strategy games was about as far from the whimsy of Mario and Zelda as you could get, moved Pool of Radiance over to the NES. Computer Gaming World, the journal of choice for those same cerebral strategy gamers, tried to rope in Mario fans with a new magazine-within-a-magazine they dubbed “Video Gaming World.”

Few of these initiatives bore all that much fruit. The publishers may have found a way to get their games onto the NES, but said games remained far from the sort of fare most Nintendo players were interested in; suffice to say that Nintendo never had to worry about any of these titles eclipsing Mario. Still, the fact that so many computer-game publishers were making such an effort shows how scary and uncertain Nintendo was making their world. Perhaps the most telling moment of all came when Trip Hawkins announced that Electronic Arts would be jumping into the console space as well. This was the same Trip Hawkins who had written a commitment to “stay with floppy-disk-based computers only” into Electronic Arts’s first business plan, who had preached the gospel of home computers as successors to videogame consoles as loudly and proudly as anyone in his industry. Now he and his company were singing a very different tune. Bing Gordon, Hawkins’s right-hand man at Electronic Arts, compared home computers to, of all unflattering things, steam engines. James Watt, the man popularly credited with inventing the steam engine, had imagined one in every home, with a bunch of assorted pulleys and gears to make it do different things. Instead modern homes had a bunch of more specialized machines: washing machines, food processors… and now Nintendos. Soon Hawkins would leave Electronic Arts to found 3DO, a company to make… you guessed it, a new videogame console.

Some, however, chose a more belligerent path than these can’t-beat-’em joiners. Nintendo’s rigorous control of the NES’s walled garden rankled everyone in the older software industry; this just wasn’t how their business was done. They believed that Nintendo was guilty of restraint of trade, antitrust violations, you name it. Particularly enraging was Nintendo’s complete control of the manufacturing pipeline for NES cartridges. Leveraging those data-mining systems of theirs, more sophisticated than anyone had heretofore dreamed of, Nintendo made sure that the supply of new games was always slightly less than the demand for them, thereby creating hype around each new title as a hot, desirable status symbol among the Nintendo Generation and, most of all, avoiding the glut of games piled up in warehouses — and, eventually, landfills — that had marked the Great Videogame Crash of 1983. But when American publishers saw their games produced in insufficient quantities to become the hits they believed they might otherwise have been, they cried foul. The Software Publishers Association served as the disgruntled voice of the American software industry as a whole in what became a full-scale public-relations war against Nintendo.

The SPA believes that Nintendo has, through its complete control and single-sourcing of cartridge manufacturing, engineered a shortage of Nintendo-compatible cartridges. Retailers, consumers, and independent software vendors have become frustrated by the unavailability of many titles during the holiday season, and believe that these shortages could be prevented by permitting software vendors to produce their own cartridges.

American publishers felt certain that Nintendo was playing favorites, favoring their own games and those of their favorite third-party publishers — generally the ones from Japan — by manipulating production numbers and manipulating the sentiments of Generation Nintendo through the coverage they gave (or didn’t give) each game in Nintendo Power. “If I pissed Nintendo off,” runs a typical complaint, “I would get less product. My games would get hit in Nintendo Power and they’d get low ratings.” And the most surefire way to piss Nintendo off, at least according to this complainer, was to release a game for the NES’s first serious competitor, the Sega Genesis console that entered the United States in 1989.

There was plenty of tinder already lying about the public sphere, just waiting to be ignited by such rhetoric. All of the concerns about videogames that had been voiced by parents, educators, and politicians during the heyday of Generation Atari were now being dusted off and applied to Generation Nintendo. Now, however, they were given additional force by Nintendo’s very foreignness. Plenty of Americans, many of whom had still not completely forgiven Japan for Pearl Harbor, saw a nefarious agenda behind it all, a fifth column of Mario-obsessed youngsters who might come to undermine the very nation. “Notice the way Super Mario is drawn,” wrote one in a letter to a magazine. “He has the eyes of someone who has been brainwashed.” Lurking just below the surface of such complaints, unstated but by no means unconveyed, were old attitudes toward the Japanese as shifty characters who could never be trusted to follow the rules, whether in war or business. It all came down to “cultural” differences, they muttered disingenuously: “There’s more of a sharing of the pie by American companies. In Japan, it’s different: winners win big and losers lose.”

Hoping to capitalize on the burgeoning anti-Nintendo sentiment, in December of 1988 Tengen Games, a spinoff of Atari Games (which was itself the successor to the standup-arcade portion of the original Atari’s business), sued Nintendo in federal court for antitrust violations and monopolistic practices: “The sole purpose of the lockout system is to lock out competition.” Having found a way to defeat the much-vaunted lockout system through a combination of industrial espionage, reverse engineering, and good old social engineering — this is one of the few occasions in Nintendo’s history where one might accuse them of having been naive — Tengen simultaneously launched a few of their own unauthorized games for the NES.

Nintendo’s counterattack against Tengen was massive and comprehensive. Not only did they launch the expected blizzard of legal actions, but they made it clear to all of the stores that handled their products that there would be grave consequences if they chose to sell the Tengen games as well. Such threats ironically represented a far more clear-cut antitrust violation than anything found in Tengen’s original suit. When Tengen got the court to order Nintendo to cease and desist from such behavior, Nintendo allegedly only became more subtle. “You know, we really like to support those who support Nintendo, and we’re not real happy that you’re carrying a Tengen product,” a rep might say. “By the way, why don’t we sit down and talk about product allocations for next quarter? How many Super Marios did you say you wanted?” “Since it was illegal, there were always excuses,” remembers one retailer. “The truck got lost, or the ship from Japan never arrived.”

Tengen was determined to try their case against Nintendo first and foremost in the court of American public opinion. “Who gave Nintendo the power to decide what software the American public can buy?” they asked. The New York Times, for one, agreed with them: “A verdict in favor of Nintendo would probably have a spillover effect into the personal-computer industry, where it could have a chilling effect on the free flow of ideas and innovations that have characterized that market since its inception.” An opportunistic Congressman named Dennis Eckart launched a high-profile crusade against Nintendo that led to lots of heated rhetoric amid Congressional hearings and the involvement of several state Attorneys General and the Federal Trade Commission. Jack Tramiel of the other Atari (the one currently making the Atari ST computer), who had always viewed lawsuits as healthy business competition by other means, piled on with a suit of his own, claiming that by monopolizing the market Nintendo was keeping his own company from getting good software for its machines. “Nintendo has demonstrated its disregard for free and fair competition in America,” said Jack’s son and anointed successor Sam Tramiel.

Yet the anti-Nintendo sentiment in the country didn’t ultimately do much to help either of the two Ataris’ legal cases; the courts proved willing to buck that rising tide. In a landmark ruling against Tengen in March of 1991, Judge Fern Smith stated that Nintendo had the right to “exclude others” from the NES if they so chose, thus providing the legal soil on which many more walled gardens would be tilled in the years to come. Similarly, the Tramiels’ suit against Nintendo was definitively rejected in 1992, after having cost their company a great deal of time, energy, and most of all money it could ill afford. The other various and multifarious investigations into Nintendo’s business, of which there were far too many to summarize here, resulted in a mixed bag of vindications and modest slaps on the wrist that did nothing to alter Nintendo’s overall trajectory. Perhaps the best argument against Nintendo as a monopoly was the arrival of the company’s first competitors in the console space, beginning with Sega, who proved that it actually was still possible to carve out a non-Nintendo place of one’s own in the game-console industry that Nintendo had so recently resurrected.

Nintendo, then, was here to stay, as were Sega and other competitors still to come. The computer-game industry would just have to accept that and reckon with it as best they could. In the end, the threat from Japan proved not quite as apocalyptic as it had seemed during the darkest days of 1989. In 1990 computers could start to boast of a modest new buzz of their own, thanks to the new so-called “multimedia PCs” and a bunch of new games that took advantage of their capabilities. Having ceded the low ground to the consoles, computers had retained the high ground, a loyal constituency of slightly older, more affluent gamers who still had plenty of room in their hearts for the sort of big, high-concept strategy, adventure, and CRPG games that weren’t all that realizable on the more limited consoles. The computer-game industry was growing again by 1990, and by a double-digit percentage at that. The vibrant jungle of PC gaming would continue to bloom in a thousand ways at once, some of them productive, some of them dead ends, some of them inspiring, some of them kind of repugnant. And through it all, the jungle of PC gaming would remain interesting in ways that, at least for this humble writer, the fussily manicured walled garden of Nintendo has never quite managed to be. But whichever mode of distribution you personally favored, one thing became clear as the 1980s gave way to the 1990s: neither Generation Nintendo nor the emerging Generation Wintel would be going anywhere anytime soon.

(Sources: The Making of the Atomic Bomb by Richard Rhodes; Game Over by David Sheff; Compute!’s Gazette of May 1988, March 1989, August 1989, September 1989, October 1989; Computer Gaming World of September/October 1985 and June 1988; Amazing Computing of January 1989; materials in the SSI and Brøderbund collections at the Strong Museum of Play.)

Footnotes

1 Activision changed their name to Mediagenic midstream in these events. Because I haven’t told the story behind that change yet, and in order to just generally avoid confusion, I simply refer to the company as “Activision” in this article.


Opening the Gold Box, Part 1: Joel Billings and SSI

SSI

I’m a game player, mostly, that’s about it. I’m pretty dull, actually.

— Joel Billings

Joel Billings is about as close to a literal lifelong gamer as it’s possible to be. His father taught him to play the old Avalon Hill wargame classic Tactics II in 1965, when he was just 7 years old. Robert Billings, who regarded gaming only as an occasional pleasant diversion, soon had cause to wonder whether that introduction had been a wise move; young Joel got obsessed right from the first. Instead of playing with cars or model trains, Joel re-fought the major battles of World War II and the American Civil War on his bedroom floor, having simultaneous and almost equally pitched real-world battles with the family dog, who wanted to play too. While other boys played sports, or merely watched them, Joel was determined to simulate them. He tried to recreate every single game of the 1969 football season for every single team — hundreds of individual matches — using Strat-O-Matic Football, finally stopping out of sheer exhaustion with just twenty or so matches left to play. Encouraged to find a more social outlet for his “hobby,” he raided his high school’s chess club to form a wargaming club with himself as founder, president, and, it seems safe to say, most passionate member by a country mile. The same could be said of the company he would later found.

But it was awfully hard in those early days for Joel or anyone in his family to imagine how he could turn his passion into a living wage, especially given that he wasn’t and would never be so much a start-from-scratch designer as an avid, gifted player. After doing well at his suburban Los Angeles high school despite the lure of wargames — he graduated 19th in a class of 572 — he proceeded to Claremont Men’s College in 1975 to pursue a degree in Economics. There he continued with his beloved wargames, betwixt and between and every chance he got. He would sometimes enter three divisions of a wargaming tournament simultaneously, an obligation later described by Al Tommervik of Softalk magazine as “roughly akin to playing a couple of dozen simultaneous chess matches against near masters.”

The late 1970s were a good time to be a wargamer. In terms of dollars and cents, this period was the tabletop-wargame industry’s golden age. Annual sales grew at a rate of 40 percent or more for the better part of the decade, peaking in 1979 at $15.5 million. Those may sound like small numbers in comparison with many another entertainment industry, but for wargaming, always the very definition of a niche hobby, they were very good ones indeed in comparison to what had come before and, less happily, what would soon follow. Surveys reckoned over a quarter of a million Americans were active wargamers, with an average age of just 22 years. (In the years to follow, one of those numbers would plummet while the other rose precipitously.) Joel Billings — smart, from comfortable circumstances, and 21 years old in 1979 — was practically the prototypical specimen of the breed.

In those days wargaming was absolutely dominated by a Coke and a Pepsi, whose combined sales accounted for 80 percent of the industry as a whole. Wargaming’s Coke was Avalon Hill, the big, traditionalist institution whose Tactics, generally regarded as the urtext of the modern wargame, had birthed the industry back in 1954. Its Pepsi was the younger, slightly smaller, slightly hungrier, arguably more innovative Simulations Publications, Incorporated, universally known as SPI. The two companies were each regarded with great love and loyalty by their respective fans, who felt they could discern a distinct personality not only in the marketing and packaging of each company’s games but in the games’ rules as well. Plenty of wargamers were stalwart loyalists to one camp or the other, refusing to buy or play a game by the rival company. Joel wasn’t quite that extreme, but was always an Avalon Hill man when push came to shove.

Joel Billings, the man destined to bring the culture of chits and dice into collision with that of bits and bytes, had his first run-in with computers early in his time at Claremont College. He wound up, more by happenstance than by desire, in a BASIC programming class taught on a big DEC PDP-10. This first encounter didn’t rock his world the way it did that of so many characters we’ve met on this blog — Joel had already found his lifelong passion when he had first played Tactics II all those years ago — but he did find the experience interesting, and found he had a certain aptitude for it as well. It set him to musing about the changes computers might wreak on his own favored hobby. For his final project in the class, he wrote a simple little two-player tank game. It was a wargame in only the most generous definition of the term, but it was a start. In the meantime, he parlayed that class into a six-month internship at Amdahl Corporation, a maker of mainframe computers located in Silicon Valley, during his senior year at university.

After graduating from Claremont College in May of 1979, Joel traveled up the coast again to take a summer job with Amdahl before he went on to graduate school at the University of Chicago. As he had before, he stayed in a spare apartment above the house of David Rubinfien, an uncle. Immersed in the world of big mainframe iron as he was, Joel had only recently become aware of the nascent PC revolution. But as soon as he’d seen his first TRS-80 he’d begun wondering what these new microcomputers might be able to do for his hobby. Rubinfien, as always supportive of and helpful to his nephew and possessed of some connections in the Valley to boot, encouraged him to find out.

Joel first talked to some programmers who worked for IBM, but they told him flat-out that his idea of creating a wargame reminiscent of the tabletop games he loved on the microcomputers of the day was absurd. Undaunted, Joel hung flyers in several of the local computer shops. With the moment of decision looming ever closer — did he stay here and try to make a computerized wargame or did he go off to graduate school? — he was contacted in early August by one John Lyon. Eighteen years Joel’s senior, Lyon was an experienced programmer currently working for Control Data who loved wargames almost as much as Joel. He had never programmed a microcomputer before, but he didn’t let that stop him. “This is what opportunity looks like when it knocks,” Lyon had told the sales clerk standing by the store’s bulletin board. “And I’m going to answer it.”

Pressed for time as they were, Joel and John settled on a rather blatant computerized clone of an old Avalon Hill classic called Bismarck, a simulation of the legendary German battleship’s ill-fated attempt to break out into the Atlantic shipping lanes in 1941. In addition to offering a completed design to start from, Bismarck seemed ideal in a number of other ways. For one thing, its subject was popular, known even to many non-military-history buffs thanks to the classic war flick Sink the Bismarck! But there were also other, less obvious considerations. Joel had long since realized that the computer had the potential to bring two hugely salable advancements to the traditional tabletop wargame, and a Bismarck game would be well-nigh ideal for demonstrating both of them.

One advancement would be true hidden movement. Implementing a proper “fog of war” presented an obvious problem for a tabletop wargame where each player was tracking moves on the same game board and needed to be able to make sure the other wasn’t cheating. The problem of fog of war was so vexing yet so essential to any realistic simulation of military conflict that some of the most elaborate wargames had taken to requiring a third participant, a referee who could serve as a neutral arbiter and keep track of each player’s units in relation to the others; you can imagine how popular that thankless role was. A computerized version of Bismarck could demonstrate to fine effect the computer’s ability to simulate the fog of war. Indeed, one might say that this entire scenario revolved around the fog of war: the really difficult part for the British side was simply finding the Bismarck. The British forces were so overwhelming in comparison to the German that, as Joel puts it, “if you find the Bismarck you’re likely to kill it.”

The other advantage computers brought to the (non-)table was of course to eliminate not only the need for a referee but also the need for another player, to provide an artificially intelligent opponent who was up for a game any time you were. Artificial intelligence was, however, a hard task to shoehorn into a microcomputer of 1979 vintage. It was here that the second big advantage of Joel and John’s choice of games came in: with a Bismarck game, they really didn’t need much of an artificial intelligence at all. The order of battle for the German side of things consisted of only the Bismarck itself and a single escorting cruiser; the tiny flotilla’s strategic and tactical options were pretty much limited to “sail as quickly as possible and hope the British forces don’t find them.” Surely the computer could manage that much. All John Lyon need do was restrict the human player to playing only the British side.
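
Both advantages are easy to picture in code. What follows is a minimal sketch of the two ideas together, my own illustration in Python with a made-up grid and search radius rather than anything drawn from Lyon’s actual program: the computer keeps the German position to itself, reports a contact only when a British unit searches close enough, and implements the entire “strategy” of the German side as a single line of westward drift.

```python
import random

GRID = 20  # hypothetical map size, not taken from the real game


def contact(british_ships, bismarck, search_radius=2):
    """Fog of war: the player learns the Bismarck's position only on contact."""
    bx, by = bismarck
    return any(abs(x - bx) <= search_radius and abs(y - by) <= search_radius
               for x, y in british_ships)


def german_turn(pos):
    """The whole 'AI': run for the open Atlantic, with a little wobble."""
    x, y = pos
    return (min(x + 1, GRID - 1),
            max(0, min(GRID - 1, y + random.choice((-1, 0, 1)))))


bismarck = (0, 10)           # hidden from the player
british = [(5, 8), (7, 12)]  # the player's search line
for turn in range(1, 11):
    bismarck = german_turn(bismarck)
    print(f"Turn {turn}:", "contact!" if contact(british, bismarck) else "no contact")
```

Trivial as the evader is, it makes the point: against a hidden target even a near-random walk forces the player to spend turns searching, and that search was most of the drama of the original scenario.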

Computer Bismarck was programmed in Joel’s apartment at the top of this rather hair-raising staircase. Thanks to an adolescent bout with polio, John Lyon had to climb it on crutches every evening.

Lyon set to work programming the game, using only text because that’s all the borrowed North Star CP/M machine he and Joel had scrounged could manage; neither of these two would-be microcomputer-software impresarios yet owned an actual microcomputer. Meanwhile his uncle set up several meetings with venture capitalists, which didn’t yield any immediately tangible results. But then word traveled along the Silicon Valley grapevine to Trip Hawkins, a young man only a few years older than Joel who worked for a company Joel had barely heard of to this point: Apple Computer. A venture capitalist called Hawkins to tell him about this interesting proposal that was coming from an inexperienced youngster with questionable credentials to pull it off. If Hawkins would quit his job at Apple and become president of the new company, the venture capitalist said, he could guarantee him ample financing. Hawkins wasn’t ready to do any such thing, but he was intrigued enough by the venture capitalist’s description to meet with Joel.

The two were polar opposites in temperament, Hawkins charismatic, nakedly ambitious, and dynamic while Joel was quiet, staid, and thoughtful. Both, however, had grown up similarly steeped in the games culture of the 1960s and 1970s. Eager to foster the games industry that he hoped to enter in his own right someday soon, Hawkins offered to join the board of any prospective company, provided that Joel was willing to develop his game on the Apple II. The Apple II had been overshadowed by the likes of the TRS-80 and all those CP/M machines to date, Hawkins admitted, but it was having a very good 1979 and was poised to come on strong in the new decade — poised to be “the computer of the future.” He was, to give credit where it’s due, largely right in this. The Apple II would indeed become the premier gaming computer of the next several years, thanks not least to a standout feature that Hawkins didn’t hesitate to point out to Joel: its color bitmap graphics. If they made their Bismarck game for the Apple II, Joel and John could substitute a color picture of the North Atlantic for textual descriptions of the situation.

Hawkins’s participation would play well with the many Silicon Valley venture capitalists who already knew him as a bright young spark, and he could even get Joel access to Apple’s own distribution network and customer rolls. And, far from being a sacrifice, going with the burgeoning Apple II as the new company’s platform of choice seemed a logical course. Hawkins promised to join the board, and Apple II it was from then on.

Still, Joel remained cautious by nature. All too aware of his own lack of experience, he cast about for a bigger partner to shoulder some of the risk and some of the responsibility. He screwed up his courage to call the home of his self-described “heroes” at the wargaming Mecca of Avalon Hill, and managed to get Thomas N. Shaw — game designer, founding editor of Avalon Hill’s in-house magazine The General, and the most long-serving employee of the company — on the other end of the line. Shaw, in Joel’s words, “blew him off,” saying that Avalon Hill was already investigating the field of computer gaming for themselves and didn’t particularly need the help of a 21-year-old with no relevant experience, thank you very much. Joel’s next call was to Automated Simulations, a computer-games publisher founded by two veteran tabletop wargamers that struck him as the only publisher remotely close in background and spirit to what he was trying to do. But, flying high on the sales of their proto-CRPG Temple of Apshai, Automated Simulations was more interested in adding to that line than branching out into computer wargames. And, once again, they remained distinctly unimpressed by young Joel himself. If Joel wanted to do this thing, he would have to do it alone.

He had definitively decided at last that he did want to do this thing. At the last possible instant, he obtained a one-year deferral on graduate school and an extension of his summer job at Amdahl to pay the bills while he tried to get his company off the ground. Being a methodical sort who did anything he decided to do thoroughly and conscientiously, Joel, with the assistance of a sympathetic older colleague from Amdahl named David Bowen, prepared an evolving series of business plans over the last five months of 1979, using data drawn from trade journals and a survey he passed out at a local tabletop-gaming convention. They make for fascinating reading today. For instance, one data point had ominous implications for the wargames industry, still sanguine in its expectations of double-digit annual growth in the decade to come, if only anyone there had happened to see it: Joel found from his survey that wargamers who purchased computers immediately saw their expenditures on tabletop games drop by an average of 41 percent.

In one of these documents, Joel shows a remarkable understanding of the nature of experiential gaming and what makes it different and important.

It is believed that users of these games are attempting to create a fantasy world in which they can obtain role identification with heroic figures. This is similar to reading a good book or watching TV, except that in a game it is more interactive, lively, or “hot.” Wargames provide historical realism and heroes with the basic requirements of a good game: elements of skill, strategy, and chance. Typical wargames allow role identification with heroes like General Patton, various admirals, Napoleon, and so on.

The business plans paint a picture of a busy little factory, with a large staff of programmers under Lyon beavering away to turn out games at a rapid clip. For all the plans’ diligence, they don’t evince much understanding of the nature of intellectual property. Under the heading of “Overall Product Strategy,” the final plan unabashedly states that “computerized versions of existing [tabletop] games” will be the company’s early priority, with “computerized wargames designed by a top-flight game designer with a computer in mind from the beginning” coming only later as resources permit. Ah, well… Joel’s company would hardly be the only respected publisher to have a dodgy understanding of intellectual property in the wild and woolly early days of the software industry.

The name of Joel’s venture changed several times. What started out as the placeholder “Company A” became “Computer Simulations,” and only then “Strategic Simulations.” Joel first took to abbreviating the name to “SS,” but the historical connotations of those two letters — especially to wargamers, who tended to be all too steeped in the very era of history in question — were too ugly to let them stand alone. So he settled at last on SSI, for “Strategic Simulations, Incorporated,” an abbreviation with the added bonus of harking back to the tabletop-wargaming institution of SPI. The incorporation in question occurred on December 27, 1979.

Even with Trip Hawkins’s backing, Joel still hadn’t found any venture capitalists willing to take a chance on computerized wargaming by that date. So Joel’s family finally came through to fund his dream, raising some $40,000 in seed capital among themselves. Joel’s big sister Susan quit her job as admitting-and-registration manager at a hospital to run the accounting side of the venture, to serve as office manager, and, just possibly, to keep an eye on her little brother on behalf of the family that had just entrusted him with so much of their money and faith. Susan, who had no particular interest in games or computers, took the job on as a favor and a family obligation. “For the first couple of years, I said I’d stay six months and then leave,” she remembers. “I thought it was a temporary thing.” Instead she would remain throughout SSI’s long run, becoming in her way as integral to the company as Joel himself. This even though she never did much warm to games or computers: “I never felt an affinity for the products. My feelings were for the operation and the people.”

Computer Bismarck in action.

John Lyon finished SSI’s first game in late January of 1980. Still not the slightest bit interested in disguising its origins in the Avalon Hill Bismarck, Joel titled it simply Computer Bismarck. No matter. Computer Bismarck, generally regarded today as the first serious wargame to appear on a microcomputer, made for a very impressive product for those in SSI’s target demographic. Recognizing the need to present a professional appearance — especially in light of Computer Bismarck‘s $60 price tag, four or five times the price of the typical computer game at the time — Joel had taken the unusual step of hiring an artist and packaging designer for SSI right out of the gate. In an industry still dominated by Ziploc baggies stuffed with hand-scrawled photocopied title cards, Computer Bismarck shipped in an actual box sporting Louis Saekow’s ominous head-on graphic of the Bismarck itself. Inside was not only a real, professionally typeset manual but also a generous collection of player aids, including a map and counters for keeping track of those aspects of the strategic situation that the program, even with the aid of the Apple II’s bitmap graphics, couldn’t always show.

Through the good offices of the well-connected Trip Hawkins, Joel made his first significant sale in early February: 50 copies of the game to the Los Altos ComputerLand. A week later SSI moved out of Joel's apartment, where by the end he had been forced to wind his way through a hedge maze built from the first 1000 copies of Computer Bismarck just to reach his bed. After the move, the first of six to ever-larger digs that SSI would make over the next decade, Joel hung a map of the United States on the wall. Every time an SSI game sold in a new city, he'd put a pin in the map. Within six weeks, the map was positively bristling with them. Its purpose served, Joel pulled the map off the wall.

They were on their way, but budgets were decidedly tight. That first office space was nothing but a big empty room. Unable to afford cubicles, they made “offices” out of walls of boxes. “When someone grumbled later about not having an office,” Susan remembers, “we’d say the president had a wall of boxes for an office, so you’re in good company.” Despite working for an alleged computer company, Susan managed all of the accounts on paper, with the aid of only “one of those out-of-the-movies adding machines that only does addition and subtraction.” At $15 at the local surplus store, the price had been right.

SSI spent all the money they weren't spending inside their offices trying to make a good impression outside of them. Determined to advertise Computer Bismarck as something genuinely new under the sun, they came up with a catchy slogan: "The $2160 Wargame!" (The extra $2100, of course, referred to the approximate cost of the Apple II system needed to run it.) Just as Joel had hoped, Computer Bismarck attracted significant attention in traditional wargaming circles, getting big writeups in hardcore magazines like Fire and Movement. Computerized wargames were "here at last," wrote Joel in his "Designer's Notes" addendum to that article, "and I suggest you run out and buy a home computer as soon as you can justify it to your wife, girlfriend, or mother." And at least to some extent his readers apparently did. SSI wound up selling almost 8000 copies of Computer Bismarck. That number may not sound spectacular today, but it wasn't bad for a niche product in what remained a niche industry. By year's end Lyon and his team had churned out a few more wargames, somewhat less blatantly cloned than their first. SSI's year-end balance sheet showed a loss of $60,000, but that was hardly unexpected for a first-year startup. They believed they were well on the road to profitability. At the same time, though, Joel was well on the road to overhauling the way that SSI did business.

Joel with a single computer and a homemade sign at the June 1980 Origins gaming convention.

What caused him to rethink his plans was an unsolicited and thoroughly unexpected package that arrived within months of the release of Computer Bismarck. In the package was a game from an Arkansan named Dan Bunten,[1] a football simulation that used the Apple II's optional paddle controllers to brilliant effect. Bunten wanted to know if SSI would be interested in publishing it. Hard as it may be to believe, this was a business model that had never occurred to Joel. Instead of killing themselves to design and program all these games in-house, SSI could curate games from outside developers — handle the packaging and marketing while leaving the tough, unpredictable creative effort to others. If Joel needed any further convincing, the fact that Bunten's slick football game made SSI's in-house games look rather workmanlike provided plenty. SSI published Computer Quarterback in September of 1980 as their first externally developed game. It promptly became by far the fastest seller in their catalog, just in case Joel needed yet further convincing.

SSI’s year-end 1980 “business plan,” really a state-of-the-business report, incorporates an important change from the original plan: “SSI is now relying on outside designers to provide roughly half of all new products.” That percentage would only increase in the years to come. Joel’s original vision of SSI as a sort of wargames factory, with a small army of programmers beavering away to churn out games, would never materialize. No big loss. This new way worked so much better.

As the existence of Computer Quarterback will attest, SSI's games almost immediately began to depart from the most literal definition of a wargame. Within a few years they would add to their growing military-history library not only more sports games, but also economic simulations, political challenges, and science-fictional scenarios. Somewhat to the chagrin of Joel, a hardcore military wargamer first and last, the average non-military game actually sold much better than the average military one; the biggest sellers of all in SSI's first few years were Computer Quarterback and Computer Baseball.

Yet, like the games of most publishers carving out an identity in the young industry, SSI's games did all tend to share a personality. In an earlier article, I described that personality as "almost aggressively off-putting." While not the kindest description I've ever written, I think it holds true to the way the average non-wargamer perceived them. It's right there in the name of the company that made them. These games were very eager to brand themselves as thinking people's strategic simulations rather than mere games. Rather than minimizing complexities, they reveled in them — that's to say, they reveled in as many complexities as it was actually possible to generate on a 48 K Apple II. As in the tabletop wargames that inspired them, mechanical elegance, interface, and aesthetics all took a back seat to the idea of recreating history. It may sound like stereotyping to say that most of SSI's games were written by serious-minded bearded men in home offices whose walls were lined with military-history books… but, well, most of SSI's games were written by serious-minded bearded men in home offices whose walls were lined with military-history books. Long after the rest of the industry had sworn off BASIC for high-performance machine language, SSI continued to happily accept and publish games written in pure BASIC, hundreds of lines of amateurish spaghetti code. For the SSI hardcore, who, like tabletop wargamers, loved to explore and tinker with rules in the name of historical accuracy or what-if scenarios, the use of easily listable and modifiable BASIC was as much a plus as a minus. The great Sid Meier gave us the maxim that "fun trumps realism" in game design. One might say that SSI's games took the opposite position. But, almost paradoxically, for the niche of people on their wavelength the realism — or the abstract idea of realism, whatever the actual reality of simulation on a 48 K Apple II — was the fun.

For everyone else, the appeal of these baroque, balky, bulky creations remained a mystery. The shops often didn’t know quite what to do with them. Here’s Ed Thomas, a former manager of Software Etc.’s showcase store in Manhattan:

The boxes were half again as big as any other box on the shelf, and they were these intricate wargames with names like Beachhead: Moscow, 1944. I hated those boxes. The only way to display them was to put them on the top shelf, which messed up the order I was trying to establish. In addition, the covers weren’t very attractive, and I never had enough of any one title to face-out the boxes. These damned over-sized, ugly boxes were not at all worth the trouble they caused. I took an immediate dislike to the company that was giving me such a hard time.

That was my first encounter with Strategic Simulations, Inc., a company filled, I was sure, with people who, when not writing intricate computer code, were in a military-style war room recreating D-Day.

SSI proved uniquely impervious to the depredations of the software pirates who were causing so much outrage elsewhere in the industry. Their foolproof method of copy protection didn't involve mismatched sector numbers or manual-lookup schemes. It was rather the simple fact that most of the people who copied and traded games couldn't have cared less about those of SSI. The piracy scene just couldn't be bothered, unless it was to have an occasional game to mock for its ugly graphics, its slowness, and its sheer BASICness.

Joel poses in 1982 with Pursuit of the Graf Spee, the only SSI game he designed and programmed himself — albeit only by cribbing liberally from Computer Bismarck. It sold just 2082 units.

The niche audience for SSI's games — niche even by the standards of the still tiny software industry in general — sharply limited the potential sales of each of them. As Joel himself put it in 1982, "I'm just a niche in a subset." And then there were so many sub-niches within SSI's niche: a dedicated sports gamer raised on Strat-O-Matic Football might not care about military titles at all, while a World War II buff might have no interest in American Civil War games. Relying on the fact that many of the dedicated hardcore would buy lots of games within the sub-niche that did appeal to them, SSI made it up in the sheer volume of titles they published. They really were astoundingly prolific. Already in that year-end 1980 business plan, they were planning to leverage all those outside designers to release a new game for every month of 1981. Shockingly, they pulled it off, and kept right on flooding the market with titles thereafter.

In the first four years of SSI's existence, they released no fewer than 43 separate games, not counting ports and enhanced editions. Most of these never came close to cracking five digits in total unit sales. Some barely sold 2000 copies. Precisely three of them cracked 20,000 units, with the most successful of them all, Computer Baseball, a real outlier at over 45,000 copies sold. Titles like that presumably broke through to some extent beyond the SSI hardcore. But mostly SSI relied on the fanatically loyal customers who bought lots of their games and quite possibly no games at all from anyone else. With virtually none of their games selling in quantities large enough to meet even the most generous definition of a hit, their ever-expanding back catalog was everything. Each SSI game, even those that initially struggled to sell 1000 units, remained available for years. It would, for instance, still be possible to buy a brand-new copy of good old Computer Bismarck in 1986 — and still for a full $60 at that.

It was a comfortable niche as niches go, but there was only room for one company there. About six months after Computer Bismarck, Avalon Hill, as Tom Shaw had once told Joel they would, started their own line of computerized wargames. That, combined with the existence of Computer Bismarck, was a recipe for trouble. Sure enough, Avalon Hill was soon marketing computerized versions of some of their other tabletop classics using the same prefix: Computer Diplomacy, Computer Football Strategy, Computer Circus Maximus. Just to aggravate the confusion, Avalon Hill coincidentally released a second edition of the tabletop Bismarck, which had been out of print for a number of years, the very same year as Computer Bismarck. With the two companies in direct competition, a call from the lawyers was inevitable. SSI got off relatively easy: sued for trademark and copyright infringement in 1984, they settled by agreeing to pay Avalon Hill a lump sum of $30,000 and a 5 percent royalty on future sales — which, given that Computer Bismarck was by then almost five years old and creakily archaic, were likely to be modest at best even for the back-catalog-driven SSI.

All told, Avalon Hill plugged away at the computer thing for a good five years, but despite the drawing power of their name among tabletop veterans they could never quite catch up to SSI in either sales or wargamer respect, could never quite get their computer division to turn a real profit. Part of the problem was doubtless that their games, being programmed by a rather unimaginative in-house team, were "really simple," as Joel puts it, in comparison to SSI's — not a good thing to players who craved the validation of complexity. And part of their problem was doubtless just the disadvantages of being second. SSI already owned this market. Stymieing the giant that had once blown him off had to bring a smile of vindication even to the mild-mannered face of Joel Billings.

Less happily for SSI, other markets had owners as well. When SSI tried to branch out from their slow, cerebral signature games, it just didn't work for them. In 1982, they launched a line they called RapidFire, consisting of faster-paced, more graphically impressive games, generally programmed and released first on the more audiovisually capable Atari 8-bit line rather than the Apple II. Among the RapidFire games was Dan Bunten's pioneering proto-real-time-strategy game Cytron Masters. But sales weren't notably better than those of their typical wargames: Cytron Masters sold just 4702 units. And as competition heated up, it became difficult for little SSI to retain developers who didn't work firmly in the company's own niche. Dan Bunten, for instance, was lured away by Trip Hawkins's new Electronic Arts shortly after finishing Cytron Masters. SSI soon returned to focusing exclusively on the types of games with which Joel was most comfortable.

They did have one valuable ally in their corner in the increasingly competitive industry. In 1981, a Baptist minister and veteran tabletop gamer named Russell Sipe contacted Joel to ask his opinion on a potential magazine that would exclusively cover computer games, focusing on those of an intellectual, wargamey bent. Recognizing a kindred spirit immediately, Joel was very supportive, even committing his own still fragile venture to buying extensive advertising in the new publication. Computer Gaming World became so associated with SSI in its early years that one might be excused if one took it for SSI’s own publication. The slim first issue, for example, includes an extended feature-length review of SSI’s new Torpedo Fire; a review of, playing tips for, and an after-action report from SSI’s President Elect; and a “greatest baseball team of all time” tournament conducted using SSI’s Computer Baseball. This de facto partnership, born like most things involving SSI of shared interests and genuine affection rather than guile, served SSI well for many years, helping to get their niche games in front of just the right niche of potential buyers. SSI grew cautiously but healthily year by year, from sales of $317,000 in 1980 to over $3 million in 1984. The employee rolls grew to match, from 11 at the end of 1980 to 32 at the end of 1984.

While the vast majority of the games were provided by outside developers (the aforementioned serious-minded bearded men), just packaging and coming up with manuals and other supporting materials for a new game every single month was a herculean task, especially given that SSI generally did a very good job with such things; these were expensive games, and they needed to look it. In lieu of the army of programmers — SSI’s in-house development group, while never entirely eliminated, remained much smaller than originally planned — an army (or at any rate a small brigade) of other personnel came on board to design the packaging, write the manuals, ship the games, and deal with all the other logistics of running a growing business. Some of these folks were, like Joel, hardcore gamers delighted to be spending their days in what Joel’s eventual wife came to call “a treehouse for wargamers.” For the rest, the folks like Susan Billings, it was just a job, but a pretty great job all the same. Joel, apart from his one eccentric habit of wearing a three-piece suit to work every day, was as easygoing, down-to-earth, and reasonable a boss as anyone could ever wish for. But it was at least as much Susan who set the tone of the workplace while Joel was hopelessly lost inside his wargames: “It was the opportunity to try to create the perfect work environment so people would want to come to work. It was the chance of a lifetime to develop a company using your style, based on your style, and doing it with someone from your family.” So, yes, SSI was a very happy place — as happy in its way as the legendarily happy Infocom, and for a much longer stretch of time.

SSI employees Tena Lawry and Connie Barron boogie down as the “Simulated Bunnies” because… well, just because.

Tena Lawry, who would later become SSI’s senior purchaser, joined in 1981 as a temporary disk copier, responsible for shoving disks into drives and then dropping them into boxes all day long. (If you could put toast in a toaster, you were qualified, says Tena wryly.) Tena:

We broke for lunch and Joel walked in with five pizzas. We all sat on the floor munching away and an announcement was made that we were going to have a rousing game of Nuclear War [a Flying Buffalo game]. Now I’m nervous. I figured we were going to play some intense videogame. I hadn’t even mastered Pac-Man yet, so this would be interesting.

Nuclear War turned out to be a card game in which you amass missiles and such and then trump your opponent in an attempt to annihilate his population. At one point, I dropped a major nuclear payload on Joel. I thought at this point that this may not have been the politically correct thing to do. After all, Joel was the president of SSI and I had just wiped out his entire population. But I soon found out that Joel always appreciates a good game strategist even if it means a pile of dead-body cards.

That night at dinner my family asked me what I had done on my first day at SSI. I said I copied disks, assembled games, and obliterated an entire population while eating pizza. Silence fell over the table. “Just kidding,” I said.

It fell to Susan Billings to address a delicate problem when SSI’s technical staff — hackers being hackers — started to spend much too long in front of their computers between hygiene breaks. She handled the situation with humor, grace, and aplomb, as she did most situations at SSI. Old timers laugh about the infamous “B.O. Memo” to this day.

At the time that Susan was writing that memo, SSI was tentatively trying to branch out again into a new genre. Thankfully, this expansion would be more successful than the RapidFire line had been. Indeed, in the fullness of time it would lead to a transformative deal with the titan of the other side of the tabletop industry, the yin to Avalon Hill’s yang. We’ll step back next time to look at what set that titan on a collision course with Joel Billings’s modest little treehouse for wargamers.

(Sources: This article is largely drawn from the collection of documents that Joel Billings donated to the Strong Museum of Play, which includes lots of internal SSI documents and some press clippings. Also, Matt Barton’s YouTube interviews with Billings.)

Footnotes

[1] Dan Bunten later became Danielle Bunten Berry, and lived until her death in 1998 under that name. As per my usual editorial policy on these matters, I refer to her as "he" and by her original name only to avoid historical anachronisms and to stay true to the context of the times.

Wasteland

We can date the formal beginning of the Wasteland project to the day in December of 1985 when Brian Fargo, head of Interplay, flew out to Arizona with his employee Alan Pavlish to meet with Michael Stackpole. If all went well at the meeting, Pavlish was to join Stackpole and Ken St. Andre as the third member of the core trio who would guide the game to release. His role, however, would be very different from that of his two colleagues.

A hotshot programmer's programmer, Pavlish, though barely twenty years old, had been kicking around the industry for several years already. Before Interplay existed, he'd done freelance work on Commodore VIC-20 games for the company's earlier incarnation as Boone Corporation, and done ports of games like Murder on the Zinderneuf to the Apple II and Commodore 64 for another little company called Designer Software. When Pavlish came to work for Interplay full-time, Fargo had first assigned him to similar work: he ported the non-Interplay game Hacker to the Apple II for Activision. (In those pre-Bard's Tale days, Fargo was still forced to accept such unglamorous work to make ends meet.) But Fargo had huge respect for Pavlish's abilities. When the Wasteland idea started to take off while his usual go-to programming ace Bill Heineman (who now lives as Rebecca Heineman; as per my usual editorial policy on these matters, I refer to her as "he" and by her original name only to avoid historical anachronisms and to stay true to the context of the times) was still swamped with the Bard's Tale games and Interplay's line of illustrated text adventures, Fargo didn't hesitate to throw Pavlish in at the deep end: he planned to make him responsible for bringing the huge idea that was Wasteland to life on the little 64 K 8-bit Apple II and Commodore 64.

However, when Fargo and Pavlish got out of their airplane that day, it was far from certain that there would be a Wasteland project for Pavlish to work on at all. In contrast to St. Andre, Stackpole was decidedly skeptical, and for very understandable reasons. His experiences with computer-game development to date hadn't been happy ones. Over the past several years, he'd been recruited to three different projects and put considerable work into each, only to see each come to naught in one way or another. Thanks largely to the influence of Paul Jaquays (who now lives as Jennell Jaquays), another tabletop veteran who headed Coleco's videogame-design group during the first half of the 1980s, he'd worked on two games for the Coleco Adam, a would-be challenger in the home-computer wars. The more intriguing of the two, a Tunnels & Trolls adaptation, got cancelled before release. The other, an adaptation of the film 2010: Odyssey Two, was released only after the Adam had flopped miserably and been written off by Coleco; you can imagine how well that game sold. He'd then accepted a commission from science-fiction author cum game developer Fred Saberhagen to design a computer game that took place in the world of his Book of Swords trilogy. (Stackpole had already worked with Flying Buffalo on a board game set in the world of Saberhagen's Berserker series.) The computerized Book of Swords had gone into stasis when it became clear that Berserker Works, the development company Saberhagen had founded, just didn't have the resources to finish it.

So, yes, Stackpole needed some convincing to jump into the breach again with tiny Interplay, a company he'd never heard of. (Interestingly, he did have one connection to Interplay, through Bard's Tale designer Michael Cranford, who had sent Flying Buffalo a Tunnels & Trolls solo adventure of his own devising around 1983. Stackpole thought it showed promise, but that it wasn't quite there yet, so he sent it back with some suggestions for improvement and a promise to look at it again if Cranford followed through on them. He never heard another word from him; presumably it was right about this time that Cranford got busy making The Bard's Tale.) Luckily for Interplay, Stackpole, Fargo, and Pavlish all got along like a house on fire on that December day. Fargo and Pavlish persuaded Stackpole that they shared — or at least were willing to accommodate — his emerging vision for Wasteland: a computer game that would be a game and a world first, a program second. Stackpole:

Programmers design beautiful programs, programs that work easily and simply; game designers design games that are fun to play. If a programmer has to make a choice between an elegant program and a fun game element, you’ll have an elegant program. You need a game designer there to say, “Forget how elegant the program is — we want this to make sense, we want it to be fun.”

I was at a symposium where there were about a dozen people. When asked to tell what we were doing, what I kept hearing over and over from programmer/game designers was something like “I’ve got this neat routine for packing graphics, so I’m going to do a fantasy role-playing game where I can use this routine.” Or a routine for something else, or “I’ve got a neat disk sort,” or this or that. And all of them were putting these into fantasy role-playing games. Not to denigrate their skills as programmers — but that’s sort of like saying, “Gee, I know something about petrochemicals, therefore I’m going to design a car that will run my gasoline.” Well, if you’re not a mechanical engineer, you don’t design cars. You can be the greatest chemist in the world, but you’ve got no business designing a car. I’d like to hope that Wasteland establishes that if you want a game, get game designers to work with programmers.

This vision, cutting as it does so much against the way that games were commonly made in the mid-1980s, would have much to do with both where the eventual finished Wasteland succeeds and where it falls down.

Ditto the game’s tabletop heritage. As had been Fargo’s plan from the beginning, Wasteland‘s rules would be a fairly faithful translation of Stackpole’s Mercenaries, Spies, and Private Eyes tabletop RPG, which was in turn built on the foundation of Ken St. Andre’s Tunnels & Trolls. A clear evolutionary line thus stretched from the work that St. Andre did back in 1975 to Wasteland more than a decade later. No CRPG to date had tried quite as earnestly as Wasteland would to bring the full tabletop experience to the computer.

You explore the world of Wasteland from a top-down perspective instead of the first-person view of The Bard’s Tale. Note that this screenshot and the ones that follow come from the slightly later (and vastly more pleasant to play) MS-DOS port rather than the 8-bit original.

Early in the new year, Stackpole and St. Andre visited Interplay's California offices for a week to get the process of making Wasteland rolling. St. Andre arrived with a plot already dreamed up. Drawing heavily from the recent ultra-violent action flick Red Dawn, it posited a world where mutually assured destruction hadn't proved so mutual after all: the Soviet Union had won the war, and was now occupying the United States. The player would control a group of American freedom fighters skulking around the farmlands of Iowa, trying to build a resistance network. St. Andre and Stackpole spent a month or more after their visit to California drawing maps of cornfields and trying to find ways to make an awful lot of farmers seem different from one another. (Some of this work can be seen in the Agricultural Center in the finished Wasteland.) But finally the pair had to accept the painful truth: the game they were designing was boring. "I said it will be the dullest game you ever saw," remembers St. Andre, "because the Russians would be there in strength, and your characters start weak and can't do anything but skulk and hide and slowly, slowly build up."

St. Andre suggested moving the setting to the desert of the American Southwest, an area with which he, being born and raised in Arizona, was all too familiar. The region also had a certain thematic resonance, being intimately connected with the history of the atomic bomb. The player’s party might even visit Las Vegas, where folks had once sat on their balconies and watched the mushroom clouds bloom. St. Andre suggested nixing the Soviets as well, replacing them with “ravening monsters stalking through a radioactive wasteland, a few tattered humans struggling to survive against an overwhelming threat.” It meant chucking a fair amount of work, but Fargo agreed that it sounded too good to pass up. They might as well all get used to these sorts of false starts. Little would go smoothly or according to plan on this project.

After that first week at Interplay, St. Andre and Stackpole worked from home strictly in a design role, coming up with the plans for the game that were then left to Pavlish in California to implement in code — still an unusual way of working in the mid-1980s, when even many of the great designers, like Dan Bunten (in what must be a record for notes of this type, I have to note once again that Dan Bunten later became Danielle Bunten Berry, and lived until her death in 1998 under that name) and Sid Meier, tended to also be great programmers. But St. Andre and Stackpole used their computers — a Commodore 64 in the case of the former, a battered old Osborne luggable in that of the latter — to do nothing more complex than run a word processor. Bundle after bundle of paper was shipped from Arizona to California, in the form of both computer printouts and reams of hand-drawn maps. St. Andre and Stackpole worked, in other words, largely the same way they would have had Wasteland been planned as a new tabletop adventure module.

Wasteland must be, however, one hell of a big adventure module. It soon became clear that the map-design process, entailing as it did the plotting of every single square with detailed descriptions of what it contained and what the party should be able to do there, was overwhelming the two. St. Andre:

I hadn’t thought a great deal about what was going to be in any of these places. I just had this nebulous story in my mind: our heroes will start in A, they’ll visit every worthwhile place on the map and eventually wind up in Z — and if they’re good enough, they’ll win the game. Certain things will be happening in different locations — monsters of different types, people who are hard to get along with, lots of comic references to life before the war. I figured that when the time came for me to design an area, the Indian Village, for example, I would sit down and figure out what would be in it and that would be it. Except that it started taking a long time. Every map had 1024 squares on it, and each one could do something. Even if I just drew all the buildings, I had to go back and say, “These are all square nine: wall, wall, wall, wall, wall. And if you bump into a wall you’ll get this message: ‘The Indians are laughing at you for walking into a wall.'” Whatever — a map that I thought I could toss off in one or two days was taking two weeks, and the project was falling further and further behind.

Fargo agreed to let St. Andre and Stackpole bring in their old Flying Buffalo buddies Liz Danforth and Dan Carver to do maps as well, and the design team just continued to grow from there. “The guys who were helping code the maps, correcting what we sent in, wanted to do some maps,” remembers Stackpole. “Everyone wanted to have his own map, his own thumbprint on the game.”

Even Fargo himself, who could never quite resist the urge to get his own hands dirty with the creations of this company he was supposed to be running from on high, begged for a map. “I want to do a map. Let me have Needles,” St. Andre remembers him saying. “So I said, ‘You’re the boss, Brian, you’ve got Needles.'” But eventually Fargo had to accept that he simply didn’t have the time to design a game and run a company, and the city of Needles fell to another Interplay employee named Bruce Balfour. In all, the Wasteland manual credits no fewer than eight people other than St. Andre and Stackpole with “scenario design.” Even Pavlish, in between trying to turn this deluge of paper into code, managed to make a map or two of his own.

Wasteland is one of the few computer games in history in which those who worked on the softer arts of writing and design outnumbered those who wrote the code and drew the pictures. The ratio isn't even close: the Wasteland team included exactly one programmer (Pavlish) and one artist (Todd J. Camasta) alongside ten people who contributed only to the writing and design. One overlooked figure in the design process, who goes wholly uncredited in the game's manual, was Joe Ybarra, Interplay's liaison with their publisher Electronic Arts. As he did with so many other classic games, Ybarra offered tactful advice and generally did his gentle best to keep the game on course, even going so far as to fly out to Arizona to meet personally with St. Andre and Stackpole.

Those two found themselves spending as much time coordinating their small army of map designers as they did doing maps of their own. Stackpole:

Work fell into a normal pattern. Alan and I would work details out, I’d pass it down the line to the folks designing maps. If they had problems, they’d tell me, Alan and I would discuss things, and they’d get an answer. In this way the practical problems of scenario design directly influenced the game system and vice versa. Map designers even talked amongst themselves, sharing strategies and some of these became standard routines we all later used.

Stackpole wound up taking personal responsibility for the last third or so of the maps, where the open world begins funneling down toward the climax. St. Andre:

I’m fairly strong at making up stories, but not at inventing intricate puzzles. In the last analysis, I’m a hack-and-slash gamer with only a little thought and strategy thrown in. Interplay and Electronic Arts wanted lots of puzzles in the game. Mike, on the other hand, is much more devious, so I gave him the maps with difficult puzzles and I did the ones that involved walking around, talking to people, and shooting things.

The relationship between these two veteran tabletop designers and Pavlish, the man responsible for actually implementing all of their schemes, wasn’t always smooth. “We’d write up a map with all the things on it and then Alan would say, ‘I can’t do that,'” says St. Andre. There would then follow some fraught discussions, doubtless made still more fraught by amateur programmer St. Andre’s habit of declaring that he could easily implement what was being asked in BASIC on his Commodore 64. (Stackpole: “It’s like a duffer coming up to Arnold Palmer at an average golf course and saying, ‘What do you mean you can’t make that 20-foot putt? I can make a 20-foot putt on a miniature golf course.'”) One extended battle was over the question of grenades and other “area-effect” weapons: St. Andre and Stackpole wanted them, Pavlish said they were just too difficult to code and unnecessary anyway. Unsung hero Joe Ybarra solved that one by quietly lobbying Fargo to make sure they went in.

One aspect of Wasteland that really demonstrates St. Andre and Stackpole’s determination to divorce the design from the technology is the general absence of the usual numbers that programmers favor — i.e., the powers of two that fit so neatly into the limited memories of the Apple II and Commodore 64. Pavlish instinctively wanted to make the two types of pistols capable of holding 16 or 32 bullets. But St. Andre and Stackpole insisted that they hold 7 or 18, just like their real-world inspirations. As demonstrated by the 1024-square maps, the two did occasionally let Pavlish get away with the numbers he favored, but they mostly stuck to their guns (ha!). “It’s going to be inelegant in terms of space,” admits Stackpole, “but that’s reality.”

Logic like this drove Pavlish crazy, striving as he was to stuff an unprecedentedly complex world into an absurdly tiny space. Small wonder that there were occasional blowups. Slowly he learned to give every idea that came from the designers his very best try, and the designers learned to accept that not everything was possible. With that tacit agreement in place, the relationship improved. In the latter stages of the project, St. Andre and Stackpole came to understand the technology well enough to start providing their design specifications in code rather than text. “Then we could put in the multiple saving throws, the skill and attribute checks,” says St. Andre. “Everything we do in a [Tunnels & Trolls] solitaire dungeon suddenly pops up in the last few maps we did for Wasteland because Mike and I were doing the actual coding.”

When not working on the maps, St. Andre and Stackpole — especially the latter, who came more and more to the fore as time went on — were working on the paragraph book that would contain much of Wasteland‘s story and flavor text. The paragraph book wasn’t so much a new idea as a revival of a very old one. Back in 1979, Jon Freeman’s Temple of Apshai, one of the first CRPGs to arrive on microcomputers, had included a booklet of “room descriptions” laid out much like a Dungeons & Dragons adventure module. This approach was necessitated by the almost unbelievably constrained system for which Temple of Apshai was written: a Radio Shack TRS-80 with just 16 K of memory and cassette-based storage. Moving into the late 1980s, the twilight years of the 8-bit CRPG, designers were finding the likes of the Apple II and Commodore 64 as restrictive as Freeman had the TRS-80 for the simple reason that, while the former platforms may have had four times as much memory as the latter, CRPG design ambitions had grown by at least the same multiple. Moving text, a hugely expensive commodity in terms of 8-bit storage, back into an accompanying booklet was a natural remedy. Think of it as one final measure to wring just a little bit more out of the Apple II and Commodore 64, those two stalwart old warhorses that had already survived far longer than anyone had ever expected. And it didn’t hurt, of course, that a paragraph book made for great copy protection.

While the existence of a Wasteland paragraph book in itself doesn’t make the game unique, St. Andre and Stackpole were almost uniquely prepared to use theirs well, for both had lots of experience crafting Tunnels & Trolls solo adventures. They knew how to construct an interactive story out of little snippets of static text as well as just about anyone, and how to scramble it in such a way as to stymie the cheater who just starts reading straight through. Stackpole, following a tradition that began at Flying Buffalo, constructed for the booklet one of the more elaborate red herrings in gaming history, a whole alternate plot easily as convoluted as that in the game proper involving, of all things, a Martian invasion. All told, the Wasteland paragraph book would appear to have easily as many fake entries as real ones.

For combat, the display shifts back to something very reminiscent of The Bard’s Tale, with the added tactical dimension of a map showing everyone’s location that you can access by tapping the space bar. And yes, you fight some strange foes in Wasteland

Wasteland‘s screen layout often resembles that of The Bard’s Tale, and one suspects that there has to be at least a little of the same code hidden under its hood. In the end, though, the resemblance is largely superficial. There’s just no comparison in terms of sophistication. While it’s not quite a game I can love — I’ll try to explain why momentarily — Wasteland does unquestionably represent the bleeding edge of CRPG design as of its 1988 release date. CRPGs on the Apple II and Commodore 64 in particular wouldn’t ever get more sophisticated than this. Given the constraints of those platforms, it’s honestly hard to imagine how they could.

Key to Wasteland‘s unprecedented sophistication is its menu of skills. Just like in Mercenaries, Spies, and Private Eyes, you can tailor each of the up to four characters in your party as you will, free from the restrictive class archetypes of Dungeons & Dragons (or for that matter Tunnels & Trolls). Skills range from the obviously useful (Clip Pistol, Pick Lock, Medic) to the downright esoteric (Metallurgy, Bureaucracy, Sleight of Hand). And of course career librarian St. Andre made sure that a Librarian skill was included, and of course made it vital to winning the game.

Also as in Mercenaries, Spies, and Private Eyes, a character's chance of succeeding at just about anything is determined by adding her level in a relevant skill, if any, to a relevant core attribute. For example, to determine a character's chance of climbing something using her Climb skill, the game will also look to her Agility. The system allows a range of solutions to most of the problems you encounter. Say you come to a locked door. You might have a character with the Pick Lock skill try getting in that way. Failing that, a character with the Demolition skill and a bit of handy plastic explosive could try blasting her way in. Or a strong character might dispense with skills altogether and just try to bash the door down using her Strength attribute. Although a leveling mechanism does exist that lets you assign points to characters' skills and attributes, skills also improve naturally with use, a mechanism not seen in any previous CRPG other than Dungeon Master (a game that's otherwise about as different from Wasteland as a game can be and still be called a CRPG).
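
The arithmetic here is simple enough to sketch in a few lines of Python. To be clear, the die size, difficulty numbers, and data layout below are my own illustrative guesses rather than anything lifted from Wasteland itself; the sketch only shows the shape of the mechanic, in which a relevant attribute plus a skill level plus a bit of luck is measured against an obstacle, and several different pairings can crack the same problem.

```python
import random

def skill_check(attribute: int, skill_level: int, difficulty: int) -> bool:
    # Attribute plus skill level plus a die roll against a target number.
    # The d20-style roll and the difficulty scale are illustrative guesses,
    # not Wasteland's actual numbers.
    return random.randint(1, 20) + attribute + skill_level >= difficulty

def try_locked_door(character: dict) -> bool:
    # Several skill/attribute pairings can defeat the same obstacle,
    # mirroring the locked-door example above. A skill of None means
    # brute attribute use, with no training required.
    approaches = [
        ("Pick Lock", "Agility", 18),   # finesse it open
        ("Demolition", "IQ", 20),       # blow it open
        (None, "Strength", 22),         # just bash it down
    ]
    for skill, attribute, difficulty in approaches:
        level = character["skills"].get(skill, 0) if skill else 0
        if skill and level == 0:
            continue  # untrained in this skill; try the next approach
        if skill_check(character["attributes"][attribute], level, difficulty):
            return True
    return False

# A hypothetical party member with a little lock-picking training:
ranger = {
    "attributes": {"Agility": 12, "IQ": 9, "Strength": 15},
    "skills": {"Pick Lock": 2},
}
print(try_locked_door(ranger))
```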

The skills system makes Wasteland a very different gameplay experience from Ultima V, its only real rival in terms of 8-bit CRPG sophistication at the time of its release. For all its impressive world-building, Ultima V remains bound to Richard Garriott's standard breadcrumb-trail philosophy of design; beating it depends on ferreting out a long string of clues telling you exactly where to go and exactly what to do. Wasteland, by contrast, can be beaten many ways. If you can't find the password the guard wants to let you past that locked gate, you can try an entirely different approach: shoot your way in, blow the gate open, or pick the lock on the back door and sneak in. It's perhaps the first CRPG ever that's really willing to let you develop your own playing personality. You can approach it as essentially a post-apocalyptic Bard's Tale, making a frontal assault on every map and trying to blow away every living creature you find there, without concerning yourself overmuch with whether it be good or evil, friend or foe. Or you can play it — relatively speaking — cerebrally, trying to use negotiations, stealth, and perhaps a little swindling to get what you need. Or you can be like most players and do a bit of both, as the mood and opportunity strike you. It's very difficult if not impossible to get yourself irretrievably stuck in Wasteland. There are always options, always possibilities. While it's far less thematically ambitious than Ultima V — unlike the Ultima games, Wasteland was never intended to be anything more or less than pure escapist entertainment — Wasteland‘s more flexible, player-friendly design pointed the way forward while Ultima V was still glancing back.

Indeed, a big part of the enduring appeal of Wasteland to those who love it is the sheer number of different ways to play it. Interplay picked up on this early, and built an unusual feature into the game: it's possible to reset the entire world to its beginning state while keeping the same group of lovingly developed characters. Characters can advance to ridiculous heights if you do this enough, taking on some equally ridiculous "ranks": "1st Class Fargo," "Photon Stud," etc., culminating in the ultimate achievement of the level 183 "Supreme Jerk." This feature lets veteran players challenge themselves by, say, trying to complete the game with just one character, and gives an out to anyone who screws up her initial character creation too badly and finds herself overmatched: she can just start over and replay the easy bits with the same party, hopefully gaining enough experience to correct its failings. It takes some of the edge off one of the game's most obvious design flaws: it's all but impossible to know which skills are actually useful until you've made your way fairly deep into the game.

The very fact that re-playing Wasteland requires you to reset its world at all points to what a huge advance it represents over the likes of The Bard's Tale. The first CRPG I know of that has a truly, comprehensively persistent world, one in which the state of absolutely everything is saved, is 1986's Starflight (a game that admittedly is arguably not even a CRPG at all). But that game runs on a "big" machine in 1980s terms, an IBM PC or clone with at least 256 K of memory. Wasteland does it in 64 K, rewriting every single map on the fly as you play to reflect what you've done there. Level half of the town of Needles with explosives early in the game, and it will still be leveled when you return many days later. Contrast that with The Bard's Tale, which remembers nothing but the state of your characters when you exit one of its dungeon levels, letting you fight the same big boss battles over and over again if you like. The persistence allows you, the player, to really affect the world of Wasteland in big-picture ways that were well-nigh unheard-of at the time of its release, as Brian Fargo notes:

Wasteland let you do anything you wanted in any order you wanted, and you could get ripple effects that might happen one minute later or thirty minutes later, a lot like [the much later] Grand Theft Auto series. The Ultima games were open, but things tended to be very compartmentalized, they didn’t ripple out like in Wasteland.
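
Stripped of the heroics required to pull it off in 64 K, the underlying mechanism is disciplined bookkeeping rather than magic: every time you leave a map, its current state gets written back out, and every time you enter one, any saved copy takes precedence over the pristine original. Here's a minimal sketch of that idea in modern Python, with all names and the JSON format being my own conveniences; the real game rewrote raw map data on its floppy disks.

```python
import json
from pathlib import Path

SAVE_DIR = Path("save")  # hypothetical save location, not Wasteland's actual disk layout

def enter_map(name: str, pristine: dict) -> dict:
    # A previously modified map takes precedence over the pristine original.
    saved = SAVE_DIR / f"{name}.json"
    if saved.exists():
        return json.loads(saved.read_text())
    return pristine[name]

def leave_map(name: str, state: dict) -> None:
    # Write the map's current state back out on every exit, so that
    # leveling half of Needles is still visible on a return visit.
    SAVE_DIR.mkdir(exist_ok=True)
    (SAVE_DIR / f"{name}.json").write_text(json.dumps(state))

pristine_maps = {"needles": {"buildings": ["ranger center", "downtown"]}}
town = enter_map("needles", pristine_maps)
if "downtown" in town["buildings"]:
    town["buildings"].remove("downtown")  # the player gets creative with explosives
leave_map("needles", town)
# Run this twice: the second time, downtown is already gone. Persistence at work.
```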

Wasteland is a stunning piece of programming, a resounding justification for all of the faith Fargo placed in the young Alan Pavlish. Immersed in the design rather than the technical end of things as they were — which is itself a tribute to Pavlish, whose own work allowed them to be — St. Andre and Stackpole may still not fully appreciate how amazing it is that Wasteland does what it does on the hardware it does it on.

All of which rather raises the question of why I don’t enjoy actually playing Wasteland a little more than I do. I do want to be careful here in trying to separate what feel like more objective faults from my personal issues with the game. In the interest of fairness and full disclosure, let me put the latter right out there first.

Put simply, the writing of Wasteland just isn’t to my taste. I get the tone that St. Andre and Stackpole are trying to achieve: one of over-the-top comic ultra-violence, like such contemporary teenage-boy cinematic favorites as the Evil Dead films. And they do a pretty good job of hitting that mark. Your characters don’t just hit their enemies in Wasteland, they “brutalize” them. When they die, enemies “explode like a blood sausage,” are “reduced to a thin red paste,” are “spun into a dance of death,” or are “reduced to ground round.” And then there’s some of the imagery, like the blood-splattered doctor in the infirmary.

The personal appeal you find in those quotes and that image, some of the most beloved among Wasteland‘s loyal fandom, says much about whether you’ll enjoy Wasteland as a whole. In his video review of the game, Matt Barton says that “you will be disgusted or find it hilarious.” Well, I must say that my own feelings rather contradict that dichotomy. I can’t quite manage to feel disgusted or outraged at this kind of stuff, especially since, in blessed contrast to so many later games, it’s almost all described rather than illustrated. I do, however, find the entire aesthetic unfunny and boring, whether it’s found in Wasteland or Duke Nukem. In general, I just don’t find humor that’s based on transgression rather than wit to be all that humorous.

I am me, you are you, and mileages certainly vary. Still, even if we take the writing on its own terms, it seems to me that there are other problems with it. As CRPG Addict Chester Bolingbroke has noted, Wasteland can't be much bothered with consistency or coherence. The nuclear apocalypse that led to the situation your characters find themselves in is described as having taken place in 1998, only ten years on from the date of Wasteland's release. Yet when the writers find it convenient, they litter the game with absurdly advanced technology, from human clones to telepathic mind links. And the tone of the writing veers about as well, perhaps as a result of the sheer number of designers who contributed to the game. Most of the time Wasteland is content with the comic ultra-violence of The Evil Dead, but occasionally it suddenly reaches toward a jarring epic profundity it hasn't earned. The main storyline, which doesn't kick in in earnest until about halfway through the game, is so silly and nonsensical that few of even the most hardcore Wasteland fans remember much about it, no matter how many times they've played through it.

Wasteland‘s ropey plotting may be ironic in light of Stackpole’s later career as a novelist, but it isn’t a fatal flaw in itself. Games are not the sum of their stories; many a great game has a poor or nonexistent story to tell. To whatever extent it’s a triumph, Wasteland must be a triumph of game design rather than writing, one last hurrah for Michael Stackpole the designer before Michael Stackpole the novelist took over. The story, like the stories in many or most allegedly story-driven games, is just an excuse to explore Wasteland‘s possibility space.

And that possibility space is a very impressive one, for reasons I’ve tried to explain already. Yet it’s also undone, at least a bit, by some practical implementation issues. St. Andre and Stackpole’s determination to make an elegant game design rather than an elegant program comes back to bite them here. The things going on behind the scenes in Wasteland are often kind of miraculous in the context of their time, but those things are hidden behind a clunky and inelegant interface. In my book, a truly great game should feel almost effortless to control, but Wasteland feels anything but. Virtually every task requires multiple keystrokes and the navigation of a labyrinth of menus. It’s a far cry from even the old-school simplicity of Ultima‘s alphabet soup of single-keystroke commands, much less the intuitive ease of Dungeon Master‘s mouse-driven interface.

Some of Wasteland‘s more pernicious playability issues perhaps stem from an overly literal translation of the tabletop experience to the computer. What is magnificently simple in the Mercenaries, Spies, and Private Eyes tabletop system becomes clunky and frustrating on the computer. As you explore the maps, you're expected to guess where a skill and/or attribute might be of use, then to try manually invoking it. If you're not constantly thinking on this level, and always aware of just what skills every member of your party has that might apply, it's very easy to miss things. For example, the very first map you're likely to visit contains a mysterious machine. You're expected to not just dismiss that as scenery, or to assume it's something you'll learn more about later, but rather to use someone's Intelligence to learn that it's a water purifier you might be able to fix. Meanwhile other squares on other maps contain similar descriptions that are just scenery. In a tabletop game, where there is a constant active repartee between referee and players, where everything in the world can be fully "implemented" thanks to the referee's imagination, and where every player controls just one character whom she knows intimately instead of a whole party of four, the Mercenaries, Spies, and Private Eyes system works a treat. In Wasteland, it can feel like a tedious, mechanistic process of trial and error.

Other parts of Wasteland are equally heroic but arguably misguided attempts to drag things that are simple and intuitive on the tabletop but extremely difficult on the computer into the digital realm at all costs, full speed ahead and damn the torpedoes. There is, for instance, a convoluted and confusing process for splitting your party into separate groups that can be on entirely separate maps at the same time. It’s impressive in its way, and gives Wasteland claim to yet another first in CRPG history to boot, but one has to question whether the time and effort put into it might have been better spent making a cleaner, more playable computer game. Ditto the parser-based conversation engine that occasionally pops up. An obvious attempt to bring to the computer the sort of free-form conversations that are possible with a human referee, in practice it’s just a tedious game of guess-the-word that makes it far too easy to miss stuff. While I applaud the effort St. Andre and Stackpole and their colleagues at Interplay made to bring more complexity to the CRPG, the fact remains that computer games are not tabletop games, and vice versa.

And then there’s the combat. The Bard’s Tale is still lurking down at the foundation of Wasteland’s combat engine, but Interplay did take some steps to make it more interesting. Unlike in The Bard’s Tale, the positions of your party and their enemies are tracked on a graphical map during combat. In addition to the old Bard’s Tale menu of actions — “attack,” “defend,” etc. — you can move around to find cover, or for that matter charge up to some baddies and stave their heads in with your crowbars in lieu of guns.

Yet somehow combat still isn’t much fun. This groundbreaking and much-beloved post-apocalyptic CRPG also serves as an ironic argument for why the vast majority of CRPG designers and players still favor fantasy settings. Something important, maybe even essential, feels lost without the ability to cast spells. Not only do you lose the thrill of seeing a magic-using character level up and trying out a new slate of spells, but you also lose the strategic dimension of managing your mana reserves, a huge part of the challenge of the likes of Wizardry and The Bard’s Tale. In theory, acquiring ever more powerful guns and managing your ammunition stores in Wasteland ought to take the place of spells and the mana reserves needed to cast them, but in practice it doesn’t quite work out like that. New guns just aren’t as interesting as new spells, especially considering that there really aren’t all that many of the former to be found in Wasteland. You’re never very far from a store selling bullets, and you can carry so many with you anyway that ammunition is almost a moot point.

Most of all, there’s just too much fighting. One place where St. Andre and Stackpole regrettably didn’t depart from CRPG tradition was in their fondness for the wandering monster. Much of Wasteland is a dull slog through endless low-stakes battles with “leather jerks” and “ozoners,” an experience sadly divorced from the game’s more interesting and innovative aspects but one that ends up being at least as time-consuming.

For all these reasons, then, I’m a bit less high on Wasteland than many others are. It strikes me as more a historical landmark than a timeless classic, more interesting than playable. There’s of course no shame in that. We need games that push the envelope, and that’s something that Wasteland most assuredly did. The immense nostalgic regard in which it’s still held today says much about how amazing its innovations really were back in 1988.

As the gap between that year of Wasteland’s release and Fargo, Pavlish, and Stackpole’s December 1985 meeting will attest, this was a game that was in development for an insanely long time by the standards of the 1980s. And as you have probably guessed, it was never intended to take anything like that long. Interplay first talked publicly about the Wasteland project as early as the Summer Consumer Electronics Show in June of 1986, giving the impression it might be available as early as that Christmas. Instead it took fully two more years.

Thanks to Wasteland’s long gestation, 1987 proved a very quiet year for the usually prolific Interplay. While ports of older titles continued to appear, the company released not a single original new game that year. The Bard’s Tale III, turned over to Bill Heineman following Michael Cranford’s decision to return to university, went into development early in 1987, but, like Wasteland’s, its gestation would stretch well into 1988. (Stackpole, who was apparently starting to like this computer-game development stuff, wrote the storyline and the text for The Bard’s Tale III to accompany Heineman’s design.) Thankfully, the first two Bard’s Tale games were continuing to sell very well, making Interplay’s momentary lack of productivity less of a problem than it might otherwise have been.

Shortly before Wasteland’s belated release, St. Andre, Stackpole, and Pavlish, along with a grab bag of the others who had worked with them, headed out to the Sonoran Desert for a photo shoot. Everyone scoured the backs of their closets and the local leather shops for suitably odd costumes, and a professional makeup team was recruited to help turn them all into warriors straight out of Mad Max. Bill Heineman, an avid gun collector, provided much of the weaponry they carried. The final picture, featured on the inside cover of Wasteland’s package, has since become far more iconic than the art that appeared on its front, a fitting tribute to this unique team and their unique vision.

Some of the Wasteland team. From left: Ken St. Andre, Michael Stackpole, Bill Dugan, Nishan Hossepian, Chris Christensen, Alan Pavlish, Bruce Schlickbernd.

Both Wasteland and The Bard’s Tale III were finished almost simultaneously after many months of separate labor. When Fargo informed Electronic Arts of the good news, they insisted on shipping the two overdue games within two months of each other — May of 1988 in the case of Wasteland, July in that of The Bard’s Tale III — over his strident objections. He had good grounds for concern: these two big new CRPGs were bound to appeal largely to the same group of players, and could hardly help but cannibalize one another’s sales. To Interplay, this small company that had gone so long without any new product at all, the decision felt not just unwise but downright dangerous to their future.

Fargo had been growing increasingly unhappy with Electronic Arts, feeling Interplay just wasn’t earning enough from their development contracts for the hit games they had made for their publisher. Now this move was the last straw. Wasteland and The Bard’s Tale III would be the last games Interplay would publish through Electronic Arts, as Fargo decided to carry out an idea he’d been mulling over for some time: to turn Interplay into a full-fledged publisher as well as developer, with their own name — and only their own name — on their game boxes.

Following a pattern that was already all too typical, The Bard’s Tale III — the more traditional game, the less innovative, and the sequel — became by far the better seller of the pair. Wasteland didn’t flop, but it didn’t become an out-and-out hit either. Doubtless for this reason, neither Interplay nor Electronic Arts was willing to invest in the extensive porting to other platforms that marked the Bard’s Tale games. After the original Apple II and Commodore 64 releases, the only Wasteland port was an MS-DOS version that appeared nine months later, in March of 1989. Programmed by Interplay’s Michael Quarles, it sports modestly improved graphics and an interface that makes halfhearted use of a mouse. While most original players of Wasteland knew it in its 8-bit incarnations, it’s this version that almost everyone who has played it in the years since knows, and for good reason: it’s a far less painful experience than the vintage 8-bit routine of juggling disks and waiting, waiting, waiting for all of those painstakingly detailed maps to load and save.

Wasteland’s place in history, and in the mind of Brian Fargo, would always loom larger than its sales figures might suggest. Unfortunately, his ability to build on its legacy was immediately hampered by the split with Electronic Arts: the terms of the two companies’ contract signed all rights to the Wasteland name as well as The Bard’s Tale over to Interplay’s publisher. Thus both series, one potential and one very much ongoing, were abruptly stopped in their tracks. Electronic Arts toyed with making a Bard’s Tale IV on their own from time to time without ever seeing the idea all the way through. Oddly, given the relative sales numbers, Electronic Arts did bring a sequel of sorts to Wasteland to fruition, although they didn’t go so far as to dare to put the Wasteland name on the box. Given the contents of said box, it’s not hard to guess why. Fountain of Dreams (1990) uses Michael Quarles’s MS-DOS Wasteland engine, but it’s a far less audacious affair. Slipped out with little fanfare — Electronic Arts could spot a turkey as well as anyone — it garnered poor reviews, sold poorly, and is unloved and largely forgotten today.

In the absence of rights to the Wasteland name, Fargo initially planned to leverage his development team and the tools and game engine they had spent so long creating to make more games in other settings that would play much like Wasteland but wouldn’t be actual sequels. The first of these was to have been called Meantime, and was to have been written and designed by Stackpole with the help of many of the usual Wasteland suspects. Its premise was at least as intriguing as Wasteland’s: a game of time travel in which you’d get to meet (and sometimes battle) historical figures from Cyrano de Bergerac to P.T. Barnum, Albert Einstein to Amelia Earhart. At the Winter CES in January of 1989, Fargo said that Meantime would be out that summer: “I am personally testing the maps right now.” But it never appeared, thanks to a lot of design questions that were never quite solved and, most of all, thanks to the relentless march of technology. All of the Wasteland development tools ran on the Apple II and Commodore 64, platforms whose sales finally collapsed in 1989. Interplay tinkered for several years with moving the tool chain to MS-DOS, but the project finally expired from neglect. There just always seemed to be something more pressing to do.

Somewhat surprisingly given the enthusiasm with which they’d worked on Wasteland, neither St. Andre nor Stackpole remained for very long in the field of computer-game design. St. Andre returned to his librarian gig and his occasional sideline as a tabletop-RPG designer, not working on another computer game until recruited for Brian Fargo’s Wasteland 2 project many years later. Stackpole continued to take work from Interplay for the next few years, on Meantime and other projects, often working with his old Flying Buffalo and Wasteland colleague Liz Danforth. But his name too gradually disappeared from game credits in direct proportion to its appearance on the covers of more and more franchise novels. (His first such book, set in the universe of FASA’s BattleTech game, was published almost simultaneously with Wasteland and The Bard’s Tale III.)

Fargo himself never forgot the game that had always been first and foremost his own passion project. He would eventually revive it, first via the “spiritual sequels” Fallout (1997) and Fallout 2 (1998), then with the belated Kickstarter-funded sequel-in-name-as-well-as-spirit Wasteland 2 (2014).

But those are stories for much later times. Wasteland was destined to stand alone for many years. And yet it wouldn’t be the only lesson 1988 brought in the perils and possibilities of bringing tabletop rules to the computer. Another, much higher-profile tabletop adaptation, the result of a blockbuster licensing deal given to the most unexpected of developers, was still to come before the year was out. Next time we’ll begin to trace the story behind this third and final landmark CRPG of 1988, the biggest selling of the whole lot.

(Sources: PC Player of August 1989; Questbusters of July 1986, March 1988, April 1988, May 1988, July 1988, August 1988, October 1988, November 1988, January 1989, March 1989. On YouTube, Rebecca Heineman and Jennell Jaquays at the 2013 Portland Retro Gaming Expo; Matt Barton’s interview with Brian Fargo; Brian Fargo at Unity 2012. Other online sources include a Michael Stackpole article on RockPaperShotgun; Matt Barton’s interview with Rebecca Heineman on Gamasutra; GTW64’s page on Meantime.

Wasteland is available for purchase from GOG.com.)

Footnotes

1. Bill Heineman now lives as Rebecca Heineman. As per my usual editorial policy on these matters, I refer to her as “he” and by her original name only to avoid historical anachronisms and to stay true to the context of the times.
2. Paul Jaquays now lives as Jennell Jaquays.
3. Interestingly, Stackpole did have one connection to Interplay, through Bard’s Tale designer Michael Cranford. Cranford sent Flying Buffalo a Tunnels & Trolls solo adventure of his own devising around 1983. Stackpole thought it showed promise, but that it wasn’t quite there yet, so he sent it back with some suggestions for improvement and a promise to look at it again if Cranford followed through on them. But Stackpole never heard another word from him; presumably it was right about this time that Cranford got busy making The Bard’s Tale.
4. In what must be a record for footnotes of this type, I also have to note that Dan Bunten later became Danielle Bunten Berry, and lived until her death in 1998 under that name.