Opening the Gold Box, Part 5: All That Glitters is Not Gold

SSI entered 1989 a transformed company. What had been a niche maker of war games for grognards had now become one of the computer-game industry’s major players thanks to the first fruits of the coveted TSR Dungeons & Dragons license. Pool of Radiance, the first full-fledged Dungeons & Dragons CRPG and the first in a so-called “Gold Box” line of same, was comfortably outselling the likes of Ultima V and The Bard’s Tale III, and was well on its way to becoming SSI’s best-selling game ever by a factor of four. To accommodate their growing employee rolls, SSI moved in 1989 from their old offices in Mountain View, California, which had gotten so crowded that some people were forced to work in the warehouse using piles of boxed games for desks, to much larger, fancier digs in nearby Sunnyvale. Otherwise it seemed that all they had to do was keep on keeping on, keep on riding Dungeons & Dragons for all it was worth — and, yes, maybe release a war game here and there as well, just for old times’ sake.

One thing that did become more clear than ever over the course of the year, however, was that not all Dungeons & Dragons products were created equal. Dungeon Masters Assistant Volume II: Characters & Treasures sold just 13,516 copies, leading to the quiet ending of the line of computerized aids for the tabletop game that had been one of the three major pillars of SSI’s original plans for Dungeons & Dragons. A deviation from that old master plan called War of the Lance, an attempt to apply SSI’s experience with war games to TSR’s Dragonlance campaign setting, did almost as poorly, selling 15,255 copies. Meanwhile the “Silver Box” line of action-oriented games — the second of the pillars — continued to perform well: its second entry, Dragons of Flame, sold 55,711 copies. Despite that success, though, 1989 would also mark the end of the line for the Silver Box, due to a breakdown in relations with the British developers behind those games. Going into the 1990s, then, Dungeons & Dragons on the computer would be all about the Gold Box line of turn-based traditional CRPGs, the only one of SSI’s three pillars still standing.

Thankfully, what Pool of Radiance had demonstrated in 1988 the events of 1989 would only confirm. What players seemed to hunger for most of all in the context of Dungeons & Dragons on the computer was literally Dungeons & Dragons on the computer: big CRPGs that implemented as many of the gnarly details of the rules as possible. Even Hillsfar, a superfluous and rather pointless sort of training ground for characters created in Pool of Radiance, sold 78,418 copies when SSI released it in March as a stopgap to give the hardcore something to do while they waited for the real Pool sequel.

Every female warrior knows that cleavage is more important than protection, right?

They didn’t have too long to wait. The big sequel dropped in June in the form of Curse of the Azure Bonds, and it mostly maintained the high design standard set by Pool of Radiance. Contrarians could and did complain that the free-roaming wilderness map of its predecessor had been replaced by a simple menu of locations to visit, but for this player anyway Pool‘s overland map always felt more confusing than necessary. A more notable loss in my view is the lack of any equivalent in Curse to the satisfying experience of slowly reclaiming the village of Phlan block by block from the forces of evil in Pool, but that brilliant design stroke was perhaps always doomed to be a one-off. Ditto Pool‘s unique system of quests to fulfill, some of them having little or nothing to do with the main plot.

What players did get in Curse of the Azure Bonds was the chance to explore a much wider area around Phlan with the same characters they had used last time, fighting a selection of more powerful and interesting monsters appropriate to their party’s burgeoning skills. At the beginning of the game, the party wakes up with a set of tattoos on their bodies — the “azure bonds” of the title — and no memory of how they got there. (I would venture to guess that many of us have experienced something similar at one time or another…) It turns out that the bonds can be used to force the characters to act against their own will. Thus the quest is on to get them removed; each of the bonds has a different source, corresponding to a different area you will need to visit and hack and slash your way through in order to be rid of it. By the end of Curse, your old Pool characters — or the new ones you created just for this game, who start at level 5 — will likely be in the neighborhood of levels 10 to 12, just about the point in Dungeons & Dragons where leveling up begins to lose much of its interest.

TSR was once again heavily involved in the making of Curse of the Azure Bonds, if not quite to the same extent as Pool of Radiance. As they had for Pool, they provided for Curse an official tie-in novel and tabletop adventure module. I can’t claim to have understood all of the nuances of the plot, such as they are, when I played the game; a paragraph book is once again used, but much of what I was told to read consisted of people I either couldn’t remember or had never known babbling on about things I likewise couldn’t remember or had never heard of. But then, I know nothing about the Forgotten Realms setting other than what I learned in Pool of Radiance and never read the novel, so I’m obviously not the ideal audience. (Believe me, readers, I’ve done some painful things for this blog, but reading a Dungeons & Dragons novel was just a bridge too far…) Still, my cluelessness never interfered with my pleasure in mapping out each area and bashing things with my steadily improving characters; the standard of design in Curse remains as high as the writing remains breathlessly, entertainingly overwrought. Curse of the Azure Bonds did almost as well as its predecessor for SSI, selling 179,795 copies and mostly garnering the good reviews it deserved.

It was only with the third game of the Pool of Radiance series, 1990’s Secret of the Silver Blades, that some of the luster began to rub off of the Gold Box in terms of design, if not quite yet in that ultimate metric of sales. The reasons that Secret is regarded as such a disappointment by so many players — it remains to this day perhaps the least liked of the entire Gold Box line — are worth dwelling on for a moment.

One of the third game’s problems is bound up inextricably with the Dungeons & Dragons rules themselves. Secret of the Silver Blades allows you to take your old party from Pool of Radiance and/or Curse of the Azure Bonds up to level 15, but by this stage gaining a level is vastly less interesting than it was back in the day. Mostly you just get a couple of hit points, some behind-the-scenes improvements in to-hit scores, and perhaps another spell slot or two somewhere. Suffice to say that there’s no equivalent to, say, that glorious moment when you first gain access to the Fireball spell in Pool of Radiance.

The tabletop rules suggest that characters who reach such high levels should cease to concern themselves with dungeon delving in lieu of building castles and becoming generals or political leaders. Scorpia, Computer Gaming World‘s adventure and CRPG columnist, was already echoing these sentiments in the context of the Pool of Radiance series at the conclusion of her article on Curse of the Azure Bonds: “Characters have reached (by game’s end) fairly high levels, where huge amounts of experience are necessary to advance. If character transfer is to remain a part of the series (which I certainly hope it does), then emphasis needs to be placed on role-playing, rather than a lot of fighting. The true heart of AD&D is not rolling the dice, but the relationship between the characters and their world.” But this sort of thing, of course, the Gold Box engine was utterly unequipped to handle. In light of this, SSI probably should have left well enough alone, making Curse the end of the line for the Pool characters, but players were strongly attached to the parties they’d built up and SSI for obvious reasons wanted to keep them happy. In fact, they would keep them happy to the tune of releasing not just one but two more games which allowed players to use their original Pool of Radiance parties. By the time these characters finally did reach the end of the line, SSI would have to set them against the gods themselves in order to provide any semblance of challenge.

But by no means can all of the problems with Secret of the Silver Blades be blamed on high-level characters. The game’s other issues provide an interesting example of the unanticipated effects which technical affordances can have on game design, as well as a snapshot of changing cultures within both SSI and TSR.

A Gold Box map is built on a grid of exactly 16 by 16 squares, some of which can be “special” squares. When the player’s party enters one of the latter, a script runs to make something unusual happen — from something as simple as some flavor text appearing on the screen to something as complicated as an encounter with a major non-player character. The amount of special content allowed on any given map is restricted, however, by a limitation, stemming from the tiny memories of 8-bit machines like the Commodore 64 and Apple II, on the total size of all of the scripts associated with any given map.
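For the technically curious, the general scheme can be sketched in a few lines of present-day Python. This is purely illustrative — it is not SSI’s actual data format, and the class name, field names, and script-space budget are all invented for the example — but it captures the shape of the constraint: a fixed 16 by 16 grid, a handful of “special” squares with scripts attached, and a hard ceiling on how much scripting a single map can hold.

```python
# A conceptual sketch of a Gold Box map as described above. Not SSI's real
# format: the names and the script-space budget here are invented.

MAP_SIZE = 16            # every map is a fixed 16 x 16 grid of squares
SCRIPT_BUDGET = 2048     # hypothetical per-map ceiling on total script size, in bytes


def run_script(script, party):
    """Stand-in for the engine's script interpreter; here we just show the text."""
    print(script)


class GoldBoxMap:
    def __init__(self, name):
        self.name = name
        # terrain/wall data for each of the 256 squares (0 = open floor)
        self.squares = [[0] * MAP_SIZE for _ in range(MAP_SIZE)]
        # "special" squares: (x, y) -> the script that fires when the party enters
        self.specials = {}

    def add_special(self, x, y, script):
        """Attach a script to a square, but only if the map's script budget allows it."""
        used = sum(len(s) for s in self.specials.values())
        if used + len(script) > SCRIPT_BUDGET:
            raise ValueError("no script space left on this map")
        self.specials[(x, y)] = script

    def on_enter(self, x, y, party):
        """Called whenever the party steps onto (x, y)."""
        if (x, y) in self.specials:
            run_script(self.specials[(x, y)], party)


# Usage: a bit of flavor text attached to one square of an otherwise ordinary map.
slums = GoldBoxMap("slums")
slums.add_special(3, 12, "A rat the size of a dog scurries across your path.")
slums.on_enter(3, 12, party=None)
```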

One of the neat 16 by 16 maps found in Pool of Radiance and Curse of the Azure Bonds.

The need for each map to be no larger than 16 by 16 squares couldn’t help but have a major effect on the designs that were implemented with the Gold Box engine. In Pool of Radiance, for example, the division of the city of Phlan into a set of neat sections, to be cleared out and reclaimed one by one, had its origins as much in these technical restrictions as it did in design methodology. In that case it had worked out fantastically well, but by the time development began on Secret of the Silver Blades all those predictably uniform square maps had begun to grate on Dave Shelley, that game’s lead designer. Shelley and his programmers thus came up with a clever way to escape the system of 16 by 16 dungeons.

One of the things a script could do was to silently teleport the player’s party to another square on the map. Shelley and company realized that by making clever use of this capability they could create dungeon levels that gave the illusion of sprawling out wildly and asymmetrically, like real underground caverns would. Players who came into Secret of the Silver Blades expecting the same old 16 by 16 grids would be surprised and challenged; they could only assume that the Gold Box engine had gotten a major upgrade. From the point of view of SSI, this was the best kind of technology refresh: one that cost them nothing at all. Shelley sketched out a couple of enormous underground complexes for the player to explore, each almost an order of magnitude larger than anything that had been seen in a Gold Box game before.
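Again for the technically curious, the trick might be sketched like so — an illustration of the idea only, not the engine’s real implementation, with the map names and coordinates invented. Two perfectly ordinary 16 by 16 maps are stitched together by silent teleports along their edges, so that to the player they read as one sprawling cavern.

```python
# A conceptual sketch of the silent-teleport trick described above; again an
# illustration of the idea, not the engine's real implementation.

# Each "special" here is simply a silent teleport:
# (map, x, y) -> (destination map, destination x, destination y)
teleports = {
    ("cavern_west", 15, 7): ("cavern_east", 1, 7),   # east edge of one map...
    ("cavern_east", 0, 7):  ("cavern_west", 14, 7),  # ...west edge of the other
}


def step(map_name, x, y, dx, dy):
    """Move the party one square, silently following any teleport on the new square."""
    x, y = x + dx, y + dy
    if (map_name, x, y) in teleports:
        map_name, x, y = teleports[(map_name, x, y)]   # the player never sees this happen
    return map_name, x, y


# Walking east from near the edge of cavern_west lands the party, invisibly,
# on cavern_east, as if the two grids were one continuous space.
print(step("cavern_west", 14, 7, 1, 0))   # -> ('cavern_east', 1, 7)
```

Note that every one of those stitches is itself a script eating into the map’s limited scripting space — a cost that will become important in a moment.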

A far less neat map from Secret of the Silver Blades. It may be more realistic in its way, but which would you rather try to draw on graph paper? It may help you to understand the scale of this map to know that the large empty squares at the bottom and right side of this map each represent a conventional 16 by 16 area like the one shown above.

But as soon as the team began to implement the scheme, the unintended consequences began to ripple outward. Because the huge maps were now represented internally as a labyrinth of teleports, the hugely useful auto-map had to be disabled for these sections. And never had the auto-map been needed more, for the player who dutifully mapped the dungeons on graph paper could no longer count on them being a certain size; they were constantly spilling off the page, forcing her to either start over or go to work on a fresh page stuck onto the old with a piece of tape. Worst of all, placing all of those teleports everywhere used just about all of the scripting space that would normally be devoted to providing other sorts of special squares. So, what players ended up with was an enormous but mind-numbingly boring set of homogeneous caverns filled with the same handful of dull random-monster encounters, coming up over and over and over. This was not, needless to say, an improvement on what had come before. In fact, it was downright excruciating.

At the same time that this clever technical trick was pushing the game toward a terminal dullness, other factors were trending in the same direction. Shelley himself has noted that certain voices within SSI were questioning whether all of those little extras found in Pool of Radiance and Curse of the Azure Bonds, like the paragraph books and the many scripted special encounters, were really necessary at all — or, at the least, perhaps it wasn’t necessary to do them with quite so much loving care. SSI was onto a good thing with these Gold Box games, said these voices — found mainly in the marketing department — and they ought to strike while the iron was hot, cranking them out as quickly as possible. While neither side would entirely have their way on the issue, the pressure to just make the games good enough rather than great in order to get them out there faster can be sensed in every Gold Box game after the first two. More and more graphics were recycled; fewer and fewer of those extra, special touches showed up. SSI never fully matched Pool of Radiance, much less improved on it, over the course of the ten Gold Box games that followed it. That SSI’s founder and president Joel Billings, as hardcore a gamer as any gaming executive ever, allowed this stagnation to take root is unfortunate, but isn’t difficult to explain. His passion was for the war games he’d originally founded SSI to make; all this Dungeons & Dragons stuff, while a cash cow to die for, was largely just product to him.

A similar complaint could be levied — and has been levied, loudly and repeatedly, by legions of hardcore Dungeons & Dragons fans over the course of decades — against Lorraine Williams, the wealthy heiress who had instituted a coup against Gary Gygax in 1985 to take over TSR. The idea that TSR’s long, slow decline and eventual downfall is due solely to Williams is more than a little dubious, given that Gygax and his cronies had already done so much to mismanage the company down that path before she ever showed up. Still, her list of wise strategic choices, at least after her very wise early decision to finally put Dungeons & Dragons on computers, is not a long one.

At the time they were signing the contract with SSI, TSR had just embarked on the most daunting project in the history of the company: a project to reorganize the Advanced Dungeons & Dragons rules, which had sprawled into eight confusing and sometimes contradictory hardcover books by that point, into a trio of books of relatively streamlined and logically organized information, all of it completely rewritten in straightforward modern English (as opposed to the musty diction of Gary Gygax, which read a bit like a cross of Samuel Johnson with H.P. Lovecraft). The fruits of the project appeared in 1989 in the form of a second-edition Player’s Handbook, Dungeon Master’s Guide, and Monstrous Compendium.

And then, right after expending so much effort to clean things up, TSR proceeded to muddy the second-edition waters even more indiscriminately than they had those of the first edition. Every single character class got its own book, and players with a hankering to play Dungeons & Dragons as a Viking or one of Charlemagne’s paladins were catered to. Indeed, TSR went crazy with campaign settings. By 1993, boxed sets were available to let you play in the Forgotten Realms, in the World of Greyhawk, or in Dragonlance‘s world of Krynn, or to play the game as a Jules Verne-esque science-fiction/fantasy hybrid called Spelljammer. You could also play Dungeons & Dragons as Gothic horror if you bought the Ravenloft set, as vaguely post-apocalyptic dark fantasy if you bought Dark Sun, as a set of tales from the Arabian Nights if you bought Al-Qadim, or as an exercise in surreal Expressionism worthy of Alfred Kubin if you bought Planescape.

Whatever the artistic merits behind all these disparate approaches — and some of them did, it should be said, have much to recommend them over the generic cookie-cutter fantasy that was vanilla Dungeons & Dragons — the commercial pressures that led Lorraine Williams to approve this glut of product aren’t hard to discern. The base of tabletop Dungeons & Dragons players hadn’t grown appreciably for many years. Just the opposite, in fact: it’s doubtful whether even half as many people were actively playing Dungeons & Dragons in 1990 as at the height of the brief-lived fad for the game circa 1982. After the existing player base had dutifully rushed out to buy the new second-edition core books, in other words, very few new players were discovering the game and thus continuing to drive their sales. Unless and until they could find a way to change that situation, the only way for TSR to survive was to keep generating gobs of new product to sell to their existing players. Luckily for them, hardcore Dungeons & Dragons players were tremendously loyal and tremendously dedicated to their hobby. Many would buy virtually everything TSR put out, even things that were highly unlikely ever to make it to their gaming tables, just out of curiosity and to keep up with the state of the art, as it were. It would take two or three years for players to start to evince some fatigue with the sheer volume of product pouring out of TSR’s Lake Geneva offices, much of it sorely lacking in play-testing and basic quality control, and to start giving large swathes of it a miss — and that, in turn, would spell major danger for TSR’s bottom line.

Lorraine Williams wasn’t unaware of the trap TSR’s static customer base represented; on the contrary, she recognized as plainly as anyone that TSR needed to expand into new markets if it was to have a bright long-term future. She made various efforts in that direction even as her company sustained itself by flooding the hardcore Dungeons & Dragons market. In fact, the SSI computer games might be described as one of these efforts — but even those, successful as they were on their own terms, were still playing at least partially to that same old captive market. In 1989, Williams opened a new TSR office on the West Coast in an attempt to break the company out of its nerdy ghetto. Run by Flint Dille, Williams’s brother, TSR West had as one of its primary goals getting Dungeons & Dragons onto television screens or, better yet, onto movie screens. Williams was ironically pursuing the same chimera that her predecessor Gary Gygax — now her sworn, lifetime arch-enemy — had so zealously chased. She was even less successful at it than he had been. Whereas Gygax had managed to get a Saturday morning cartoon on the air for a few seasons, Flint Dille’s operation managed bupkis in three long years of trying.

Another possible ticket to the mainstream, to be pursued every bit as seriously in Hollywood as a Dungeons & Dragons deal, was Buck Rogers, the source of the shared fortune of Lorraine Williams and Flint Dille. Their grandfather had been John F. Dille, owner of a newspaper syndicator known as the National Newspaper Service. In this capacity, the elder Dille had discovered the character that would become Buck Rogers — at the time, he was known as Anthony Rogers — in Armageddon 2419 A.D., a pulp novella written by Philip Francis Nowlan and published in Amazing Stories in 1928. Dille himself had come up with the nickname of “Buck” for the lead character, and convinced Nowlan to turn his adventures in outer space into a comic strip for his syndicator. It ended up running from 1929 until 1967 — only the first ten of those years under the stewardship of Nowlan — and was also turned into very popular radio and movie serials during the 1930s, the height of the character’s popularity. Having managed to secure all of the rights to Buck from a perhaps rather naive Nowlan, John Dille and his family profited hugely.

In marked contrast to her attitude toward TSR’s other intellectual properties, Lorraine Williams’s determination to return Buck Rogers to the forefront of pop culture was apparently born as much from a genuine passion for her family’s greatest legacy as it was from the dispassionate calculus of business. In addition to asking TSR West to lobby — once again fruitlessly, as it would transpire — for a Buck Rogers revival on television or film, she pushed a new RPG through the pipeline, entitled Buck Rogers XXVc and published in 1990. TSR supported the game fairly lavishly for several years in an attempt to get it to take off, releasing source books, adventure modules, and tie-in novels to little avail. With all due deference to Buck Rogers’s role as a formative influence on Star Wars among other beloved contemporary properties, in the minds of the Dungeons & Dragons generation it was pure cheese, associated mainly with the Dille family’s last attempt to revive the character, the hilariously campy 1979 television series Buck Rogers in the 25th Century. The game might have had a chance with some players had Williams been willing to recognize the cheese factor and let her designers play it up, but taken with a straight face? No way.

SSI as well was convinced — or coerced — to adapt the Gold Box engine from fantasy to science fiction for a pair of Buck Rogers computer games, 1990’s Countdown to Doomsday and 1992’s Matrix Cubed. SSI’s designers must have breathed a sigh of relief when they saw that the rules for the Buck Rogers tabletop RPG, much more so than any of TSR’s previous non-Dungeons & Dragons RPGs, had been based heavily on those of the company’s flagship game; thus the process of adaptation wasn’t quite so onerous as it might otherwise have been. That said, most agree that the end results are markedly less interesting than the other Gold Box games when it comes to combat, the very thing at which the engine normally excels; a combat system designed to include magic becomes far less compelling in its absence. Benefiting doubtless from its association with the Dungeons & Dragons Gold Box line, for which enthusiasm remained fairly high, the first Buck Rogers game sold a relatively healthy 51,528 copies; the second managed a somewhat less healthy 38,086 copies.

All of these competing interests do much to explain why TSR, after involving themselves so closely in the development of Pool of Radiance and Curse of the Azure Bonds, withdrew from the process almost entirely after those games and just left SSI to it. And that fact in turn is yet one more important reason why the Gold Box games not only failed to evolve but actually devolved in many ways. TSR’s design staff might not have had a great understanding of computer technology, but they did understand their settings and rules, and had pushed SSI to try to inject at least a little bit of what made for a great tabletop-role-playing experience into the computer games. Absent that pressure, SSI was free to fall back on what they did best — which meant, true to their war-game roots, lots and lots of combat. In both Pool and Curse, random encounters cease on most maps after you’ve had a certain number of them — ideally, just before they get boring. Tellingly, in Secret of the Silver Blades and most of the other later Gold Box games that scheme is absent. The monsters just keep on coming, ad infinitum.

Despite lukewarm reviews that were now starting to voice some real irritation with the Gold Box line’s failure to advance, Secret of the Silver Blades was another huge hit, selling 167,214 copies. But, in an indication that some of those who purchased it were perhaps disappointed enough by the experience not to continue buying Gold Box games, it would be the last of the line to break the 100,000-copy barrier. The final game in the Pool of Radiance series, Pools of Darkness, sold just 52,793 copies upon its release in 1991.

In addition to the four-game Pool series, SSI also released an alternate trilogy of Dungeons & Dragons Gold Box games set in Krynn, the world of the Dragonlance setting. Champions of Krynn was actually released before Secret of the Silver Blades, in January of 1990, and sold 116,693 copies; Death Knights of Krynn was released in 1991 and sold 61,958 copies; and The Dark Queen of Krynn, the very last Gold Box game, was released in 1992 and sold 40,640 copies. Another modest series of two games was developed out-of-house by Beyond Software (later to be renamed Stormfront Studios): Gateway to the Savage Frontier (1991, 62,581 copies sold) and Treasures of the Savage Frontier (1992, 31,995 copies sold). In all, then, counting the two Buck Rogers games but not counting the oddball Hillsfar, SSI released eleven Gold Box games over a period of four years.

While Secret of the Silver Blades still stands as arguably the line’s absolute nadir in design terms, the sheer pace at which SSI pumped out Gold Box games during the latter two years of this period in particular couldn’t help but give all of them a certain generic, interchangeable quality. It all began to feel a bit rote — a bit cheap, in stark contrast to the rarefied atmosphere of a Big Event that had surrounded Pool of Radiance, a game which had been designed and marketed to be a landmark premium product and had in turn been widely perceived as exactly that. Not helping the line’s image was the ludicrous knockoff-Boris Vallejo cover art sported by so many of the boxes, complete with lots of tawny female skin and heaving bosoms. Susan Manley has described the odd and somewhat uncomfortable experience of being a female artist asked to draw this sort of stuff.

They pretty much wanted everybody [female] to be the chainmail-bikini babes, as we called them. I said, “Look, not everybody wants to be a chainmail-bikini babe.” They said, “All the guys want that, and we don’t have very many female players.” I said, “You’re never going to have female players if you continue like this. Functional armor that would actually protect people would play a little bit better.”

Tom [Wahl, SSI’s lead artist] and I actually argued over whether my chest size was average or not, which was an embarrassing conversation to have. He absolutely thought that everybody needed to look like they were stepping out of a Victoria’s Secret catalog if they were female. I said, “Gee, how come all the guys don’t have to be super-attractive?” They don’t look like they’re off of romance-novel covers, let’s put it that way. They get to be rugged, they get to be individual, they get to all have different costumes. They get to all have different hairstyles, but the women all had to have long, flowing locks and lots of cleavage.

By 1991, the Gold Box engine was beginning to seem rather like a relic from technology’s distant past. In a sense, the impression was literally correct. When SSI had begun to build the Gold Box engine back in 1987, the Commodore 64 had still ruled the roost of computer gaming, prompting SSI to make the fateful decision not only to make sure the Gold Box games could run on that sharply limited platform, but also to build most of their development tools on it. Pool of Radiance then appeared about five minutes before the Commodore 64’s popularity imploded in the face of Nintendo. The Gold Box engine did of course run on other platforms, but it remained throughout its life subject to limitations born of its 8-bit origins — things like the aforementioned maps of exactly 16 by 16 squares and the strict bounds on the amount of custom scripting that could be included on a single one of those maps. Even as the rest of the industry left the 8-bit machines behind in 1989 and 1990, SSI was reluctant to do so in that the Commodore 64 still made up a major chunk of Gold Box sales: Curse of the Azure Bonds sold 68,622 copies on the Commodore 64, representing more than a third of its total sales, while Secret of the Silver Blades still managed a relatively healthy 40,425 Commodore 64 versions sold. Such numbers likely came courtesy of diehard Commodore 64 owners who had very few other games to buy in an industry that was moving more and more to MS-DOS as its standard platform. SSI was thus trapped for some time in something of a Catch-22, wanting to continue to reap the rewards of being just about the last major American publisher to support the Commodore 64 but having to compromise the experience of users with more powerful machines in order to do so.

SSI had managed to improve the Gold Box graphics considerably by the time of The Dark Queen of Krynn, the last game in the line.

When SSI finally decided to abandon the Commodore 64 in 1991, they did what they could to enhance the Gold Box engine to take advantage of the capabilities of the newer machines, introducing more decorative displays and pictures drawn in 256-color VGA along with some mouse support. Yet the most fundamental limitations changed not at all; the engine was now aged enough that SSI wasn’t enthused about investing in a more comprehensive overhaul. And thus the Gold Box games seemed more anachronistic than ever. As SSI’s competitors worked on a new generation of CRPGs that took advantage of 32-bit processors and multi-megabyte memories, the Gold Box games remained the last surviving relics of the old days of 8 bits and 64 K. Looking at The Dark Queen of Krynn and the technical tour de force that was Origin’s Ultima VII side by side, it’s difficult to believe that the two games were released in the same year, much less that they were, theoretically at least, direct competitors.

It’s of course easy for us to look back today and say what SSI should have done. Instead of flooding the market with so many generic Gold Box games, they should have released just one game every year or eighteen months, each release reflecting a much more serious investment in writing and design as well as real, immediately noticeable technical improvements. They should, in other words, have strained to make every new Gold Box game an event like Pool of Radiance had been in its day. But this had never been SSI’s business model; they had always released lots of games, very few of which sold terribly well by the standard of the industry at large, but whose sales in the aggregate were enough to sustain them. When, beginning with Pool of Radiance, they suddenly were making hits by anybody’s standards, they had trouble adjusting their thinking to their post-Pool situation, had trouble recognizing that they could sell more units and make more money by making fewer but better games. Such is human nature; making such a paradigm shift would doubtless challenge any of us.

Luckily, just as the Gold Box sales began to tail off SSI found an alternative approach to Dungeons & Dragons on the computer from an unlikely source. Westwood Associates was a small Las Vegas-based development company, active since 1985, who had initially made their name doing ports of 8-bit titles to more advanced machines like the Commodore Amiga and Atari ST (among these projects had been ports of Epyx’s Winter Games, World Games, and California Games). What made Westwood unique and highly sought after among porters was their talent for improving their 8-bit source material enough, in terms of both audiovisuals and game play, that the end results would be accepted almost as native sons by the notoriously snobbish owners of machines like the Amiga. Their ambition was such that many publishers came to see the biggest liability of employing them as a tendency to go too far, to such an extent that their ports could verge on becoming new games entirely; for example, their conversion of Epyx’s Temple of Apshai on the Macintosh from turn-based to real-time play was rejected as being far too much of a departure.

Westwood first came to the attention of Gold Box fans when they were given the job of implementing Hillsfar, the stopgap “character training grounds” which SSI released between Pool of Radiance and Curse of the Azure Bonds. Far more auspicious were Westwood’s stellar ports of the mainline Gold Box games to the Amiga, which added mouse support and improved the graphics well before SSI’s own MS-DOS versions made the leap to VGA. But Brett Sperry and Louis Castle, Westwood’s founders, had always seen ports merely as a way of getting their foot in the door of the industry. Already by the time they began working with SSI, they were starting to do completely original games of their own for Electronic Arts and Mediagenic/Activision. (Their two games for the latter, both based on a board-game line called BattleTech, were released under the Infocom imprint, although the “real” Cambridge-based Infocom had nothing to do with them.) Westwood soon convinced SSI as well to let them make an original title alongside the implementation assignments: what must be the strangest of all the SSI Dungeons & Dragons computer games, a dragon flight simulator (!) called Dragon Strike. Released in 1990, it wasn’t quite an abject flop but neither was it a hit, selling 34,296 copies. With their next original game for SSI, however, Westwood would hit pay dirt.

Eye of the Beholder was conceived as Dungeons & Dragons meets Dungeon Master, bringing the real-time first-person game play of FTL’s seminal 1987 dungeon crawl to SSI’s product line. In a measure of just how ahead-of-its-time Dungeon Master had been in terms not only of technology but also of fundamental design, nothing had yet really managed to equal it over the three years since its release. Eye of the Beholder arguably didn’t fully manage that feat either, but it did at the very least come closer than most other efforts — and of course it had the huge advantage of the Dungeons & Dragons license. When a somewhat skeptical SSI sent an initial shipment of 20,000 copies into the distribution pipeline in February of 1991, “they all disappeared” in the words of Joel Billings: “We put them out and boom!, they were gone.” Eye of the Beholder went on to sell 129,234 copies, nicely removing some of the sting from the slow commercial decline of the Gold Box line and, indeed, finally giving SSI a major Dungeons & Dragons hit that wasn’t a Gold Box game. The inevitable sequel, released already in December of 1991, sold a more modest but still substantial 73,109 copies, and a third Eye of the Beholder, developed in-house this time at SSI, sold 50,664 copies in 1993. The end of the line for this branch of the computerized Dungeons & Dragons family came with the pointless Dungeon Hack, a game that, as its name implies, presented its player with an infinite number of generic randomly generated dungeons to hack her way through; it sold 27,110 copies following its release at the end of 1993.

This chart from the April 1991 Software Publishers Association newsletter shows just how quickly Eye of the Beholder took off. Unfortunately, this would mark the last time an SSI Dungeons & Dragons game would be in this position.

Despite their popularity in their heyday, the Eye of the Beholder games in my view have aged less gracefully than their great progenitor Dungeon Master, or for that matter even the early Gold Box games. If what you wished for more than anything when playing Dungeon Master was lots more — okay, any — story and lore to go along with the mapping, the combat, and the puzzles, these may be just the games for you. For the rest of us, though, the Dungeons & Dragons rules make for an awkward fit to real-time play, especially in contrast to Dungeon Master‘s designed-from-scratch-for-real-time systems of combat, magic, and character development. The dungeon designs and even the graphics similarly underwhelm; Eye of the Beholder looks a bit garish today in contrast to the clean minimalism of Dungeon Master. The world would have to wait more than another year, until the release of Ultima Underworld, to see a game that truly and comprehensively improved on the model of Dungeon Master. In the meantime, though, the Eye of the Beholder games would do as runners-up for folks who had played Dungeon Master and its sequel and still wanted more, or for those heavily invested in the Dungeons & Dragons rules and/or the Forgotten Realms setting.

For SSI, the sales of the Eye of the Beholder games in comparison to those of the latest Gold Box titles provided all too clear a picture of where the industry was trending. Players were growing tired of the Gold Box games; they hungered after faster-paced CRPGs that were prettier to look at and easier to control. While Eye of the Beholder was still high on the charts, TSR and SSI agreed to extend their original five-year contract, which was due to expire on January 1, 1993, by eighteen months to mid-1994. The short length of the extension may be indicative of growing doubts on the part of TSR about SSI’s ability to keep up with the competition in the CRPG market; one might see it as a way of putting them on notice that the TSR/SSI partnership was by no means set in stone for all time. At any rate, a key provision of the extension was that SSI must move beyond the fading Gold Box engine, must develop new technology to suit the changing times and to try to recapture those halcyon early days when Pool of Radiance ruled the charts and the world of gaming was abuzz with talk of Dungeons & Dragons on the computer. Accordingly, SSI put a bow on the Gold Box era in March of 1993 with the release of Unlimited Adventures, a re-packaging of their in-house development tools that would let diehard Gold Box fans make their own games to replace the ones SSI would no longer be releasing. It sold just 32,362 copies, but would go on to spawn a loyal community of adventure-makers that to some extent still persists to this day. As for what would come next for computerized Dungeons & Dragons… well, that’s a story for another day.

By way of wrapping up today’s story, I should note that my take on the Gold Box games, while I believe it dovetails relatively well with the consensus of the marketplace at the time, is by no means the only one in existence. A small but committed group of fans still loves these games — yes, all of them — for their approach to tactical combat, which must surely mark the most faithful implementation of the tabletop game’s rules for same ever to make it to the computer. “It’s hard to imagine a truly bad game being made with it,” says blogger Chester Bolingbroke — better known as the CRPG Addict — of the Gold Box engine. (Personally, I’d happily nominate Secret of the Silver Blades for that designation.)

Still, even the Gold Box line’s biggest fans will generally acknowledge that the catalog is very front-loaded in terms of innovation and design ambition. For those of you like me who aren’t CRPG addicts, I highly recommend Pool of Radiance and Curse of the Azure Bonds, which together let you advance the same party of characters just about as far as remains fun under the Dungeons & Dragons rules, showing off the engine at its best in the process. If the Gold Box games that came afterward wind up a bit of an anticlimactic muddle, we can at least still treasure those two genuine classics. And if you really do want more Gold Box after playing those two, Lord knows there’s plenty of it out there, enough to last most sane people a lifetime. Just don’t expect any of it to quite rise to the heights of the first games and you’ll be fine.

(Sources: This article is largely drawn from the collection of documents that Joel Billings donated to the Strong Museum of Play, which includes lots of internal SSI documents and some press clippings. Also, the book Designers & Dragons Volume 1 by Shannon Appelcline; Computer Gaming World of September 1989; Retro Gamer 52 and 89; Matt Barton’s video interviews with Joel Billings, Susan Manley, and Dave Shelley and Laura Bowen.

Many of the Gold Box games and the Eye of the Beholder trilogy are available for purchase from GOG.com. You may also wish to investigate The Gold Box Companion, which adds many modern conveniences to the original games.)

 
 


Generation Nintendo


In the final months of World War II, when the United States was trying to burn out the will of a starving Japan via the most sustained campaign of aerial incendiary bombardment in history, a handful of obvious targets remained strangely untouched. Among those targets was Kyoto: population 1 million plus, founded in the year 793, capital of the nation and home of the Emperor for most of the intervening centuries, home to more national shrines and other historic sites than any other city in Japan, world famous for its silk and cloisonné. If a single city can be said to embody the very soul of the Japanese people, it must be this one.

If the citizens of Kyoto believed that their city was being left untouched by the bombs raining down on the rest of the country out of respect for the special place it occupied in the Japanese psyche, they were partially correct. Yet the motivation behind their seeming good fortune was cold-blooded rather than humanitarian. American Air Force planners were indeed aware of Kyoto’s symbolic importance, but they hardly saw that importance as grounds for sparing the city. Far from it. Kyoto was being reserved as a target for a special new weapon, one which was referred to only obliquely in Air Force internal memoranda as “the gadget.” Today we know the gadget as the atomic bomb. Entirely destroying Kyoto with one bomb would deliver a shock to the rest of Japan unequaled by the destruction of any other possible target: “From the psychological point of view there is the advantage that Kyoto is an intellectual center for Japan and the people there are more apt to appreciate the significance of such a weapon as the gadget.” Kyoto must be left untouched while the gadget was made ready for service so that mission planners and scientists could properly evaluate the bomb’s effect on an undamaged clean slate of a target.

Hundreds of thousands of Kyoto residents would wind up owing their lives to Henry L. Stimson, a humane man tortured daily by the orders he had to issue as the American Secretary of War; never was there a Secretary of War who hated war more. In response to Stimson’s demand after the successful first test of the gadget in New Mexico, General Leslie Groves, head of the Manhattan Project, reluctantly presented the Air Force’s list of planned targets to him, with Kyoto at the top. Stimson was horrified. Citing the proposed destruction of Kyoto as an unforgivable act from which Japan would never recover, Stimson, 77 years old and in poor health, faced down virtually the entire entrenched bureaucracy of the American military to demand that the first atomic bomb to be used in anger be dropped somewhere, anywhere else: “This is one time I’m going to be the final deciding authority. Nobody’s going to tell me what to do on this.” His stubborn stance resulted at last in Kyoto being stricken from the list by grumbling generals who would have been perfectly happy if its destruction really had been a death blow to the culture it symbolized, thank you very much. Of course, in saving hundreds of thousands of Kyoto residents Stimson was also consigning to death hundreds of thousands of others in Hiroshima. Such are the wages of war.

The decision to spare Kyoto had another unintended consequence, one which may seem trivial — even disrespectful — to mention in parallel with such immense tolls in human lives saved and lost, but one which in its own way illustrates the interconnectedness of all things. Hidden away within Kyoto’s blissfully undamaged warren of ancient streets was a little family-owned company called Nintendo, maker of ornate playing cards and other games and collectibles. Absolutely dedicated to the war effort, as all good Japanese were expected to be at the time, they had lately taken to giving their products jingoist themes, such as a backgammon board illustrated by cartoon animals dressed up as soldiers, with Japanese flags flying proudly above them and British and American flags lying crumpled in the dust at their feet.

More than four decades later, Stimson’s determination to spare Kyoto and with it Nintendo boomeranged back on his country in a way that no one could have seen coming. Many contemporary commentators, conditioned by the Reagan Revolution to cast all things in terms of nationalism and patriotism, saw in the arrival of Nintendo on American shores the opening of the latest front in a new war, economic rather than military this time, between the United States and Japan. And this time it seemed that Japan was winning the war handily. They had come for our steel, and we had done nothing. They had come for our auto industry, and we had done nothing. They had come for our televisions and stereos, and we had done nothing. Now they were coming for our videogame consoles. How long would it be until the PC industry, arguably the biggest economic success story of the 1980s, was threatened as well?

Given the subject of this article, I should take a moment to clarify right now that this blog has not been and will never become a history of console-based videogames. This blog is rather a history of computer games, a culture possessed of plenty of interconnections and collisions with the larger, more mainstream culture of the consoles, but one which has nevertheless remained largely its own thing ever since the first popular videogame console and the first three pre-assembled PCs were all launched during the single fecund year of 1977. In addition to reasons of pure personal preference, I justify this focus by noting that a fair number of people are doing great, rigorous history in the realm of videogames, while the realm of computer games has been comparatively neglected.

Still, we can’t really understand the history of computer games without reckoning with those aforementioned interconnections and collisions with the world of the consoles. And one of the biggest and most obvious collisions of all was that crazy time at the tail end of the 1980s when Nintendo arrived to pull the rug out from under a computer-game industry which had spent the last few years convinced that it was destined to become the next great movement in mainstream American entertainment — i.e., destined to hold exactly the position that this Japanese upstart had just swept in and taken over with breathtaking speed. Small wonder that coded allusions to the dark days of World War II, accompanied by thinly veiled (or blatantly unveiled) racism, became the order of the day in many sectors of American culture, industry, and government alike. Meanwhile the bewildered computer-game executives were trying to figure out what the hell had just hit them and what they should do about it. Let’s join them now in asking the first of those questions.

Hiroshi Yamauchi

The history of the company known as Nintendo — the name can be very roughly translated as an admonition to work hard but also to accept that one’s ultimate success is in the hands of greater powers — dates all the way back to 1889, when it was founded by Fusajiro Yamauchi as a maker of intricately painted playing cards, known as “hanafuda” in Japanese. Nintendo managed to survive and grow modestly amid many changes in Japanese life over the course of the next half-century and beyond. The company’s modern history, however, begins in 1949, when Hiroshi Yamauchi, latest scion of the family-owned business, took over as president. Far more ambitious than his forebears, this latest Yamauchi was inspired by the entrepreneurial ferment of the rebuilding postwar Japan to expand Nintendo beyond playing cards and collectibles. The results of his efforts were decidedly mixed in the early years. Among his less successful initiatives were a line of instant-rice meals — a sort of ricey Ramen Noodles before Ramen Noodles were cool — and a chain of “love motels” offering busy executives the convenience of paying for their trysts by the hour. (Ironic as they might seem in light of Nintendo’s later rigorously enforced family-friendly image, at the time the love motels seemed to everyone around him a natural innovation for Yamauchi to have dreamed up; he was a notorious philanderer.) More successful, for a while, was a Nintendo taxi service. Yet even it was hardly a world-beater. Throughout the first two decades of Yamauchi’s lengthy reign he continued to cast restlessly about for the Big One, the idea that would finally take Nintendo to the next level.

In 1969, he made a big step in the direction of finding his company’s life’s purpose when he founded a new division called simply “Toys.” Employing a number of young gadget freaks as inventors, Toys began to churn out a series of strange contraptions straight out of Rube Goldberg, such as the Ultra Hand, a scissor-like reach extender that was more whimsical than practical; the Ultra Machine, an indoor mechanical baseball pitcher; and the Ultra Scope, a periscope for peeking around corners and over fences. (Parents were not terribly fond of this last one in particular.) All were quite successful, opening at last the sustainable new business front for Nintendo that Yamauchi had been dreaming of for so long.

With electronic components getting smaller and cheaper by the year, Nintendo’s innovative toys inevitably began to take on more and more of an electronic character as time wore on. The first big success in the realm of electronic gadgets was something called the Nintendo Beam Gun, which combined a light gun with a set of targets equipped with the appropriate photoelectric sensors; more than 1 million of them were sold. Nintendo built on the Beam Gun’s success with a chain of Laser Clay Ranges — think “clay pigeons” — that spread across Japan during the mid-1970s, re-purposed bowling alleys where patrons could engage in gunfights with cowboys and “homicidal maniacs” projected onto the far wall.

With Atari now going strong in the United States, videogames were a natural next step for Nintendo. They first made a series of Color TV Games, each a home videogame capable of playing a few variants of a single simple game when hooked up to the family television set; they sold at least 2.5 million of them in the late 1970s. The Nintendo Game & Watch, a whole line of handheld gadgets capable of playing a single game each, did even better; Nintendo is estimated to have sold over 40 million of them during the 1980s. Meanwhile they were also moving into the standup arcade; Donkey Kong, released in 1981, became a worldwide smash, introducing the Nintendo name to many in the United States for the first time. The designer of that cute, colorful, relatively non-violent game, a blueprint for the eventual Nintendo aesthetic as a whole, was one Shigeru Miyamoto. He would become not only Nintendo’s own most famous designer and public figure, but the most famous Japanese videogame designer of all time, full stop. The protagonist of Miyamoto’s Donkey Kong, a little leaping Italian plumber named Mario, was also destined for greatness as arguably the most famous videogame character of all time (his only serious rival is likely Pac-Man, another contemporaneous Japanese creation).

All of this success, however, was only laying the groundwork for Nintendo’s masterstroke. Moving on from the single-game units that had so far been Nintendo’s sole output, Yamauchi tasked his engineers with creating a proper videogame console capable of playing many games that could be sold separately in the form of cartridges, just like the Atari VCS. The device they came up with was hardly state of the art even at the time of its debut. It was built around a clone of the venerable old 8-bit MOS 6502, the same chip found in the Atari VCS as well as American home computers like the Apple II and Commodore 64, with those circuits that were protected by patents excised. It offered graphics a little better than the likes of the 64, sound a little worse. The new machine was being readied at seemingly the worst possible time: just as the Great Videogame Crash was underway in the United States, and just as the worldwide conventional wisdom was saying that home computers were the future, videogame consoles a brief-lived fad of the past. Yet Nintendo freely, even gleefully defied the conventional wisdom. The Nintendo Family Computer (“Famicom”) was deliberately designed to be as non-computer-like as possible. Instead it was patterned after Nintendo’s successful toys and gadgets — all bright, garish plastic, with as few switches and plugs as possible, certainly with nothing as complicated as a keyboard or disk drive. It looked like a toy because Nintendo designed it to look like a toy.

The Nintendo Famicom

Yamauchi realized that a successful videogame console was at least as much a question of perception — i.e., of marketing — as it was of technology. In the imploding Atari, he had the one great counterexample he needed, a perfect model of what not to do. Atari’s biggest sin in Yamauchi’s eyes had been to fail to properly lock down the VCS. It had never occurred to them that third parties could start making games for “their” machine, until Activision started doing just that in 1980, to be followed by hundreds more. Not only had all of those third-party cartridges cost Atari hundreds of millions in the games of their own that they didn’t sell and the potential licensing fees that they didn’t collect, they had also gravely damaged the image of their platform: many or most Atari VCS games were just plain bad, and some were in devastatingly terrible taste to boot. The public at large, Yamauchi realized, didn’t parse fine distinctions between a game console and the games it played. He was determined not to lose control of his brand the way Atari had lost control of theirs.

For better and for worse, that determination led to Nintendo becoming the first of the great walled gardens in consumer software. The “better” from the standpoint of consumers was a measure of quality control, an assurance that any game they bought for their console would be a pretty good, polished, playable game. And from the standpoint of Yamauchi the “better” was of course that Nintendo got a cut of every single one of those games’ earnings, enough to let him think of the console itself as little more than a loss leader for the real business of making and licensing cartridges: “Forgo the big profits on the hardware because it is really just a tool to sell software. That is where we shall make our money.” The “worse” was far less diversity in theme, content, and mechanics, and a complete void of games willing to actually say almost anything at all about the world, lest they say something that some potential customer somewhere might possibly construe as offensive. The result would be an infantilization of the nascent medium in the eyes of mainstream consumers, an infantilization from which it has arguably never entirely escaped.

Whatever the reservations of curmudgeons like me, however, the walled-garden model of software distribution proved successful even beyond Yamauchi’s wildest dreams. After releasing their new console to Japanese consumers on July 15, 1983, Nintendo sold more than 2.5 million of them in the first eighteen months alone. Sales only increased as the years went by, even as the hardware continued to grow more and more technically obsolete. Consumers didn’t care about that. They cared about all those cute, colorful, addictive games, some produced by an ever-widening circle of outside licensees, others — including many or most of the best and best-remembered — by Nintendo’s own crack in-house development team, with that indefatigable fount of creativity named Shigeru Miyamoto leading the way. Just as Yamauchi had predicted, the real money in the Famicom was in the software that was sold for it.

Minoru Arakawa

With the Famicom a huge success in Japan, there now beckoned that ultimate market for any ambitious up-and-comer: the United States. Yamauchi had already set up a subsidiary there called Nintendo of America back in 1980, under the stewardship of his son-in-law Minoru Arakawa. Concerns about nepotism aside — no matter how big it got, Nintendo would always remain the Yamauchi family business — Arakawa was ideal for the job: an MIT-educated fluent English-speaker who had traveled extensively around the country and grown to understand and love its people and their way of life. Under his stewardship, Nintendo of America did very well in the early years on the back of Donkey Kong and other standup-arcade games.

Yet Nintendo as a whole hesitated for quite some time at the prospect of introducing the Famicom to North America. When Arakawa canvassed toy stores, the hostility he encountered to the very idea of another videogame console was palpable. Atari had damaged or destroyed many a business and many a life on the way down, and few drew much of a distinction between Atari and the videogame market as a whole. According to one executive, “it would be easier to sell Popsicles in the Arctic” than to convince the toy stores to take a flyer on another console.

But Arakawa, working in tandem with two American executive recruits who would become known as “the two Howards” — Howard Lincoln and Howard Philips — wouldn’t let go of the idea. Responding to focus-group surveys that said the Japanese Famicom was too toy-like and too, well, foreign-looking to succeed in the United States, he got Nintendo’s engineers to redesign the externals to be less bulbous, less garish, and less shiny. He also gave the Famicom a new, less cutesy name: the Nintendo Entertainment System, or NES. The only significant technical update Nintendo made for North America was a new state-of-the-art handshaking system for making sure that every cartridge was a legitimate, licensed Nintendo game; black-market cartridges duplicated by tiny companies who hoped to fly under the radar of Nintendo’s stringent licensing regime had become a real problem on the Famicom. Tellingly, the lockout system was by far the most technically advanced aspect of the NES.

The Nintendo Entertainment System

The new NES made its public debut at last at the Summer Consumer Electronics Show in June of 1985. Few in the home-computer trade press — the videogame trade press didn’t really exist anymore — paid it any real attention. The big news of the show was rather the new Jack Tramiel-led Atari’s 16-bit ST computer. Computer Gaming World was typical, mentioning the NES only as a passing bit of trivia at the end of a long CES feature article: “Nintendo even offered an entirely new game system.” Clearly Arakawa and company had an uphill climb before them.

They deliberately started small. They would sell the NES first in New York City only — chosen because Arakawa considered it the most cynical and challenging place to market a new gadget in the country, and, as the old song says, “if you can make it there you can make it anywhere.” Starting with a warehouse full of the first 100,000 NESs to arrive from Japan and a $50 million war chest, Arakawa and the two Howards personally visited virtually every toy and electronics store in the five boroughs to press flesh and demonstrate the NES to skeptical managers and proprietors — and (hopefully) to take orders when they were finished. Meanwhile Nintendo blitzed the airwaves with advertising. They managed to sell 50,000 NESs in New York alone that Christmas season — not bad for an unknown gadget in a field that everyone, from the most rarefied pundit to the most ordinary Joe or Jane on the street, considered to be yesterday’s fad.

From that promising start they steadily expanded: first to that other taste-maker capital Los Angeles, then to Chicago, to San Francisco, to Dallas and Houston, and finally nationwide. Sales hit the magic 1 million mark well before the end of 1986. Cheap and cheerful and effortless in its lack of fiddly disk drives and keyboards, the NES was selling by that point as well as the Commodore 64, and far better than any other home computer. In the NES’s second year on the market it eclipsed them all to such an extent as to make continued comparison almost pointless: 3 million NESs were sold during those twelve months alone. And, astonishingly, it was still just getting started. During 1988, 7 million NESs were sold, to go with 33 million cartridges, each of which represented yet more profit for Nintendo. Lifetime NES sales topped 30 million in 1990, by which time one out of every three American homes could boast one of these unassuming gray boxes perched underneath the television. Total NES and Famicom lifetime sales reached a staggering 75 million in 1992; as many Nintendos were by then in the world as all PCs, whether found in homes or businesses or schools, combined. Even the Atari VCS in the heyday of the first videogame fad had never been able to boast of numbers like this.

Because Nintendo had come into the console market when it was universally considered dead, they had been able to reinvent it entirely in their own image. Just as “Atari” had once been a synonym for videogames in general, now “Nintendo” threatened to become the same for a new generation of players. Savvy about branding and marketing in a way that Atari had never quite managed to be, Nintendo felt compelled to actively push against this trend by aggressively protecting and limiting the use of their trademarks; they didn’t want people buying a new “Nintendo” that happened to have the name of Sega, Sony, or 3DO stamped on its case.

Nintendo’s penetration of the North American market could serve (and doubtless has served) as the basis of an MBA course in marketing and brand-building. Starting from the less than nothing of a dead industry replete with consumer ill-will, coming from a foreign nation that was viewed with fear and mistrust by many Americans, Nintendo of America built one of the largest and most insanely loyal customer bases the American economy has ever known. They did it by tying their own brand to brands their target demographic was known to already love, like Pepsi and McDonald’s. They did it by building Nintendo stores within stores in major chains from Macy’s to Toys “R” Us, where kids could browse and play under the benevolent gaze of Mario while their parents shopped. (By 1991, Nintendo alone represented 20 percent of Toys “R” Us’s total revenues, and seven of their ten best-selling single products.) They did it by building a massive mailing list from the warranty cards that their young customers sent in, then using contests and giveaways to make every single one of them feel like a valued member of the new Generation Nintendo. They did it by publishing a glossy magazine, Nintendo Power, full of hints and tips on the latest games and all the latest news on what was coming next from Nintendo (and nothing on what was coming from their competitors). They did it by setting up a hotline of “Nintendo Game Counselors,” hundreds of them working at any one time to answer youngsters’ questions about how to get through this tricky level or kill that monster. They did it by relentlessly data-mining to find out what their customers liked about their games and what they didn’t, and crafting new releases to hit as many players as possible precisely in their sweet spots. They did it by spending up to $5 million on a single 30-second television commercial, four or five times the typical going rate, making the commercials for a new Nintendo game an event in themselves. They did it by making sure that Mario and Zelda and their other iconic characters were everywhere, from television shows to records, from lunch boxes to bed sheets. And they did it by never worrying their customers with the sorts of metrics that the home-computer makers loved: kilobytes and megabytes and colors and resolutions and clock speeds and bit counts. The NES was so thoroughly locked down that it was years before there was any published information available at all on what was really contained within those ubiquitous gray plastic shells.

If it can all sound a little soulless when laid out like that, well, few in business would argue with the end results. Nintendo seemed to be becoming more American than most Americana. “A boy between 8 and 15 without a Nintendo is like a boy without a baseball glove,” wrote Hobby World magazine in 1988. In 1990 a survey found Mario to be more recognizable to American children than that most American of all cartoon icons — Mickey Mouse.

And where did all of this leave the established American computer-game industry? That was a question that plenty in said industry itself were asking with ever-increasing frustration and even desperation. Sales of computer games published on all platforms in 1989 totaled about $230 million; sales of Nintendo cartridges, $1.5 billion. It wasn’t supposed to have gone like this. No one in computer games had seen anything like Nintendo coming. They, the computer-game industry, were supposed to have been the next big wave in American home entertainment — a chicken in every pot and a home computer in every living room. Instead this Japanese upstart had stolen their thunder to such an extent as to render their entire industry an afterthought, a veritable non-entity in the eyes of most financial analysts and venture capitalists. Just to add insult to injury, they were being smothered by thoroughly obsolete 8-bit technology when they could offer consumers audiovisual feasts played on Amigas and Atari STs and IBM PS/2s with VGA graphics. A computer-game designer with Electronic Arts saw unnerving parallels between his own industry and another American industry that had been devastated by Japan in the previous decade:

The best companies and the best programmers were making computer games. But the Nintendo player didn’t care about the sophisticated leaps we were making on computers — the frame rate of the images or incredible sound. They just wanted fun. It was like we were making gas guzzlers and the Japanese were making subcompacts.

At street level the situation didn’t look much better. Fred D’Ignazio, a columnist for Compute!’s Gazette, shares a typical story:

My kids and I used to play games on our home computer — games like Epyx’s The Legend of Blacksilver, SSI’s Questron II, EA’s Jordan vs. Bird: One-on-One, Gamestar’s Take Down, Arcadia’s Aaargh!, and, of course, gobs and gobs of good educational games.

Then the Nintendo landed, and things haven’t been the same since. The Nintendo runs day and night. (We’re not even allowed to shut off the machine when we go to bed because there’s always a game in progress — and there’s no disk drive to back it up.) Meanwhile, I don’t think our little home computer has been fired up in weeks.

The computer that was most damaged by Nintendo’s invasion of North America was undoubtedly the Commodore 64. It was very cheap in computer terms, but once you added in the cost of the essential disk drive it was nowhere near as cheap as the NES. And it was still a computer, even if a computer that had long been used primarily for playing games. You had to type in arcane commands to get a game started, had to wait for the game to load, often had to shuffle disks in and out of the drive and do a lot more waiting as you actually played. A Compute!’s Gazette reader shares the story of her attempt to introduce her Nintendo-loving eight-year-old nephew to the joys of Commodore 64 gaming:

As he looked through my 64 software to pick out a game, I started to give directions on how to handle the software and disk drive. Before I could finish he said, “I just want to use a cartridge and start playing.” After about fifteen minutes into a game he said, “This is great, but how come it takes so long to start the game again and why do I have to keep turning the disk over and over all the time?” Shortly after, he started complaining that his hand was too small for the joystick. He tried three other joysticks, but he either had the same problem or the joystick didn’t have the dexterity needed to play the game. He then said, “I wish I could use my Nintendo controls on your Commodore.” Soon after, he quit and went right to his Nintendo.

The Commodore 64 was in a very difficult position, squeezed from below by Nintendo and squeezed from above by the Amiga and Atari ST and, most of all, by ever more consumer-friendly MS-DOS-based machines from companies like Tandy, which were beginning to sport hard disks, crisp VGA graphics, sound cards, and mice. There wasn’t much that Commodore’s aged little breadbox could offer in response to a feature set like that. In the battle versus Nintendo for the low end, meanwhile, all of the immense force of playground public opinion was arrayed against the Commodore 64. The 64 was clunky and slow and ugly. It was the machine your big brother used to play games on, the one your parents kept pushing you toward to learn programming or to play educational (blech!) games. The Nintendo was the machine that all your friends played on — the same friends who would look on you as a freak if you tried to get them to play a computer game with you.

If you think that hardcore Commodore 64 users accepted this changing world order peacefully, you don’t have much experience with the fanatic platform loyalties of the 1980s. Their heated opinions on the 64’s Nintendo crisis spilled much ink on the pages of the remaining 64-centric magazines, moving through spasms of denial (“If Nintendo has the ability to keep its users captured, why do my two nephews keep pestering me to let them play the games that I have for my 64?”), advice (“Commodore could bring out some new peripherals like a light gun to play shooting games or a keyboard to make use of the superior sound of the 64”), and justification (“This letter was typed on a 64. Let’s see any Nintendo do that!”). When all else failed, there was always good-old-fashioned name-calling: “The word-processing capability of the 64 is a pointless feature to most Ninnies, since the majority of them don’t seem to be able to read and write anyway. Most of the Ninny chic was built on the fact that a baboon could operate it.”

None of this raging against the dying of the light could make any difference. The Commodore 64 went into an undeniable decline in 1988. That decline became a free fall in 1989, and in 1990 the 64 was effectively declared dead by the American software industry, with virtually every publisher terminating support. The other great 8-bit survivor, the Apple II, hung on a little longer thanks to an entrenched user base in schools and small businesses, but when Apple finally discontinued all production of the line in 1993 the news was greeted by most publishers with a shrug: “I didn’t know those old things were still being made!”

The computer-game publishers’ reactions to Nintendo were complicated, ofttimes uncertain, occasionally downright contradictory. With Nintendo rapidly taking over what used to be the low end of the computer-game market, many publishers felt emboldened to refocus their energies on the still slowly growing higher end, particularly on all those new consumer-oriented clones from Tandy and others. Plenty of publishers, it must be said, weren’t really all that sad to see the 64 go. The platform had always been tricky to develop for, and its parent company was still widely loathed for heaps of very good reasons; everyone in the industry seemed to have at least one Commodore horror story to tell. Many had come to see the 64 during its years of dominance as an albatross holding back ambitions that would have been realizable on the bigger, more powerful platforms. Now they were at last free to pursue those grander schemes.

At the same time, though, the Commodore 64 had been their cash cow for years, and there remained the question of whether and how soon all those bigger machines would make up for its loss. Certainly they failed resoundingly to take up the slack in 1989, a bad year for the computer-game industry and a great one for Nintendo.

As unhappy as the majority of industry old-timers remained with the Nintendo-dominated state of affairs in digital games in general, that $1.5 billion in annual cartridge revenue and massive mainstream penetration was awfully tempting. As early as 1988, it seemed that just about everyone was discussing adapting their computer games to the NES, and a fair number were swallowing their pride to approach Nintendo with hat in hand, asking for a coveted license to make NES games. In addition to the sheer size of the Nintendo market, it also had the advantage that piracy, which many in the computer-game industry continued to believe was costing them at least half of the revenues they would otherwise be enjoying, was nonexistent there thanks to those uncopyable cartridges and the NES’s elaborate lockout system.

Activision,[1] who had enjoyed their greatest success by far in the old glory days of the Atari VCS, jumped onto the Nintendo bandwagon with perhaps the most enthusiasm of all. Activision’s head, the supremely unsentimental Bruce Davis, often sounded as if he would be perfectly happy to abandon computers altogether, to make Activision exclusively a publisher of videogame cartridges again: “If hardware companies are designing a machine for one purpose, they will do a better job than on a multi-function machine.”

But it’s the more unlikely NES converts that provide the best evidence of just how far Nintendo had come and just how much pressure the traditional computer-game industry was feeling. The NES began to get quite a number of ports of computer-game fare that no one would ever have imagined trying to put on a machine like this just a year or two earlier. Origin, for instance, put out NES versions of Ultima III and Ultima IV, and Lucasfilm Games ported Maniac Mansion. (See Douglas Crockford’s “The Expurgation of Maniac Mansion” for a description of the hoops publishers like Lucasfilm had to jump through to meet Nintendo’s stringent content restrictions.) Even SSI, whose traditional stock-in-trade of turn-based, cerebral, complicated strategy games was about as far from the whimsy of Mario and Zelda as you could get, moved Pool of Radiance over to the NES. Computer Gaming World, the journal of choice for those same cerebral strategy gamers, tried to rope in Mario fans with a new magazine-within-a-magazine they dubbed “Video Gaming World.”

Few of these initiatives bore all that much fruit. The publishers may have found a way to get their games onto the NES, but said games remained far from the sort of fare most Nintendo players were interested in; suffice it to say that Nintendo never had to worry about any of these titles eclipsing Mario. Still, the fact that so many computer-game publishers were making such an effort shows how scary and uncertain Nintendo was making their world. Perhaps the most telling moment of all came when Trip Hawkins announced that Electronic Arts would be jumping into the console space as well. This was the same Trip Hawkins who had written a commitment to “stay with floppy-disk-based computers only” into Electronic Arts’s first business plan, who had preached the gospel of home computers as successors to videogame consoles as loudly and proudly as anyone in his industry. Now he and his company were singing a very different tune. Bing Gordon, Hawkins’s right-hand man at Electronic Arts, compared home computers to, of all unflattering things, steam engines. James Watt, the man who perfected the steam engine, had imagined one in every home, with a bunch of assorted pulleys and gears to make it do different things. Instead modern homes had a bunch of more specialized machines: washing machines, food processors… and now Nintendos. Soon Hawkins would leave Electronic Arts to found 3DO, a company to make… you guessed it, a new videogame console.

Some, however, chose a more belligerent path than these can’t-beat-’em joiners. Nintendo’s rigorous control of the NES’s walled garden rankled everyone in the older software industry; this just wasn’t how their business was done. They believed that Nintendo was guilty of restraint of trade, antitrust violations, you name it. Particularly enraging was Nintendo’s complete control of the manufacturing pipeline for NES cartridges. Leveraging those data-mining systems of theirs, more sophisticated than anyone had heretofore dreamed of, Nintendo made sure that the supply of new games was always slightly less than the demand for them, thereby building hype around each new title as a hot, desirable status symbol among the Nintendo Generation and, most of all, avoiding the glut of games piled up in warehouses — and, eventually, landfills — that had marked the Great Videogame Crash of 1983. But when American publishers saw their games produced in insufficient quantities to become the hits they believed they might otherwise have been, they cried foul. The Software Publishers Association served as the disgruntled voice of the American software industry as a whole in what became a full-scale public-relations war against Nintendo.

The SPA believes that Nintendo has, through its complete control and single-sourcing of cartridge manufacturing, engineered a shortage of Nintendo-compatible cartridges. Retailers, consumers, and independent software vendors have become frustrated by the unavailability of many titles during the holiday season, and believe that these shortages could be prevented by permitting software vendors to produce their own cartridges.

American publishers felt certain that Nintendo was playing favorites, favoring their own games and those of their favorite third-party publishers — generally the ones from Japan — by manipulating production numbers and manipulating the sentiments of Generation Nintendo through the coverage they gave (or didn’t give) each game in Nintendo Power. “If I pissed Nintendo off,” runs a typical complaint, “I would get less product. My games would get hit in Nintendo Power and they’d get low ratings.” And the most surefire way to piss Nintendo off, at least according to this complainer, was to release a game for the NES’s first serious competitor, the Sega Genesis console that entered the United States in 1989.

There was plenty of tinder already lying about the public sphere, just waiting to be ignited by such rhetoric. All of the concerns about videogames that had been voiced by parents, educators, and politicians during the heyday of Generation Atari were now being dusted off and applied to Generation Nintendo. Now, however, they were given additional force by Nintendo’s very foreignness. Plenty of Americans, many of whom had still not completely forgiven Japan for Pearl Harbor, saw a nefarious agenda behind it all, a fifth column of Mario-obsessed youngsters who might come to undermine the very nation. “Notice the way Super Mario is drawn,” wrote one in a letter to a magazine. “He has the eyes of someone who has been brainwashed.” Lurking just below the surface of such complaints, unstated but by no means unconveyed, were old attitudes toward the Japanese as shifty characters who could never be trusted to follow the rules, whether in war or business. It all came down to “cultural” differences, they muttered disingenuously: “There’s more of a sharing of the pie by American companies. In Japan, it’s different: winners win big and losers lose.”

Hoping to capitalize on the burgeoning anti-Nintendo sentiment, in December of 1988 Tengen Games, a spinoff of Atari Games (which was itself the successor to the standup-arcade portion of the original Atari’s business), sued Nintendo in federal court for antitrust violations and monopolistic practices: “The sole purpose of the lockout system is to lock out competition.” Having found a way to defeat the much-vaunted lockout system through a combination of industrial espionage, reverse engineering, and good old social engineering — this is one of the few occasions in Nintendo’s history where one might accuse them of having been naive — Tengen simultaneously launched a few of their own unauthorized games for the NES.

Nintendo’s counterattack against Tengen was massive and comprehensive. Not only did they launch the expected blizzard of legal actions, but they made it clear to all of the stores that handled their products that there would be grave consequences if they chose to sell the Tengen games as well. Such threats ironically represented a far more clear-cut antitrust violation than anything found in Tengen’s original suit. When Tengen got the court to order Nintendo to cease and desist from such behavior, Nintendo allegedly only became more subtle. “You know, we really like to support those who support Nintendo, and we’re not real happy that you’re carrying a Tengen product,” a rep might say. “By the way, why don’t we sit down and talk about product allocations for next quarter? How many Super Marios did you say you wanted?” “Since it was illegal, there were always excuses,” remembers one retailer. “The truck got lost, or the ship from Japan never arrived.”

Tengen was determined to try their case against Nintendo first and foremost in the court of American public opinion. “Who gave Nintendo the power to decide what software the American public can buy?” they asked. The New York Times, for one, agreed with them: “A verdict in favor of Nintendo would probably have a spillover effect into the personal-computer industry, where it could have a chilling effect on the free flow of ideas and innovations that have characterized that market since its inception.” An opportunistic Congressman named Dennis Eckart launched a high-profile crusade against Nintendo that led to lots of heated rhetoric amid Congressional hearings and the involvement of several state Attorneys General and the Federal Trade Commission. Jack Tramiel of the other Atari (the one currently making the Atari ST computer), who had always viewed lawsuits as healthy business competition by other means, piled on with a suit of his own, claiming that by monopolizing the market Nintendo was keeping his own company from getting good software for its machines. “Nintendo has demonstrated its disregard for free and fair competition in America,” said Jack’s son and anointed successor Sam Tramiel.

Yet the anti-Nintendo sentiment in the country didn’t ultimately do much to help either of the two Ataris’ legal cases; the courts proved willing to buck that rising tide. In a landmark ruling against Tengen in March of 1991, Judge Fern Smith stated that Nintendo had the right to “exclude others” from the NES if they so chose, thus providing the legal soil in which many more walled gardens would be planted in the years to come. Similarly, the Tramiels’ suit against Nintendo was definitively rejected in 1992, after having cost their company a great deal of time, energy, and most of all money it could ill afford. The other various and multifarious investigations into Nintendo’s business, of which there were far too many to summarize here, resulted in a mixed bag of vindications and modest slaps on the wrist that did nothing to alter Nintendo’s overall trajectory. Perhaps the best argument against Nintendo as a monopoly was the arrival of the company’s first competitors in the console space, beginning with Sega, who proved that it actually was still possible to carve out a non-Nintendo place of one’s own in the game-console industry that Nintendo had so recently resurrected.

Nintendo, then, was here to stay, as were Sega and other competitors still to come. The computer-game industry would just have to accept that and reckon with it as best they could. In the end, the threat from Japan proved not quite as apocalyptic as it had seemed during the darkest days of 1989. In 1990 computers could start to boast of a modest new buzz of their own, thanks to the new so-called “multimedia PCs” and a bunch of new games that took advantage of their capabilities. Having ceded the low ground to the consoles, computers had retained the high ground, a loyal constituency of slightly older, more affluent gamers who still had plenty of room in their hearts for the sort of big, high-concept strategy, adventure, and CRPG games that weren’t all that realizable on the more limited consoles. The computer-game industry returned to growth in 1990, and by a double-digit percentage at that. The vibrant jungle of PC gaming would continue to bloom in a thousand ways at once, some of them productive, some of them dead ends, some of them inspiring, some of them kind of repugnant. And through it all, that jungle would remain interesting in ways that, at least for this humble writer, the fussily manicured walled garden of Nintendo has never quite managed to be. But whichever mode of distribution you personally favored, one thing became clear as the 1980s gave way to the 1990s: neither Generation Nintendo nor the emerging Generation Wintel would be going anywhere anytime soon.

(Sources: The Making of the Atomic Bomb by Richard Rhodes; Game Over by David Sheff; Compute!’s Gazette of May 1988, March 1989, August 1989, September 1989, October 1989; Computer Gaming World of September/October 1985 and June 1988; Amazing Computing of January 1989; materials in the SSI and Brøderbund collections at the Strong Museum of Play.)

Footnotes

1 Activision changed their name to Mediagenic midstream in these events. Because I haven’t told the story behind that change yet, and in order to just generally avoid confusion, I simply refer to the company as “Activision” in this article.


Opening the Gold Box, Part 3: From Tabletop to Desktop

Joel Billings of SSI never had a whole lot of use for Dungeons & Dragons, TSR, or RPGs in general. In this he was hardly unique among hardcore wargamers. The newer hobby had arisen directly from the older, forcing each and every grognard to a judgement and a reckoning. Some wargamers saw in RPGs the experiential games they had really been wanting to play all along; they jumped onto the RPG bandwagon and never looked back. Others, the ones who found Montgomery and Rommel far more interesting than Frodo and Sauron, scoffed at RPGs and their silly fantasies and clung all the tighter to their Avalon Hill and SPI boxes. And of course some split the difference, playing a little of this and a little of that.

Joel counted himself among the scoffers. His one experience with playing Dungeons & Dragons hadn’t been a positive one: a sadistic Dungeon Master killed his whole party before he had even begun to figure out what was going on. “This is the stupidest game I’ve ever seen,” he concluded. He never felt seriously tempted to try it again.

By the time that SSI was off and running, Joel and other wargame stalwarts like him had more reasons than ever to dislike RPGs. The late 1970s, you’ll remember, had seen the wargame at its commercial zenith, the RPG the exciting, fast-rising upstart genre. As the 1980s dawned and Dungeons & Dragons exploded into a popularity no wargame had ever dreamed of, it was hard not to blame one genre’s rapid rise for the other’s slow decline. Already in 1982 SPI, alongside Avalon Hill one of the twin giants of wargaming, found themselves in a serious financial crisis brought on partly by the general decline of the wargame market, partly by the general recession afflicting the American economy at the time, and partly by general mismanagement all too typical of their hobbyist-driven industry. TSR, now more than ten times the size of SPI thanks to the Dungeons & Dragons fad, gave them a secured loan of $425,000 to keep their doors open a while longer.

It will likely never be known whether what happened next was the result of Machiavellian scheming or just Gary Gygax and the Blume brothers’ usual bumbling approach to running TSR. Just two weeks after giving SPI the loan, TSR inexplicably called it in again. Having already used TSR’s money to satisfy their other creditors, SPI had no possible way to pay back the loan. TSR therefore foreclosed, announcing that they were taking over SPI. Shortly thereafter, realizing that SPI was so financially upside down as to become a negative asset on their books, they announced that what they had actually meant to say was that they were assuming ownership of all of SPI’s assets but none of their debts. When SPI’s creditors balked at this brazen attempt by TSR to have their cake and eat it too, TSR negotiated to pay them off for pennies on the dollar; something was better than nothing, figured the creditors. The end result was an SPI bankruptcy filing in effect if not in fact.

But any old wargamer who thought that the TSR purchase heralded better days for the company and the hobby was quickly disabused of that notion. TSR proved a terrible steward of SPI’s legacy, alienating their entire old design team so badly that they left en masse to reform as a new Avalon Hill subsidiary called Victory Games. Worse, TSR claimed that their acquisition of SPI’s assets had not included the paid-up subscriptions to SPI’s beloved house organ Strategy & Tactics; subscriptions were not assets at all, you see, but “liabilities.” Every Strategy & Tactics subscriber, even those who had splashed out a bundle for a “lifetime” subscription, would have to re-up immediately to continue receiving the magazine. And no, there would be no compensation for missed issues from the old regime. This act of betrayal of SPI’s most loyal customers didn’t just kill the most respected wargaming magazine in the world; it also, as Greg Costikyan puts it, shot the old subculture of wargaming in general in the head.

So, if a veteran wargamer like Joel Billings needed further reason to dislike all this Dungeons & Dragons silliness, there he had it. Trip Hawkins, a member of SSI’s board from the company’s inception, claims that he started telling Joel that he should branch out into CRPGs almost immediately after SSI was founded. But, although SSI quickly began to supplement their wargames with sports titles and other sorts of strategy games, Joel resisted CRPGs, saying that he preferred to publish “the games that he enjoyed personally.” RPGs, whether played on the tabletop or the desktop, clearly weren’t in that category.

Although Joel did nothing to encourage CRPG submissions, in late 1983 a fairly decent one arrived of its own accord. Written by two teenage brothers, Charles and John Dougherty, Questron had already ping-ponged around the industry a bit before it reached SSI. When the Dougherty brothers had sent it to Origin Systems, Richard Garriott had not only rejected it but told them in no uncertain terms to expect legal trouble if they dared to release something he considered to be so obviously derivative of his own Ultima games. Word of Garriott’s displeasure may very well have made the other major publishers shy away, until it ended up with the Doughertys’ long-shot, nichey little SSI. Joel decided that, with a first entry in the genre all but gift-wrapped on his desk, he might as well dip a toe into these new waters and see how it went. SSI published Questron in February of 1984, albeit only after finding a way to placate an angry Garriott, who learned of their plans to do so at the January 1984 Winter Consumer Electronics Show and pitched a royal fit. Joel gave him a small stake in Questron‘s action and a small note on its box: “Game structure and style used under license of Richard Garriott.”

Questron

Questron proved a modest start to something very significant. The game, benefiting from the lack of new Ultima or Wizardry titles during 1984, did unexpectedly well. In fact, when the Commodore 64 port of the Apple II original shipped in August, it became the fastest-selling new release SSI had ever enjoyed. The final total would hit almost 35,000 copies, pretty good numbers for a company whose average game still failed to break 10,000 copies. Some meeting notes dated December 2, 1984, make the new thinking that resulted clear: “Going into fantasy games now, could really affect sales favorably.” A little over a month later, SSI was already going through something of an identity crisis: are we a “wargame company” or a more generalized “computer-game company,” more meeting notes plaintively ask.

But SSI would have a hard time building on the momentum of Questron in the time-honored game-industry way of turning it into a franchise. In the contract the Dougherty brothers had signed with SSI, the latter was granted a right of first refusal of a potential sequel. This put the Doughertys in essentially the same situation as a restricted free agent in sports: they were free to shop a potential Questron II to other publishers if they wished, but they had to allow SSI the chance to match any publisher’s offer before signing a final contract. Not understanding or choosing to ignore this stipulation, the Doughertys allowed themselves to be poached by none other than Trip Hawkins’s Electronic Arts, who, with The Bard’s Tale series still in the offing, were eager to hedge their bets with another potential new CRPG franchise. SSI knew nothing about what was going on until the Doughertys announced that they had gone over to the slicker, better-distributed Electronic Arts — farewell and thank you very much for everything. Feeling compelled to defend his own company’s interests, Joel sued Electronic Arts and the Doughertys. A potential Questron series remained in limbo, its momentum dissipating, while the lawsuit dragged on. The situation doubtless made for some strained times back at SSI’s offices, where board-member Trip Hawkins was still coming every month for the directors meeting.

The suit wasn’t settled until April of 1987, ostensibly at least largely in SSI’s favor. The Doughertys’ long-delayed sequel was published shortly thereafter by Electronic Arts, but under the new title of Legacy of the Ancients. Meanwhile the Doughertys were obliged to design, but not to program, a Questron II for SSI; the programming of the sequel could either be done in-house by SSI or outsourced elsewhere at their discretion. It ended up going to Westwood Associates, a frequent SSI contractor on ports and other unglamorous technical tasks who would soon be making a bigger name for themselves as a developer of original games. Released at last in February of 1988, Questron II felt rather uninspired, as one might expect given the forced circumstances of its creation. It did surprisingly well, though, outselling the first Questron by some 16,000 copies. Rather than its own merits, its success was likely down to increasing enthusiasm for CRPGs in general among gamers, and to other things going on that year that were suddenly making little SSI among the biggest names in the genre.

Questron II

In the immediate wake of Questron I‘s release and success, however, those events were still well in the future. Neither Joel Billings’s troubles with his two teenage problem children nor his personal ambivalence toward CRPGs deterred him from recognizing the potential that game had highlighted. Never a publisher to shy away from releasing lots of games, SSI added CRPGs to their ongoing firehose of new wargames. To Joel Billings the businessman’s pleasure if perhaps to Joel Billings the wargamer’s chagrin, the average SSI CRPG continued to do far, far better than the average wargame. Indeed, their very next CRPG(ish) game after Questron, an unusual action hybrid called Gemstone Warrior released in December of 1984, became their first game of any type to top 50,000 copies sold. The more traditional Phantasie — names weren’t really SSI’s strong suit — in March of 1985 also topped the magic 50,000 mark. Soon the CRPGs were coming almost as quickly as the wargames: Rings of Zilfin (January 1986, 17,479 sold); Phantasie II (February 1986, 30,100 sold); Wizard’s Crown (February 1986, 47,676 sold); Shard of Spring (July 1986, 11,942 sold); Roadwar 2000 (August 1986, 44,044 sold); Gemstone Healer (September 1986, 6030 sold); Realms of Darkness (February 1987, 9022 sold); Phantasie III (March 1987, 46,113 sold); The Eternal Dagger (June 1987, 18,471 sold); Roadwar Europa (July 1987, 18,765 sold).

As the list above attests, sales figures for these games were all over the place, but trended generally a bit downward over time as SSI flooded the market. Yet one thing did remain constant: the average SSI CRPG continued to outsell the average SSI wargame by a healthy margin. (The only exception to this rule was Roger Damon’s remarkable Wargame Construction Set, which after its release in October of 1986 became a surprise hit, the first SSI game to crack 60,000 copies sold.) All of these SSI CRPGs — so many coming so close together that it’s difficult even for dedicated fans of the genre’s history to keep them all straight — occupied a comfortable if less than prestigious second rung in the industry as a whole. To describe them as the games you played while you waited for the next Ultima or The Bard’s Tale may sound unkind, but it’s largely accurate. Like SSI’s other games, they tended to be a little bit uglier and a little bit clunkier than the competition.

Wizard's Crown

At their best, though, the rules behind these games felt more consciously designed than the games in the bigger, more respected series — doubtless a legacy of SSI’s wargame roots. This quality is most notable in Wizard’s Crown. The most wargamey of all SSI’s CRPGs, Wizard’s Crown was not coincidentally also the first CRPG to be designed in-house by the company’s own small staff of developers, led by Paul Murray and Keith Brors, the two most devoted tabletop Dungeons & Dragons fans in the office. Built around a combat engine of enormous tactical depth in comparison to Ultima and The Bard’s Tale, it may not be a sustainedly fun game — the sheer quantity and detail of the fights gets exhausting well before the end, and the game has little else to offer — but it’s one of real importance in the history of both SSI and the CRPG. Wizard’s Crown and its sequel The Eternal Dagger, you see, were essentially a dry run for the series of games that would remake SSI’s image.

Coming off a disappointing 1986, the first year in which SSI had failed to increase their earnings over the previous year, Joel Billings was greeted with some news that was rapidly sweeping the industry: that TSR was interested in making a Dungeons & Dragons computer game, and that they would soon be listening to pitches from interested parties. To say that Dungeons & Dragons was a desirable license hardly begins to state the case. This was the license in CRPGs, the name that inexplicably wasn’t there already, a yawning absence about to become a smothering presence at last. Everyone wanted it, and had wanted it for quite some time. That group included SSI as much as anyone; once again pushing aside any misgivings about getting into bed with the company that had shot his own favorite hobby in the head, Joel had been one of the many to contact TSR in earlier years, asking if they were interested in a licensing deal. They hadn’t been then, but now they suddenly were. Encouraged by Murray and Brors and other rabid Dungeons & Dragons fans around the office, Joel decided to put on a “full-court press,” as he describes it, to spare no effort in trying to get the deal for his own little company. Sure, it looked like one David versus a whole lot of Goliaths, but what the hell, right?

The full list of Goliaths with which SSI was competing for the license has never been published, but in interviews Joel has mentioned Origin Systems (of Ultima fame) and Electronic Arts (of The Bard’s Tale fame) as having been among them. As for the other contenders, we do know that there were at least seven more of them. One need only understand the desirability of the license to assume that the seven (or more) must have been a veritable computer-game who’s who. “We were going head to head with the best in the industry,” remembers Chuck Kroegel, a programmer and project manager on SSI’s in-house development team.

SSI was duly granted their hearing, scheduled for April 8, 1987, at TSR’s Lake Geneva, Wisconsin, headquarters. With a scant handful of weeks to prepare, they scrambled desperately to throw together some technology demos; these felt unusually important to SSI’s pitch, given that they were hardly known as a producer of slick or graphically impressive games. Those with a modicum of artistic talent digitized some monster portraits out of the Monster Manual on a Commodore Amiga, coloring them and adding some spot animation. Meanwhile the programmers put together a scrolling three-dimensional dungeon maze, reminiscent of The Bard’s Tale but better (at least by SSI’s own reckoning), on a Commodore 64.

But it was always understood that these hasty demos were only a prerequisite for making a pitch, a way to show that SSI had the minimal competency to do this stuff rather than a real selling point. When SSI’s five-man team — consisting of Joel Billings, Keith Brors, Chuck Kroegel, the newly hired head of internal development Victor Penman, and Vice President of Sales Randy Broweleit — boarded their plane for Lake Geneva, they were determined to really sell TSR on a vision: a vision of not just a game or two but a whole new computerized wing of Dungeons & Dragons that might someday equal or eclipse the tabletop variant. The pitch document that accompanied their presentation has been preserved in the SSI archive at the Strong Museum of Play. I want to quote its key paragraphs, the “Overview,” in full.

The Advanced Dungeons & Dragons computer game system would be provided as a series of modules built around a central character-creation, combat, and magic system. The first release would be this central system, which would include a modest dungeon adventure. It would be followed by the release of a number of adventure modules suitable for beginning-level characters. With the passage of time, higher-level adventures and more character types would be offered. Editors which would permit users to create their own dungeons, outdoors, and cities would also be provided. The timing on the introduction of these later releases would be determined by market demand.

The first release would be the central system. It would be similar to the Player’s Handbook in that it would provide for the creation of a number of character classes, combat, and spells. The players would draw on these abilities to create their characters for adventuring. Also included in this first release would be an introductory dungeon adventure in which the computer program would perform as DM.

This first release would be followed by a number of adventure games similar to TSR’s dungeon and adventure modules. The earliest of these would be aimed at beginning characters. As time passed and players had an opportunity to build up more powerful characters, more challenging modules would be released.

It is anticipated that at least three game sets will be released as a result of periodic improvements in and expansions of the game system. Each of these would be built on an improved and expanded version of the central system. The systems would be kept upwardly compatible so that characters developed on earlier versions of the system could take advantage of its improvements. Dungeon and adventure modules would be created for each of these game sets.

At some point (to be determined by marketing considerations) a number of editors would be released. These editors would enable the users to create their own computer adventures. The first of these would be a Dungeon Master’s Guide-type package, which would provide instructions and tools for setting up the adventures and a Monster Manual-type package to provide monsters for these adventures (the monster disk might be released much earlier since we can see non-DMs wanting it). Specialized packages for creating outdoor adventures, city adventures, overland adventures, seafaring adventures, underwater adventures, etc., would be added to meet market demand.


SSI’s original plan for a Dungeons & Dragons “product family,” as presented at their pitch. You can see glimmers of what would come later here — the eventual “Gold Box” line of CRPGs would be grouped into three separate series, each offering the chance to import characters from one game into the next — but the idea of a central “game disk” and add-on “adventure modules” would be thankfully abandoned.

In some ways, what this overview offers is a terrible vision. The Wizardry series had opted for a similar overly literal translation of Dungeons & Dragons‘s core-game/adventure-module structure, requiring anyone who wanted to play any of the later games in the series to first buy and play the first in order to have characters to import. The fallout from that decision was all too easy to spot in the merest glance at the CRPG market as of 1987: the Wizardry series had long since pissed away the position of dominance it had enjoyed after its first game to become an also-ran (much like SSI’s own CRPG efforts) to Ultima and The Bard’s Tale.

On the other hand, though, this overview is a vision, which apparently set it in marked contrast to most other pitches, focused as they were on just getting a single Dungeons & Dragons game out there as quickly as possible so everyone could start to clean up. TSR instinctively understood SSI’s more holistic approach. With the early 1980s Dungeons & Dragons fad now long past, their business model relied less on selling huge quantities of any one release than on leveraging — some would say “exploiting” — their remaining base of hardcore players, each of whom was willing to spend lots of money on lots of new products.

Further, the TSR people and the SSI people immediately liked and understood one another; the importance of being on the same psychological wavelength as a potential business partner should never be underestimated. Born out of wargames, TSR seemed to have that culture and its values entwined in their very DNA, even after the ugly SPI episode and all the rest of the chaos of the past decade and change. Many of the people there knew exactly where scruffy little SSI was coming from, born and still grounded in the culture of the tabletop as they were. These same folks at TSR weren’t so sure about all those bigger, slicker firms. While Joel Billings may not have had a lot of personal use for Dungeons & Dragons, that certainly wasn’t true of many of his employees. Joel claims that the “bottom line” that sold TSR on SSI was “an R&D staff that knows AD&D games, plays AD&D games, and enjoys AD&D games.” They would feel “honored to be doing computer AD&D games. If you’re doing fantasy games, the AD&D game is the one to do.” Chuck Kroegel sums up SSI’s biggest advantage over their competitors in fewer words: “We wanted this project more than the other companies.” That genuine personal interest and passion, along with SSI’s idea that this would be a big, ambitious, multi-layered, perhaps era-defining collaboration — TSR had never been known for thinking small — were the important things. The details could be worked out later.

At the Summer Consumer Electronics Show in June — yes, it’s that landmark CES again — SSI and TSR announced their unlikely partnership, formally signing the contract right there at the show in front of the press and SSI’s shocked rivals. The contract was for five years of Dungeons & Dragons software, with options to renew thereafter. It would officially go into effect on January 1, 1988, although development of a planned torrent of products would start immediately.

There would be three distinct Advanced Dungeons & Dragons product lines. One line, which was conjured up out of whole cloth during the negotiations, would be a series of “multi-player action/arcade games” that used settings and characters from TSR’s various novels and supplements, but otherwise had little to do with the tabletop game: “These games will focus on special aspects of AD&D, such as swordplay, spell-casting, and dungeon and wilderness exploration.” Having no particular competence in the area of action games, SSI would sub-contract with their European publisher, U.S. Gold, to make these games, drawing from the deep well of hotshot British game programmers to which U.S. Gold had access.

Another line evolved out of SSI’s original plan for a sort of “Dungeons & Dragons Construction Set.” Instead of letting Dungeon Masters make new computerized adventures — SSI and TSR, like many other companies, were worried about killing the market for future games by putting game-making tools that were too good into the hands of players — the Dungeon Masters Assistant line would be designed to aid in the construction of adventures and campaigns for the tabletop game.

And finally there was the big line: a full-fledged implementation of Advanced Dungeons & Dragons as a series of CRPGs. The idea of a “central system” with “adventure modules” blessedly disappeared within a few months of the contract signing, replaced by a series of standalone games that would allow those who wished to do so to import the same party into each sequel; those who didn’t wish to do so, or who hadn’t played the earlier games at all, would still be able to create new characters in the later games.

The choice of a partner for this high-profile deal had been driven entirely by the creative types at TSR and the kinship they felt for SSI. That’s doubly surprising when you consider that it occurred well into the reign of Lorraine Williams, whose supposed dislike of games and gamers and constant meddling in the design process would later win her an infamous place in fan legend as the most loathed real-life villain in the history of the tabletop RPG. Whatever the veracity of the other claims made against her, in this case she ignored lots of very sensible questions to let her creative people have the partner they wanted. Could nichey little SSI improve their marketing and distribution enough to get the games in front of as many potential customers as someone like Electronic Arts? Could SSI raise the standards of their graphics and programming to make something attractive and slick enough to match the appeal of the Dungeons & Dragons trademark? In short, was SSI really up to this huge project, many times greater in scope than anything they’d done before? Lorraine Williams was betting five years of her flagship brand’s future, the most precious thing TSR owned, on the answer to all of these questions being yes. It was one hell of a roll of the dice.


SSI was more than ready to crow about their coup from the moment the contract was signed.

If SSI was to pull it off, they would have to mortgage their hopefully bright future as the software face of Dungeons & Dragons and expand dramatically. In the months following the contract-signing ceremony, their in-house development staff expanded from 7 to 25 people. Among the new hires were SSI’s first full-time pixel artists, hired to give the new products a look worthy of the license. SSI’s games having never been the sort to wow anyone with their beauty, figuring out the graphics thing presented perhaps the greatest challenge of all, as Victor Penman recognized:

In the past, when SSI was primarily a wargames company, graphics were not as important as game play. Now the graphics will be better, making this product more of an improvement than any other. We’re committed to carrying out state-of-the-art graphics all the way down the line, so we’re dedicated to game sophistication and a new level of graphics more so than anything we’ve done to date.

With the action games outsourced to U.S. Gold and the Dungeon Masters Assistant line being less demanding projects likely to be of only niche appeal anyway, the big push at SSI was on the first full-fledged Dungeons & Dragons CRPG. The new project used the two Wizard’s Crown games, especially those games’ intricate tactical-combat system, as a jumping-off point; most of the SSI veterans who had worked on those games were now employed on this new one. But that could only be a jumping-off point, for SSI’s plans needed to be much more ambitious now to please both TSR and the gaming public, who would expect this first real Dungeons & Dragons CRPG to be something really, truly special. As the first CRPG of a series that would come to include many more, a whole software ecosystem needed to be built from scratch to create it. A multi-platform game engine, interpreters, scripting languages, and level editors were all needed just for starters.

In a move that SSI would soon have cause to regret, the tool chain was built around the Commodore 64, then enjoying its belated final year as the American home-computer industry’s dominant platform. The choice isn’t hard to understand in the context of 1987: the 64 had been around for so long and remained so strong that one could almost believe it would continue forever. SSI had sold 35 percent of all their games on the Commodore 64 during 1986, 10 percent more than its closest rival, the Apple II. If anything, these numbers were low for the industry in general, reflecting SSI’s specialization in cerebral strategy games, traditionally a bastion of the Apple II market. With this new partnership, SSI’s bid for the big time, there seemed every reason to think that the 64’s percentage of the pie would only increase. Therefore they would build and release the Dungeons & Dragons games first on the Commodore 64, ensuring that they looked and ran well on that all-important platform. Then they could adapt the same engine to run on the other, often more capable platforms.

The arrival of Dungeons & Dragons at SSI and the dramatic upending of the daily routine that it wrought created inevitable tensions at what had always been a low-key, workmanlike operation. The minority of staffers assigned to the non-Dungeons & Dragons business-as-usual — i.e., the company’s wargames and the last sprinkling of non-licensed CRPGs in the pipeline — started to feel, in the words of Chuck Kroegel, like “outcasts.” Staffers referred to themselves as either working in Disneyland (everything Dungeons & Dragons) or being exiled to Siberia (everything non-Dungeons & Dragons). Sometimes those descriptions could feel distressingly literal: desperate for space, SSI exiled the small team that tested and perfected non-Dungeons & Dragons external submissions to an unheated, cheerless nearby building. “There was a feeling on their part that we were getting all the goodies and they got all the cold Arctic air,” remembers Keith Brors.

Jim Ward, who got on fabulously with SSI, visits along with his plus-one in 1990 to celebrate the company’s tenth anniversary.

The folks in Disneyland got plenty of help from Lake Geneva. In the beginning the TSR/SSI partnership really was a partnership, setting it in marked contrast to most similar licensing deals. The scenario for the first Dungeons & Dragons CRPG was first written and designed as a tabletop adventure module by three of TSR’s most experienced staff designers, working under one Jim Ward, whose own history with Dungeons & Dragons went back to well before that name existed, when he had played in Gary Gygax’s earliest campaigns. The tabletop module was passed on to SSI for implementation on the computer in January of 1988. SSI had their hands plenty full before that date just getting the game engine up and running; that job was described by Victor Penman as “equivalent to producing the Player’s Handbook, the Dungeon Master’s Guide, and the Monster Manual in one program.”

TSR’s close involvement ensured that the end result really did feel like tabletop Dungeons & Dragons, more so than any of the competing CRPG series — and this, of course, was exactly what its audience wanted. Ward’s team chose to set the game in TSR’s new campaign world of the Forgotten Realms, envisioned as the more generic, default alternative to the popular but quirky Dragonlance world of Krynn. The big boxed set that introduced the Forgotten Realms was published well after the contract signing with SSI, allowing TSR to carve out a space on the world’s map reserved for the computer games right from the outset. While many have grumbled that words like “generic” and “default” do all too good a job of describing the Forgotten Realms — “vanilla” is another strong candidate — Ward and company nevertheless drowned their scenario in the lore of the place, such as it is, leading to a CRPG with a sense of place comparable only to the Ultima series and its world of Britannia. To further cement the connection between Dungeons & Dragons the tabletop game and its computerized implementation, TSR prepared tie-in products of their own, including a novelization of the first CRPG written by Jim Ward with the help of Jane Cooper Hong and the original tabletop adventure module that had served as SSI’s design document.

SSI had promised TSR when making their original pitch that they could have an official Dungeons & Dragons CRPG ready to go within thirteen months of signing a deal, at the outside. Joel Billings always took great pride in his company’s punctuality. Lingering, “troubled” projects of any stripe were a virtual unknown there during the 1980s; outside and in-house developers alike quickly learned to just get their games done and move on to the next if they wanted to continue to work with SSI. Dungeons & Dragons proved to be no exception. SSI would manage to meet their deadline of summer 1988.

With the big day drawing near, Joel Billings took an important step to address the still-lingering questions about whether SSI had the promotional and distributional resources to properly sell Dungeons & Dragons on the computer. It marked the next phase in SSI’s long, multi-faceted relationship with Trip Hawkins and his company Electronic Arts. Barely a year removed from settling SSI’s lawsuit and less than a year removed from losing the big TSR contract to them, Electronic Arts bought into SSI to the tune of 20 percent in May of 1988, giving the smaller company some much-needed cash to spend on a big Dungeons & Dragons promotional effort. SSI also became one of Electronic Arts’s affiliated labels, thus solving the distribution problems. As previous tales told on this blog will attest, such deals with the titans of the industry could be dangerous territory for smaller publishers like SSI. But SSI did have advantages that most of the affiliated labels didn’t: in addition to the longstanding personal relationship enjoyed by Trip Hawkins and Joel Billings, the buy-in would give Electronic Arts a real stake in SSI’s success, making them much harder to gut and cast aside if they should disappoint.

Grognards to the end, Trip Hawkins and Joel Billings dress up as generals to celebrate their “strategic alliance” of May 1988.

SSI released the first title in all three branches of their new Dungeons & Dragons family tree in August of 1988, each on a different one of the several platforms it would eventually reach. Dungeon Masters Assistant Volume I: Encounters shipped on the Apple II. It would sell 26,212 copies across four platforms — not bad for such a specialized utility. Heroes of the Lance, an action game set in Dragonlance‘s world of Krynn that was developed and delivered as promised from Britain, shipped on the Atari ST. The first of what would come to be known as the “Silver Box” line of action-oriented Dungeons & Dragons games, it would sell an impressive 88,808 copies across four platforms, enough to easily qualify it as SSI’s all-time biggest seller.

Enough, that is, if it hadn’t been for Pool of Radiance, first of the “Gold Box” line of full-on Dungeons & Dragons CRPGs. Recognized as The Big One in the lineup right from the start, it didn’t disappoint. Beginning on the Commodore 64 and moving on to MS-DOS, the Apple II, the Macintosh, and the Amiga, its final sales total reached 264,536 copies in North America alone. By far the most successful release of SSI’s history as an independent company, it became exactly the transformative work that SSI (and Electronic Arts) had been banking on, a ticket to the big leagues if ever there was one. Even the Pool of Radiance clue book outsold any previous SSI game, to the tune of 68,395 copies.

Summer CES, June 1988. The big day draws near.

The second serious attempt of 1988 to adapt a set of tabletop-RPG rules to the computer, Pool of Radiance makes, like its contemporary Wasteland, an enlightening study in game design for that reason and others. Happily, it’s mostly worthy of its huge success; there’s a really compelling game in here, even if you sometimes have to fight a little harder than you ought to in order to tease it out. As a game, it’s more than worthy of an article in its own right. By way of concluding my little series on SSI and TSR and my bigger one on the landmark CRPGs of 1988, I’ll give it that article next time.

(Sources: As with all of my SSI articles, much of this one is drawn from the SSI archive at the Strong Museum of Play. Other sources include the Questbusters of March 1988, Computer Gaming World of March 1988 and July 1988, and Dragon of November 1987, May 1988, and July 1990. Also the book Designers and Dragons by Shannon Appelcline, and Matt Barton’s video interviews with Joel Billings.)

 
 


The 68000 Wars, Part 4: Rock Lobster

In the years following Jack Tramiel’s departure, Commodore suffered from a severe leadership deficit. The succession of men who came and went from the company’s executive suites with dizzying regularity often meant well, were often likable enough in their way. Yet they were also weak-willed men who offered only timid, conventional ideas whilst living in perpetual terror of the real boss of the show, Commodore’s dilettantish chairman of the board and interfering largest stockholder Irving Gould.

The exception that proves the rule of atrocious management is Thomas Rattigan, the man who during his brief tenure saved Commodore and in the process the Commodore Amiga from an early death. Rattigan wasn’t, mind you, a visionary; he never got the time to demonstrate such qualities even if he did happen to possess them. His wasn’t any great technical mind, nor was he an intrinsic fan of computers as an end unto themselves; in common with a rather distressing number of industry executives of the time, Rattigan, like Apple’s John Sculley a veteran of Pepsi Cola, seemed to take a perverse pride in his computer illiteracy, saying he “never got beyond the slide rule” and not even bothering to place a computer on his desk. He may not have even been a terribly nice guy; the thousands of employees he laid off, among them virtually the entire team that had once been Amiga, Incorporated, certainly aren’t likely to invite him to dinner anytime soon. No, Rattigan was simply competent, and carried along with that competence a certain courage of his own convictions. That was more than enough to make him stand out from his immediate predecessor and his many successors like the Beatles at a battle of the bands.

Thomas Rattigan

Rattigan was appointed President and Chief Operating Officer of Commodore International on December 2, 1985, and Chief Executive Officer on April 1, 1986, succeeding the feckless former steel executive Marshall Smith, whose own hapless tenure would serve as a blueprint for most of the Commodore leaders not named Rattigan who would follow. After replacing Tramiel in February of 1984, Smith had fiddled while Commodore burned, the company going from the billion-dollar face of home computing in North America to the business pages’ favorite source of schadenfreude, hemorrhaging money and living under the shadow of a gleeful deathwatch. The stock had dropped from almost $65 per share at the peak of Tramiel’s reign to less than $5 per share at the nadir of Smith’s. It was Rattigan, in one of his last acts before assuming the mantle of CEO as well as president, who negotiated the last-ditch $135 million loan package that gave Commodore — in other words, Rattigan himself — a lease on life of about one year to turn things around.

Some of the changes that Rattigan enacted to effect that turnaround were as inevitable as they were distressing: the waves of layoffs and cutbacks that had already begun under Smith’s reign continued for some time. Unlike Smith, however, Rattigan understood that he couldn’t cost-reduce Commodore back to profitability.

The methods that Rattigan used to implement triage on the profit side of the ledger sheet were unsexy but surprisingly effective. One was entry into the burgeoning market for commodity-priced PC clones, hardware that could be thrown together quickly using off-the-shelf components and sold at a reasonable profit. Commodore’s line of PC clones would never do much of anything in North America — the nameplate was too associated with cheap, chirpy home computers for any corporate purchasing manager to glance at it twice — but it did do quite well in Europe; in some European countries, especially West Germany, the Commodore brand remained as respectable as any other.

Rattigan’s other revenue-boosting move was even more simple and even more effective. Commodore’s engineers had been working on a new version of the 64. Dubbed internally the 64CR, for “Cost Reduced,” it was built around a redesigned circuit board that better integrated many of the chips and circuitry using the latest production processes, resulting in a substantial reduction in the cost of production. The chassis and case were also simplified — for example, to use only two instead of many different types of screws. While they were at it, Commodore dramatically changed the look of the machine and most of its common peripherals to match that of the newer Commodore 128, thus creating a uniform appearance across their 8-bit line. As Rattigan said, “I think you’ve got to give people the opportunity not to have a black monitor, a green CPU, and a red disk drive.”

Commodore 64C

All of which was very practical and commonsensical. Looking at the new machine, however, Rattigan saw an opportunity to do something Commodore had never done before: to raise its price, and thereby to recoup some desperately needed profit margin. This really was a revolutionary thought for Commodore. Ever since releasing their VIC-20 model that had created the home-computer segment in North America, Commodore had competed almost entirely on the basis of offering more machine for less money than the other folks — an approach that did much to create the low-rent image that would dog the brand for the rest of the company’s life. Commodore had always kept their profit margins razor thin in comparison to the rest of the industry, trusting that they would, as the old saying/joke goes, “make it up in volume.” Now, though, Rattigan realized that the 64 had much more than price alone going for it. Almost everyone buying a 64 in 1986 was motivated largely by the platform’s peerless selection of games. Most, he theorized, would be willing to pay a little more than what Commodore was currently charging to gain access to that library. Thus when Commodore announced the facelifted 64 — now rechristened simply the 64C for obvious reasons — they also announced a 20 percent bump in its wholesale price. To ease some of the pain, they would bundle with it something called GEOS, an independently developed graphically-oriented operating environment that claimed to turn the humble 64 into a mini-Macintosh. (It didn’t really, of course, but it was a noble, impressive effort for a machine with a 1 MHz 8-bit processor and 64 K of memory.) Anyone who’s been around manufacturing at all will understand just what a huge difference a price increase of that magnitude, combined with a substantial reduction in manufacturing cost, would mean to Commodore’s bottom line if customers did indeed prove willing to continue buying the new model in roughly the same numbers as the old. Thankfully, Rattigan’s instincts proved correct. The 64C picked up right where the older model had left off, a brisk — and vastly more profitable — seller.
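To make the arithmetic behind that claim concrete, here is a minimal back-of-the-envelope sketch in Python. The dollar figures are purely hypothetical placeholders (the article quotes no actual cost or price numbers), but they show how a 20 percent price bump stacked on top of a manufacturing-cost reduction can multiply a razor-thin per-unit margin several times over.

# Hypothetical illustration only; none of these dollar figures come from Commodore.
old_price = 100.0             # imagined wholesale price of the original 64
old_cost = 90.0               # imagined production cost, leaving a thin $10 margin

new_price = old_price * 1.20  # the 20 percent wholesale price increase on the 64C
new_cost = 80.0               # imagined savings from the cost-reduced board

old_margin = old_price - old_cost   # $10 per unit
new_margin = new_price - new_cost   # $40 per unit

print(f"margin multiplier: {new_margin / old_margin:.1f}x")  # prints 4.0x

Sell the new model in roughly the same volume as the old, and total profit scales by that same multiplier.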

Sometimes, then, the simplest fixes really are the most effective. Taken together with the cost-cutting, these two measures returned Commodore to modest profitability well before Rattigan’s one-year deadline expired. Entering 1987, the company looked to be in relatively good shape for the short term. Yet questions still swirled around its long-term future. If Commodore didn’t want to accept the depressing fate of becoming strictly a maker of PC clones for the European market, they needed a successful platform of their own that could become the successor to the 64, which was proving longer lived than anyone had ever predicted but couldn’t go on forever. That successor had to be the Amiga. And therein lay problems.

The Amiga was in a sadly moribund state by the beginning of 1987. The gala Lincoln Center debut was now eighteen months in the past, but it felt like an eternity. The excitement with which the press had first greeted the new machine had long since been replaced by narratives of failure and marketing ineptitude. Commodore had stopped production of the Amiga in mid-1986 after making just 140,000 machines, yet was still able to fill the trickle of new orders from warehouse stock. Sure, some pretty good games had been made for the Amiga, at least one of which was genuinely groundbreaking, but with numbers like those how long would that continue? Already Electronic Arts had quietly sidled away from their early declarations that together they and the Amiga would “revolutionize the home-computer industry,” turning their focus back to other, more plebeian platforms where they could actually sell enough games to make it worth their while. Ditto big players in business and productivity software like Borland, Ashton-Tate, and WordPerfect. The industry at large, it seemed, was just about ready to put a fork in Commodore’s erstwhile dream machine.

The Amiga’s most obvious failing was one of marketplace positioning. Really, just who was this machine for? There were two obvious markets: homes, where it would make the best games machine the world had yet seen; and the offices of creative professionals who could make use of its unprecedented multimedia capabilities. Yet the original Amiga model had managed to miss both targets in some fairly fundamental ways. Svelte and sexy as it was, it lacked the internal expansion slots and big power supply necessary to easily outfit it with the hard drives, memory expansions, accelerator cards, and genlocks demanded by the professionals. Meanwhile its price of almost $2000 for a reasonably complete, usable system was far too high for the home market that had so embraced the Commodore 64. Throw in horrid Commodore marketing that ignored both applications in favor of positioning the Amiga as some sort of challenger to the PC-clone business standard, and it was remarkable that the Amiga had sold as well as it had.

If there was a bright spot, it was that the Amiga’s obvious failing had an equally obvious solution: not one but two new models, each perfectly suited for — and, hopefully, marketed toward — one of its two logical customer bases. Rattigan, industry neophyte though he was, saw this reasoning as clearly as anyone, and pushed his engineers to deliver both new machines as quickly as possible. They were officially announced via a low-key, closed-door presentation to select members of the press at the January 1987 Winter Consumer Electronics Show. The two new models would entirely replace the original, which had always been officially called the Amiga 1000 but had seldom been referred to by that name heretofore. The Amiga 2000 would be the big, professional-level machine, with a full 1MB of memory standard — four times that of the 1000 — and all the slots and expansion possibilities a programmer, artist, or video-production specialist — or, for that matter, a game developer — could possibly want.

Amiga 2000

But it was the Amiga 500 that would become the most successful Amiga model ever released, as well as the heart of its legacy as a gaming platform. Designed primarily by George Robbins and Bob Welland at Commodore’s West Chester, Pennsylvania, headquarters — the slowly evaporating original Los Gatos Amiga team had little to do with either of the new models — the 500 was code-named “Rock Lobster” during development after the B-52’s song (reason enough to love it right there if you ask me). Key to the work was a re-engineering of Agnus, the most complex of the Amiga’s custom chips, to make it smaller, simpler, and cheaper to manufacture; the end result was known as “Fat Agnus.” That accomplished, Robbins and Welland managed to stuff the contents of the 1000’s case into an all-in-one design that looked like a bulbous, overgrown Commodore 128.

Amiga 500

The Amiga 500 wasn’t, especially in contrast to the 1000, going to win any beauty contests, but it got the job done. There was a disk drive built into the side of the case, a “trap door” underneath to easily increase memory from the standard 512 K to 1 MB, and an expansion port in lieu of the Amiga 2000’s slots that let the user add peripherals the old-fashioned Commodore way, by daisy-chaining them across the desktop. Best of all, a usable system could be had for around $1000, still a stratum or two above the likes of a 64 or 128 but nowhere near so out of the reach of the enthusiastic gamer or home hacker as had been the first Amiga. Compromised in some ways though it may have been from an engineering standpoint, enough to prompt a chorus of criticism from the old Los Gatos Amigans, the Amiga 500 was a brilliant little machine from a strategic standpoint, the smartest single move the post-Tramiel Commodore would ever make outside of electing to buy Amiga, Incorporated, in the first place.

But unfortunately, this was still Irving Gould’s Commodore, a company that seldom failed to follow every good decision with several bad ones. Amiga circles and the trade press at large were buzzing with anticipation for the not-yet-released new models, which were justifiably expected to change everything, when word hit the business press on April 23 that Thomas Rattigan had been unceremoniously fired. Like the firing of Jack Tramiel three years before when things were going so very well, it made and makes little sense. Gould would later say that Rattigan had been fired for “disobeying the chairman of the board” — i.e., him — and for “gross disregard of his duties,” but refused to get any more specific. Insiders muttered that Rattigan’s chief sin was that of being too good at his job, that the good press his decisions had been receiving had left Gould jealous. Just a couple of weeks before Rattigan’s firing, Commodore’s official magazine had published a lengthy interview with him, complete with his photo on the cover. To this Gould was reported to have taken grave exception. Yet Rattigan hardly comes across as a prima donna or a self-aggrandizer therein. On the contrary, he sounds serious, thoughtful, grounded, and very candid, explicitly rejecting the role of “media celebrity” enjoyed by Apple’s John Sculley, his former colleague at Pepsi: “When you have lost something in the range of $270 million in five quarters, I don’t think it’s time to be a media celebrity. I think it’s time to get back to your knitting and figure out how you’re going to get the company making money.” Nor does he overstate the extent of Commodore’s turnaround, much less take full credit for it, characterizing it as “tremendous improvement, but not an acceptable performance.” It seems hard to believe that Gould could be petty enough to object to such an interview as this one. But at least one more piece of circumstantial evidence exists that he did: Commodore Magazine‘s longtime editor Diane LeBold was forced out of the company on Rattigan’s heels, along with other real or perceived Rattigan loyalists. It made for one hell of a way to run a company.

True to his form as less of a pushover than Gould’s other executive lapdogs, Rattigan soon filed suit against Commodore for $9 million, for terminating his five-year employment contract four years early for no good reason. Commodore promptly counter-sued for $24 million, the whole ugly episode overshadowing the actual arrival of the Amiga 500 and 2000 in stores. After some five years of court battles, Rattigan would finally be awarded his $9 million — yes, every bit of it — just at a time when everything was starting to go sideways for Commodore and they could least afford to pay him.

With Rattigan now out of the picture — Gould had had him escorted off the campus by security guards, no less — Gould announced that he would be taking personal charge of day-to-day operations, a move that filled no one at the company other than his hand-picked circle of sycophants with any joy. But then, for Gould day-to-day oversight meant something different than it did for most people. He continued to live the lifestyle of the jet-setting super-rich, traveling the world — reportedly largely to dodge taxes — and conducting business, to whatever extent he did, by phone. Thus Commodore was not only under a cloud of rumor and gossip at the critical moment when the two new machines were being introduced, but they were also leaderless, their executive wings gutted and reeling from Gould’s purge and their ostensible new master off who knew where. There was, needless to say, not much in the way of concerted promotion or messaging as the months marched on toward Christmas 1987, the big test of the Amiga 500.

While it didn’t abjectly fail that test, it didn’t really skate through with honors either. On the one hand, Amigas were selling again, and in better numbers than ever before. The narrative of the Amiga as a flop that was soon to be an orphan began to fade, and companies like Electronic Arts began to return to the platform, if not always as a target for first-run games at least as a consistent target for ports. WordPerfect even ported their industry-standard word processor to the Amiga. But on the other hand, the Amiga certainly wasn’t going to become a household name like the 64 anytime soon at this rate. In addition to the nearly complete lack of Commodore advertising, distribution remained a huge problem. Many people who might have found the Amiga very interesting literally never knew it existed, never saw an advertisement and never saw it in a store. Jack Tramiel’s decision to dump the 64 into mass-market channels like Sears and Toys “R” Us had been a breaking of his own word and a flagrant betrayal of his loyal dealers from which Commodore’s reputation had never entirely recovered. Yet it had also been key to the machine’s success; the 64 was available absolutely everywhere during its heyday, an inescapable presence to tempt plenty of people who would never think to walk into a dedicated computer store. Now, though, having laboriously and with very mixed results struggled to rebuild the dealer network that Tramiel had demolished, Commodore refused to do the same with the Amiga 500, even after some of those dealers had started to whisper through back channels that, really, it might be okay to offer some 500s through the mass market in the name of increasing brand awareness and corralling some new users who would quite likely end up coming to them for further hardware, software, and support anyway. But it didn’t happen, not in 1987, 1988, or the bulk of 1989.

The Amiga thus came to occupy an odd position on the American computing scene of the late 1980s, not quite a failure but never quite a full-fledged success either. Always the bridesmaid, never the bride; the talented actor never quite able to find her breakout role; pick your metaphor. Commodore blundered along, going through more of Irving Gould’s sock-puppet executives in the process. Max Toy, unfortunately named in light of the image that Commodore was still trying to shake, took over in October of 1987, to be replaced by Harold Copperman in July of 1989. Meanwhile the two Amiga models settled fairly comfortably into their roles.

Video production became the 2000’s particular strong suit. Amigas were soon regular workhorses on television series like Amazing Stories, Max Headroom, Lingo; on films like Prince of Darkness, Not Quite Human, Into the Homeland; on lots of commercials. If most of this stuff wasn’t exactly the pinnacle of cinematic art, it was certainly more Hollywood work than any other consumer-grade PC was getting. More important, and more inspiring, were the 2000s that found homes in small local newsrooms, on cable-access shows, and in small one- or two-person video-production studios. Just as the Macintosh had helped to democratize the means of production on paper via desktop publishing, the Amiga was now doing the same for the medium of video, complete with a new buzzword for the age: “desktop video.”

The strong suit of the Amiga 500, of course, was games. At first blush, the Amiga might seem a hard sell to game publishers. Even in 1988, after the 500 and 2000 had had some time to turn things around for the platform, a hit Amiga game might sell only 20,000 copies; a major blockbuster by the platform’s terms, 50,000. The installed base still wasn’t big enough to support much bigger numbers than these. An only modestly successful MS-DOS game, by contrast, might sell 50,000 copies, while some titles had reportedly hit 500,000 copies on the Commodore 64 alone. Yet, despite the raw numbers, many publishers discovered that the Amiga carried with it a sort of halo effect. Everyone seriously into computer games knew which platform had the best graphics and sound, which platform had the best games, even if some were reluctant to admit it openly. Publishers found that an Amiga game down-ported to other platforms carried with it a certain cachet inherited from its original version. Cinemaware, the premier Amiga game developer and later publisher in North America, put the Amiga’s halo effect to particularly good commercial use. All of their big releases were born, bred, and released first on the Amiga. They found that it made good commercial sense to do so, even if they ultimately sold far more copies to MS-DOS and Commodore 64 owners. While it’s true that Cinemaware could never have survived if the Amiga had been the only platform for which they made games, neither could they have made a name for themselves in the first place if the Amiga versions of their games hadn’t existed. Some of the same triangulations held sway, albeit to a lesser extent, among other publishers.

All told, the last three years of the 1980s were, relatively speaking, the best the Amiga would ever enjoy in North America. By the end of that period, with the 64 at last fading into obsolescence, the Amiga could boast of being the number two platform, behind only MS-DOS, for computer games in North America — a distant second, granted, but second nevertheless — while Commodore stood as the number three maker of PCs in North America in terms of units sold, behind only IBM and Apple. And Commodore was actually making money for most of this period, which was by no means always such a sure thing in other periods. But perhaps more important than numbers and marketshare was the sense of optimism. Every month seemed to bring some breakthrough program or technology, while every Christmas brought the hope that this would be the one where the Amiga finally broke into the public consciousness in a big way. To continue to be an Amiga loyalist in later years would require one to embrace Murphy’s Law as a life’s creed if one didn’t want to be positively smothered under all of the constant disappointments and broken promises that could make the platform seem cursed by some malicious higher power. But in these early, innocent times everything still seemed so possible, if only there would come the right advertising campaign, the right change in management at Commodore, the right hardware improvements.

But, ah, Commodore’s management… there lay the rub, even during these good years. Amiga owners watched with concern and then alarm as Apple and the makers of MS-DOS machines alike steadily improved their offerings whilst Commodore did nothing. In 1987, Apple debuted the Macintosh II, their first color model, with a palette of millions of colors to the Amiga’s 4096 and a hot new 16 MHz 68020 CPU inside. Yes, it cost several times the price of even the professional-grade Amiga 2000, and yes, 68020 or no, the Amiga could still smoke it for many animation tasks thanks to its custom chips. But then, even Apple’s prices always came down over time, and everyone knew that their hardware would only continue to improve. That same year, IBM introduced their new PS/2 line, and with it the new VGA graphics standard with about 262,000 colors on offer. More caveats applied, as Amiga fans were all too quick to point out, but the fact remained that the competition was improving by leaps and bounds while Commodore remained wedded to the same core chipset that they had purchased back in 1984. The Amiga 1000 had been a generation ahead of anything else on the market at the time of its release, but, unfortunately, generations aren’t so long in the world of computers. Gould and his cronies seemed unconcerned about or, still more damningly, blissfully unaware of the competition that was beginning to match and surpass the Amiga in various ways. In 1989, IBM spent 10.9 percent of their gross revenue on R&D, Apple 6.7 percent. And Commodore? 1.7 percent. The one area where Commodore did rank among the biggest spenders in the industry was in executive compensation, particularly the salary of one Irving Gould.


For the 1989 Christmas season, Commodore launched what would prove to be their first and last major mainstream advertising campaign for the Amiga 500. The $20 million campaign featured television spots produced by no lesser lights than Steven Spielberg’s Amblin Entertainment and George Lucas’s Lucasfilm. The slogan was “Amiga: The Computer for the Creative Mind.” The most lavish of the spots featured cameos by a baffling grab bag of minor celebrities, including Tommy Lasorda, Tip O’Neill, the Pointer Sisters, Burt Bacharach, Little Richard, and astronauts Buzz Aldrin, Gordon Cooper, and Scott Carpenter. Commodore’s advertising agency announced confidently that 92 percent of Americans would see an Amiga commercial an average of twenty times during November and December. Commodore would even begin selling 500s through mass-market merchandisers at last, albeit in a limited way, going through Sears and Service Merchandise alone. The campaign was hyped in the Amiga press as a last all-out effort to make that ever-elusive big breakthrough in North America. Sure, it was something they really needed to have done back in 1987, when the 500 first debuted, but at least they were doing it now. That was something, right? Right? In the end, it proved a heartbreaker of the sort with which Amiga fans would grow all too familiar over the years to come: it had no appreciable effect whatsoever. And with that Commodore slipped out of the mainstream American consciousness along with the decade with which their computers would always be identified.

The next year the first of a new generation of unprecedentedly ambitious games arrived — games like Wing Commander, Ultima VI, Railroad Tycoon — that looked, sounded, and played better on MS-DOS machines than they did on Amigas, thanks to the ever-improving graphics cards, sound cards, and new 32-bit 80386 processors in those heretofore bland beige boxes. Cinemaware that same year released Wings, the last of their big Amiga showcases, and then quietly died. The Amiga’s halo effect was no more. Just like that, an era ended.

And yet… well, here’s where things get a little confusing. As the Amiga was drying up as a gaming platform in North America, it was in many ways just getting started in Europe, with most of the classics still to come. Let’s rewind and try to understand how this parallel history came to be.

Commodore had always been extremely strong in Europe, going all the way back to their days as a maker of calculators. Their first full-fledged computer, the PET, had been little more than a blip on the radar in North America in comparison to its competitors the Radio Shack TRS-80 and the Apple II, yet it had fostered a successful and respected line of business computers across the pond. Commodore’s most consistently strong markets then would also prove the strongest of their twilight years: Britain and, especially, West Germany. Both operations were granted much more autonomy than the North American operation, and were staffed by smart people who were much better at selling Commodore’s American computers than Commodore’s Americans were. Germans in particular developed a special affinity for the Commodore brand, one that was virtually free of the home-computer/business-computer dichotomy that Commodore twisted themselves into knots trying to navigate in the United States. In Germany a good home computer was simply a good home computer, and if the same company happened to offer a good business computer, well, that was fine too.

When Commodore’s European leadership looked to the new Amiga 500, they saw a machine sufficient to make the traditional videogame demographic of teenage boys, who were currently snatching up Commodore 64s and Sinclair Spectrums, positively salivate. They unapologetically marketed it on that basis. Knowing what their buyers really wanted the machine for, they quite early on took to bundling together special packages, usually just in time for Christmas, that combined a 500 with a few of the latest hot games. A particular home run was 1989’s so-called “Batman Pack,” which included the game based on the hit Batman movie, a fresh new arcade conversion called The New Zealand Story, the graphically stunning casual flight simulator F/A-18 Interceptor, and, since this was an Amiga after all, the platform’s signature application, Deluxe Paint II. Deluxe Paint aside, there was no talk of video production or productivity of any other stripe, no mention of the Amiga’s groundbreaking multitasking operating system, no navel-gazing discussions of the platform’s place in the multimedia zeitgeist. Teenage boys didn’t want any of that. What they wanted was great games with great graphics, and that’s exactly what Commodore’s European operations gave them. You were just buying a fun computer, a game machine, so you didn’t need to go through a dealer. From the beginning, the Amiga 500 was widely available at all of the glossy electronics stores on European High Streets. The West German operation went even further: they started selling Amigas through grocery stores. Buy an Amiga 500, hook it up to a television, pop in a disk, turn it on, and start playing.

The British and especially the Germans took to the Amiga 500 in numbers that Commodore’s North American executives could only dream of. By early 1988, Commodore could announce that they had sold 500,000 Amigas worldwide, a strong turnaround for a marque that had been all but left for dead a year before. A rather astonishing 200,000 of those machines, 40 percent of the total, had been sold in West Germany alone; Britain accounted for another 70,000. Even now, with the Christmas season behind them, Commodore West Germany was selling a steady 4000 Amiga 500s every week. A few months later came the simultaneously impressive and depressing news that the total market for Amiga hardware and software in West Germany (population 60 million) was now worth more than that for the United States (population 240 million). And where Germany led, the rest of Europe followed. Eighteen months later, with worldwide Amiga sales closing in on 1.5 million, it was the number one gaming computer in Europe, a position it would continue to enjoy for several years to come. Just as it was about to begin its fade from prominence as a game machine in the United States, in Europe the Amiga’s best years and best games were still in front of it, as bedroom coders learned to coax from the hardware performance of which its designers could hardly have conceived. The old Boing demo, once so stunning that crowds of trade-show attendees had peeked suspiciously under tables looking for the workstation-class machine that was really generating that animation, already looked singularly unimpressive. The story of the Amiga 500 in Europe was, in other words, the story of the Commodore 64 happening all over again. Commodore was now making the vast majority of their money in Europe, the North American operation a perpetual weak sister.

When journalists for the Amiga trade press in North America visited Europe, they were astounded. Here was the mainstream saturation that they had only been able to fantasize about back home. A report from a correspondent visiting a typical department store in Cologne must have read to American readers like a dispatch from Wonderland.

I came across a computerized book listing that was running on an Amiga 500. As I approached the computer department, I was greeted by a stack of Amiga 500s. I could not believe the assortment of Amiga titles on the book rack (hardcover ones, too!). I found two aisles full of Amiga software, consisting mostly of games. The Amiga selection was more than that of any other computer.

In a sense, it was just a reversion to the status quo. After all, prior to the introduction of the VIC-20 in 1981, Commodore’s income had been similarly unevenly distributed between the continents. Seen in this light, it’s the high times of the 1980s that are the anomaly, when American buyers flocked to the VIC-20 and the 64 and for a time made what had always been fundamentally a European brand — although, paradoxically, a European brand engineered and steered from the United States — into an intercontinental phenomenon. Not that that was of much comfort to the succession of executives who came and went from Irving Gould’s hotseat, fired one after another for their failure to make North American sales look as good as European sales.

But I did promise you 68000 Wars in the title of this article, didn’t I? So where, you might well be wondering, was the Amiga’s arch-rival the Atari ST in all this? Well, in North America it was a fairly negligible factor, although Atari would continue to sell their machine there almost as long as Commodore would the Amiga. The hype around the ST had dissipated quickly with the revelation in late 1986 that Atari really wasn’t selling anywhere near as many of them as Jack Tramiel liked to let on, and the Amiga 500, so obviously audiovisually superior and now much closer in price, soon proved a deadly foe indeed. The ST retained its small legion of loyal users: desktop publishers unwilling to splurge on a Macintosh, who took full advantage of its rock-solid monochrome high-resolution screen and Atari’s inexpensive laser printer, thereby truly making the ST live up to its old “Jackintosh” nickname; musicians, both amateur and professional, who loved its built-in MIDI port; programmers and hardware hackers who favored its simple, straightforward design over the Amiga’s more baroque approach; people who needed lots and lots of memory for one reason or another, on which terms Atari always offered the best deal in town (they released 2 MB and 4 MB ST models as early as 1987, when figures like that were all but inconceivable); inveterate Commodore haters and/or Atari lovers who bought it for the badge on its front. Still, there was little doubt which platform had won the 68000 Wars in North America. In the wake of the Amiga 500’s release, Atari began increasingly to turn to other ways of making money: buying the Federated chain of consumer-electronics stores; capitalizing on nostalgia for the glory days of the Atari VCS by continuing to sell both the old hardware and the cartridges to run on it; wresting away from Epyx a handheld gaming machine, the first of its kind, that was ironically designed by members of the old Amiga, Incorporated, team. When all else failed, there was always Jack Tramiel’s old hobby of lawsuits, of which he launched quite a few, most notably against the former owners of Federated for overstating their company’s value and against the new kid on the block in console gaming, Nintendo, for their alleged anti-competitive practices.

In Europe, the ST also came out second best to the Amiga, but the race was a much closer one for a while. Along with their love for all things Commodore, Germans found that they could also make room in their hearts for the Atari ST. It found a home in many German markets it never came close to cracking in the United States, being regarded as a perfectly respectable business computer there for quite some time. It also continued to do fairly well with gamers, thanks to Atari’s pricing strategies that always seemed to keep its low-end model just that little bit cheaper than the Amiga 500, enough to be a difference-maker for some buyers. When the Amiga became the biggest gaming computer in Europe, it was the Atari ST that slipped into the second spot. It would take the much more expensive MS-DOS machines some years yet to overtake these two 68000-based rivals. The economic chaos brought on by the reunification of West and East Germany, which caused many consumers there to tighten their wallets, only helped their cause, as did the millions of new price-conscious buyers who were suddenly scrambling for a piece of that Western computing action following the fall of the Iron Curtain.

The story of the Amiga, and to some extent also that of the ST, is often framed as a narrative of frustration, of brilliance that never got its due. There are some good reasons for that, but it can also be a myopic, America-centric view, ignoring as it does a veritable generation of youngsters on the other side of the Atlantic who grew up knowing these two platforms very well indeed. When I was writing my book about the Amiga, I couldn’t help but note the markedly different responses I got from friends in Europe and the United States when I told them about the project. Most Americans have no idea what an Amiga is (“Omega?”); most Europeans of a certain age remember it all too well, flashing me smiles redolent of nostalgia for afternoons spent before the television with their mates, when the summer seemed endless and the possibilities limitless. Instead of lamenting might-have-beens too much more, I expect to spend quite some articles over the next few years talking about the Amiga’s innovations and successes — and, yes, I’ll also have more to say about the Amiga’s perpetually overlooked little frenemy the Atari ST as well. Whether you grew up with one of these machines or you too aren’t quite sure yet what to make of this whole “Omega” thing, I hope you’ll stick around. Some amazing stuff is in store.

(Sources: Invaluable as always for these articles was Brian Bagnall’s book On the Edge: The Spectacular Rise and Fall of Commodore. I can’t wait for the better, longer version. The long-running “Roomers” column in Amazing Computing is my go-to source for a month-by-month chronology of developments on the Amiga scene, and the source of most of the nit-picky factoids in this article. The issues of Amazing used include: March 1987, June 1987, July 1987, August 1987, October 1987, November 1987, December 1987, February 1988, April 1988, May 1988, June 1988, July 1988, August 1988, September 1988, November 1988, December 1988, January 1989, February 1989, March 1989, April 1989, May 1989, June 1989, July 1989, August 1989, September 1989, October 1989, November 1989, December 1989, January 1990, April 1990, May 1990, June 1990, February 1991, March 1991, April 1991, May 1991, December 1991. Commodore Magazine‘s fateful interview with Thomas Rattigan appeared in the May 1987 issue. Other sources include Retro Gamer 39 and of course my own book The Future Was Here. Hey, it’s not every day a writer gets to cite himself…)

 

The Lurking Horror

Given the demographics of many readers of H.P. Lovecraft, not to mention players of the Call of Cthulhu RPG, it was inevitable that the Cthulhu Mythos would make it to the computer. The only real surprise is that it took all the way until 1987 for the first full-fledged digital work of Lovecraftian horror to appear. That it should have been Dave Lebling, among all the Imps of Infocom, who wrote said work is, on the other hand, no surprise. The most voracious and omnivorous reader of all in an office full of them, Lebling was also the only Imp with deep roots in the world of tabletop RPGs; he had to have been aware of Sandy Petersen’s game even if he had never played it.

Running neck and neck as he was with Steve Meretzky for the title of most prolific and recognizable Imp, Lebling was pretty much given carte blanche to choose his projects. Thus his rather vague proposal, for a “kind of H.P. Lovecraft game set at a kind of MIT-ish place,” was all that was needed to set the ball rolling. Not that, even discounting Lebling’s track record, there was a lot of risk in the proposition: horror, while relatively uncommon in adventure games to date, was a fictional genre with obvious appeal for the typical player, and Lovecraft was as good a point of entry as any. Indeed, the graphical adventure Uninvited, which had thrown a bit of Lovecraft into its blender along with lots of other hoary old horror tropes, was doing quite well commercially at the very instant that Lebling was making his proposal. Horror was a perfect growth market for adventure authors and players tired of fantasy, science fiction, and cozy mysteries.

The Lurking Horror‘s title inauspiciously harks back to “The Lurking Fear,” a story from Lovecraft’s Edgar Allan Poe-aping early years that’s not all that fondly regarded even by aficionados. “The tempo increases imperceptibly from sluggish to slow” over the course of the story, and “the awful crescendo of terror that we have been promised is more of an anticlimax,” writes Lovecraft biographer and critic Paul Roland. Ah, well… at least it has a great title, as well as a gloriously cheesy opening line that comes perilously close to “It was a dark and stormy night”: “There was thunder in the air on the night I went to the deserted mansion atop Tempest Mountain to find the lurking fear.”

The game casts you as a freshman at “GUE Tech,” a stand-in for MIT. It’s the end of the term, and your twenty-page paper on “modern analogues of Xenophon’s ‘Anabasis'” is due tomorrow. Lebling cleverly updates the classic Lovecraftian setup of a scholar coming upon a strange and foreboding document in an archive somewhere for the computer age. As you try to work on the paper inside the computer center, alone but for one occasionally helpful but usually infuriating hacker, you find that a strange file has replaced your own, a combination of “incomprehensible gibberish, latinate pseudowords, debased Hebrew and Arabic scripts, and an occasional disquieting phrase in English.” Your directory has somehow gotten mixed up with that of the “Department of Alchemy,” says the hacker. You’ll have to go down there to see if they can help you out. If you first help him out with a little problem of his own, he’s even kind enough to provide you with a key that will open most of the doors down there. And so you set off into the bowels of the university, deserted thanks to the blizzard raging outside on this dark winter night, all the while trying not to think about all the students that have been disappearing lately. Down there in the basements and steam tunnels you’ll encounter the full monty: a zombified janitor; a blood-encrusted sacrificial altar; hordes of rats running who knows where; an insane scientist trying to summon creatures from the beyond; lots of slime and general grossness; and, at last, the tentacled beastie at the heart of it all, who seems to be worming his way into the campus’s computer network to do… well, we’re never quite sure, but chances are it’s not good.

This last is The Lurking Horror‘s one really original contribution to Mythos lore, mixing it up with a bit of William Gibson-style cyberpunk; Neuromancer, another book Lebling had to have read, was the talk of science fiction at the time. The mash-up here anticipates a whole sub-genre (sub-sub-genre?) of stories, even if The Lurking Horror doesn’t do a whole lot with the premise beyond introducing it.

But then much the same thing could be said about the game’s relationship to Lovecraft in general. While most of the surface tropes are present and accounted for, most of the subtext of Lovecraft’s cosmic horror — humanity’s aloneness in a cold and unfeeling cosmos, the utter alienness of the Mythos that places it beyond our conceptions of good and evil, the sheer hopelessness of fighting powers so much greater than ourselves — is conspicuously absent. Likewise the actual creatures and gods of the Cthulhu Mythos; the only proper name from Lovecraft to be found here is that of the author himself, appearing as the name of a file on your computer by way of credit where it’s due. At the time that Lebling was writing the game, Arkham House was still emphatically claiming copyright to Lovecraft’s works, and companies like Chaosium who made use of the Mythos were paying licensing fees. Although Arkham’s claim would eventually prove dubious enough that Chaosium and others would drop the license and continue business as usual without it, it was likely copyright concerns that prompted Lebling not to name names. Unlike many computer games that would follow, The Lurking Horror also evinces no obvious debt to the Call of Cthulhu tabletop RPG beyond the bare fact that both are games that build on Lovecraft’s writings. It’s all enough to make me feel a little embarrassed about the two-article buildup I’ve given this game, afraid that this article might now come across like the mother of all anticlimaxes. I can only ask you to be patient, and to know that those last two articles will pay off in spades down the road, when we encounter games that dig much deeper into Mythos lore than this one does.

Even the language of The Lurking Horror doesn’t quite ever go all-in for Lovecraft in all his unhinged glory. While Lebling gets some credit for using “debased” in an extract I’ve already quoted, there’s not a single “blasphemous” or “eldritch” to be found. Part of the ironic problem here, if problem it be, is that Lebling is just too careful a writer — too good a writer? — to let his id run wild in a babble of feverish adjectives in that indelible H.P. Lovecraft way. Consider, for example, this scene, which finds you peering down through a manhole into a pit of horror.

>look in plate
You peer through the hole, shining your light into the stygian darkness below. The commotion below is growing louder, and suddenly you catch a glimpse of things moving in the pit. Without consciously realizing you have done it, you slam the panel shut, reeling away from the source of such images. Now you know what has been done with the missing students...

Lovecraft would doubtless describe this scene as “indescribable,” and then go nuts describing it. Lebling throws in a Lovecraftian “stygian,” but otherwise much more elegantly describes it as indescribable without having to resort to the actual word, and then… doesn’t describe it. His final line is more subtly chilling than anything Lovecraft ever wrote, a fine illustration of the value of a little restraint. Lebling, it seems, subscribes to the school of horror writing promoted by Edmund Wilson in his famous takedown of Lovecraft, which claims the very avoidance of the overwrought adjectives that Lovecraft loved so much to be key to any effective tale.

Perhaps of more concern than Lebling’s failings as a 1980s reincarnation of Lovecraft is the fact that The Lurking Horror, despite some effectively creepy scenes like the one above, ultimately isn’t all that scary. As I noted in my review of the simultaneously released Stationfall, I find that game, ostensibly another of Steve Meretzky’s easygoing science-fiction comedies, far more unnerving in its latter half than this game ever becomes. The default house voice of Infocom is a sly tone of gentle humor, an unwillingness to take it all too seriously. Just that tone creeps into a number of their more straight-laced works, this one among them, and rather cuts against the grain of the fiction. And in this game in particular one senses a conflict in Lebling that’s far from unique among writers following in Lovecraft’s wake: he wants to pay due homage to the man, but he’s also never quite able to take him seriously. At times The Lurking Horror reads more like a Lovecraft parody than homage, a line that is admittedly thin with a writer as ridiculous in so many ways as Lovecraft. Even more broadly, it sometimes feels like a parody of horror in general. The disembodied hand whom you can befriend, for instance, not only doesn’t feel remotely Lovecraftian but is actually a well-worn trope from about a million schlocky B-movies, played here as it often is there essentially for laughs. After striking an appropriately ominous note at the very end of the game, when an egg of the creature you’ve finally destroyed apparently spawns and flies off to begin causing more havoc, Lebling just can’t leave it at that. Instead he closes The Lurking Horror with a bit of macabre slapstick that’s more Tales From the Crypt than Call of Cthulhu.

>get stone
You pick up the stone. It has a long jagged crack that almost breaks it in half. As you pick it up, you feel it bump to one side. Then, as you are holding it in your hand, something pushes its way out through the crack, breaking the stone into two pieces. Something small, pale, and damp blinks its watery eyes at you. It hisses, gaining strength, and spreads membranous wings. It takes to the air, at first clumsily, then with increased assurance, and disappears into the gloom. One eerie cry drifts back to where you stand.

Something rises out of the mud, slowly straightening. The hacker, mud-covered and weak, staggers to his feet. "Can I have my key back?" he asks.

But the most important reason that The Lurking Horror doesn’t stick to its Lovecraftian guns is down to the other, perhaps even more interesting thing it also wants to be: a tribute to MIT, the university where Infocom was born and where Dave Lebling himself spent more than a decade hacking code, eating Chinese food, and exploring roofs and tunnels.

In choosing to look back with more than a hint of nostalgia rather than to gaze resolutely forward, The Lurking Horror was part of a general trend at Infocom during these latter years of the company’s history, part and parcel of the same phenomenon that saw Steve Meretzky bringing back Floyd at last for Stationfall and, after five years without a Zork, the Imps suddenly pulling out, twice in the space of less than a year, that old name that had made them who they were. By 1987, with sales far from what they once were and their new corporate overlords at Activision understandably concerned about that reality, a sneaking suspicion that they might be nearing the endgame must have been percolating through the ranks. Thus the desire to look back, to appreciate — and not without a little wistfulness — just where they’d been. Lebling himself, meanwhile, was fast closing in on forty, a time that brings a certain reflective state of mind if not a full-fledged crisis to many of us. Whatever else it is, The Lurking Horror is also a very personal game for Dave Lebling, by far the most personal he would ever write.

Since I’ve been writing this blog, I’ve found myself growing more and more skeptical of parser-based interactive fiction’s ability to handle elaborate plotting worthy of a novel or even a novella. The Infocom ideal that was printed on their boxes for all those years, of “waking up inside a story,” was, I’ve come to believe, always something of a lost cause. In compensation, however, I’ve come to be ever more impressed by how good the form is at evoking a sense of place. Despite the name we all chose to apply to our erstwhile text adventures long ago, which I’m certainly not going to try to change now, architecture or landscaping may provide better metaphors for what interactive “fiction” does best. (It’s for this reason, for the record, that I’ve long since backed away from trying to painstakingly define “ludic narrative,” and moved away from an exclusive focus on digital storytelling for this blog as a whole.)

Given all that, I’m particularly fascinated by games like this one that embrace that great — greatest? — strength of the medium by letting us explore a real place. For all of the interactive fiction made during Infocom’s heyday and since, that’s been done surprisingly little. Only three Infocom games, of which this is the second, attempt to recreate real or historical places. I find The Lurking Horror particularly interesting because the landscape of MIT that it chooses to show us is so personally meaningful to Lebling, turning it into a sort of architecture of memory as well as physical space. I really want to do this aspect of the game justice, and so I have something special planned for you for next week’s article: an in-game guided tour of GUE/MIT.

For now, though, I’ll just note that The Lurking Horror is a worthwhile game if also a somewhat schizophrenic one. The comedy cuts against the horror; the Lovecraft homage cuts against the MIT homage. There’s a lot that Lebling wants to do here, and the 128 K Z-Machine just isn’t quite enough to hold it all. It’s one of the few standard-sized Infocom games that I find myself wishing had been made for the roomier Interactive Fiction Plus format. Still, nothing that is here is really objectionable. The puzzles are uniformly well-done, even if, oddly given that this game came out so close on the heels of Hollywood Hijinx, some of them once again revolve around an elevator. (I suspect a bit of groupthink, not surprising given the collaborative nature of Hollywood Anderson’s game.) And the writing is fine, even if it does feel slightly strangled at times by the space limitations. The Lurking Horror feels a little like a missed opportunity, but it wouldn’t feel that way if what’s here — especially its recreation of MIT student life — wasn’t compelling already.

Infocom had high hopes for both Stationfall and The Lurking Horror, these two simultaneously released games of seemingly high commercial appeal written by their two most prolific and recognizable authors. The pair inspired the last really audacious promotional event in Infocom’s history — indeed, their most expensive and ambitious since the grand Suspect murder-mystery party of two-and-a-half years before. For the 1987 Summer Consumer Electronics Show in Chicago — yes, that era-capping CES again — they rented the Field Museum of Natural History for hundreds of guests, as they had each of the two previous years, and sprung for a local rock band to liven the place up. This time, however, they also hired the famed Second City comedy troupe, incubator of talents like Dan Aykroyd and John Belushi, to come in and perform improvisational comedy (“InfoProvisation”) based largely on Infocom games. From The Status Line‘s article on the event, complete with great 1980s pop-culture references:

Through a hilarious sequence of skits using very few props (a couple of chairs and a piano), the audience saw a computerized dating simulator, roared at a romance between a next-generation computer and a piece of has-been software, met Stationfall’s Floyd, visited GUE Tech, and even had the opportunity to affect the course of a scene or two.

In a tribute to the best-selling Leather Goddesses of Phobos, three vignettes, set in a singles bar and interspersed throughout the program, showed real-life versions of the three playing modes. Tame would have made Mother Teresa proud, but by the time they went from suggestive to lewd, it was enough to make Donna Rice blush.

Steve Meretzky (second from left) and Dave Lebling (second from right) ham it up with Second City.

Steve Meretzky and Dave Lebling even got to join the troupe onstage for a few of the skits. (This must have been a special thrill for Meretzky, who, judging by his love for Woody Allen and for performing in Infocom’s in-office productions, had a little of the frustrated comedian/actor in him, like his erstwhile writing partner Douglas Adams.)

But if the Second City gala harked back to the glory days of Infocom in some ways, the present was all too present in others. The new, cheap packaging was hard for fans to overlook, as was the fact that the principal feelie in The Lurking Horror, a packet of “rattlesnake eggs,” had nothing to do with the game. It looked like something that someone in marketing had just plucked off the discount rack at the local novelty shop — which was in fact largely what it was, as was proved when the final package came out with an equally inexplicable rubber centipede in place of the eggs; apparently it could be sourced even cheaper. The Second City event did get a write-up in newspapers all over the country thanks to being picked up by the Associated Press, but, alas, seems to have done little for actual sales of Stationfall and The Lurking Horror, neither of which reached 25,000 copies. For the regular CES attendees who, whether fans of Infocom’s games or not, had grown to love their parties, this final blowout and its underwhelming aftermath were just one more way in which that Summer 1987 edition of the trade show marked the end of an era.

Infocom, however, still wasn’t quite done with The Lurking Horror. A few months after all of the Chicago hoopla, a new version of the game, released only for the Commodore Amiga, reached stores. This one sported digitized sound effects to accompany some of its most exciting moments, a first for Infocom and the first sign of an interest in technical experimentation — not to say gimmickry — that would increasingly mark their last couple of years as a going concern. In this case the innovation came directly from an Activision that was very motivated to find ways to spruce up Infocom’s product line. But, unlike so many of Activision’s suggestions, this one Infocom actually greeted with a fair amount of enthusiasm.

It all began with a creative and innovative programmer named Russell Lieblich, who had come to Activision after spending some time at Peter Langston’s idealistic original incarnation of Lucasfilm Games. During the Jim Levy era Lieblich had been allowed to indulge his artistic muse at Activision, resulting in the interesting if not terribly playable commercial flops Web Dimension and Master of the Lamps. That sort of thing wasn’t going to fly in the new Bruce Davis era, so Lieblich, a talented musician as well as programmer, retrenched to concentrate on the technical aspects of computer audio, a field where he would spend much of his long career in games still to come. Of most relevance to Infocom was the system he developed for playing back digitized sounds recorded from the real world. Infocom had a playtester play through The Lurking Horror again, making a list of every place where he could imagine a sound effect. Lebling and others then pruned the list to those places where they felt sound would be most effective, and sent the whole thing off to Lieblich to hack into the Amiga version of the Z-Machine interpreter. At least a few other machines were theoretically capable of playing short digitized sounds of reasonable fidelity as well — the Apple Macintosh and IIGS and the Atari ST would have made excellent candidates — but sound was only added to the Amiga version, an indication of just what an afterthought the whole project really was.

As afterthoughts go, it’s not bad, although the fidelity of the sounds isn’t particularly high even by the standards of other Amiga games of the day. I doubt you’d be able to recognize “the squeal of a rat,” “the creak of an opening hatch,” or “the distinctive ‘thunk’ of an axe biting into flesh” — that’s how The Status Line describes some of the sounds — for what they’re supposed to be if you didn’t have the game in front of you telling you what’s happening. Still, they are creepy in an abstract sort of way, and certainly startling when they play out of the blue. While hardly essential, they do add a little something if you’re willing to jump through a few hoops to get them working on a modern interpreter. Whether the addition of a handful of sound effects was enough to make Amiga owners, madly in love with their computers’ state-of-the-art audiovisual capabilities, consider buying an all-text game was of course another matter entirely.

Next week we’ll put Lovecraft to bed for a while (doubtless dreaming one of his terrible dreams of “night-gaunts”), but will take a deeper dive into the other part of The Lurking Horror‘s split personality, its nostalgic tribute to MIT and student life therein. If you haven’t played The Lurking Horror yet, or if you have but it’s been a while, you may want to wait until then to join me on a guided tour that I think you’ll enjoy.

(Sources: As usual with my Infocom articles, much of this one is drawn from the full Get Lamp interview archives which Jason Scott so kindly shared with me. Thanks again, Jason! Other sources include: the book Game Design Theory and Practice by Richard Rouse III; The Status Line of Summer 1987, Fall 1987, Winter 1987, and Winter/Spring 1988.)

 
 
