Alone in the Dark

Most videogame stories are power fantasies. You spend your time getting ever stronger, ever tougher, ever more formidable as you accumulate experience points, gold, and equipment. Obstacles aren’t things to go around; they’re things you go through. If you can’t get past any given monster, the solution is to go kill some other monsters, then come back when you’re yet more powerful and slay the big beast at last. Life, these games tell us, is or ought to be one unadulterated ride up the escalator of success; a setback just means you haven’t yet risen high enough.

That dynamic held true in 1992 just as much as it usually does today. But during that year there came a well-nigh revolutionary game out of France that upended all of these traditional notions about what the medium of videogames can do and be. It cast you as a painfully ordinary, near-powerless individual adrift in a scary world, with no surefire panaceas in the form of experience points, gold, or portable rocket launchers to look forward to. It was just you and your wits, trapped in a haunted house full of creatures that were stronger than you and badly wanted to kill you. Despite its supernatural elements, this game’s scenario felt more disconcertingly close to real life than that of any of those other games. Here, you truly were alone in the dark. Aren’t we all from time to time?


Any story of how this shockingly innovative game came to be must begin with that of Frédérick Raynal, its mastermind. Born in the south-central French town of Brive-la-Gaillarde in 1966, Raynal was part of the first generation of European youths to have access to personal computers. In fact, right from the time his father first came home with a Sinclair ZX81, he was obsessed with them. He was also lucky: in a dream scenario for any budding hacker, his almost equally obsessed father soon added computers to the product line of the little videocassette-rental shop he owned, thus giving his son access to a wide variety of hardware. Raynal worked at the store during the day, renting out movies and watching them to kill the time — he was a particular fan of horror movies, a fact which would soon have a direct impact on his career — and helping customers with their computer problems. Then, with a nerdy young man’s total obliviousness to proportion, he hacked away most of the night on one or another of the machines he brought home with him. He programmed his very first released game, a platformer called Robix, in 1986 on an obscure home-grown French computer called the Exelvision which his father sold at the store. His father agreed to sell his son’s Exelvision game there as well, managing to shift about 80 units to customers desperate for software for the short-lived machine.

Raynal’s lifestyle was becoming so unbalanced that his family was beginning to worry about him. One day, he ran out of his room in a panic, telling them that all of the color had bled out of his vision. His mother bustled him off to an ophthalmologist, who told him he appeared to have disrupted the photoreceptors in his eyes by staring so long at a monitor screen. Thankfully, the condition persisted only a few hours. But then there came a day when he suddenly couldn’t understand anything that was said to him; he had apparently become so attuned to the language of computer code that he could no longer communicate with humans. That worrisome condition lasted several weeks.

Thus just about everyone around him took it as a good thing on the whole when he was called up for military service in 1988. Just before leaving, Raynal released his second game, this time for MS-DOS machines. Not knowing what else to do with it, he simply posted it online for free. Popcorn was a Breakout clone with many added bells and whistles, the latest entry in a sub-genre which was enjoying new popularity following the recent success of the Taito arcade game Arkanoid and its many ports to home computers and consoles. Raynal’s game could hold its head high in a crowded field, especially given its non-existent price tag. One magazine pronounced it one of the five best arcade games available for MS-DOS, whether commercial or free, and awarded it 21 points on a scale of 20.

Raynal was soon receiving letters at his military posting from all over the world. “Popcorn has made my life hell!” complained one player good-naturedly. Another wrote that “I caught acute Popcornitus. And, it being contagious, now my wife has it as well.” When Raynal completed his service in the summer of 1989, his reputation as the creator of Popcorn preceded him. Most of the companies in the French games industry were eager to offer him a job. His days working at his father’s computer store, it seemed, were behind him. The Lyon-based Infogrames, the most prominent French publisher of all, won the Raynal sweepstakes largely by virtue of its proximity to his hometown.

Yet Raynal quickly realized that the company he had elected to join was in a rather perilous state. An ambitious expansion into many European markets hadn’t paid off; in fact, it had very nearly bankrupted them. Bruno Bonnell, Infogrames’s co-founder and current chief executive, had almost sold the company to the American publisher Epyx, but that deal had fallen through as soon as the latter had gotten their first good look at the state of his books. It seemed that Infogrames would have to dig themselves out of the hole they’d made. Thus Bonnell had slashed costs and shed subsidiaries ruthlessly just to stay alive. Now, having staunched the worst of the bleeding, he knew that he needed as many talented programmers as he could get in order to rebuild his company — especially programmers like Raynal, who weren’t terribly assertive and were naive enough to work cheap. So, Raynal was hired as a programmer of ports, an unglamorous job but an absolutely essential one in a European market that had not yet consolidated around a single computer platform.

Bonnell, for his part, was the polar opposite of the shy computer obsessive he had just hired; he had a huge personality which put its stamp on every aspect of life at Infogrames. He believed his creativity to be the equal of anyone who worked for him, and wasn’t shy about tossing his staff ideas for games. He called one of them, which he first proposed when Raynal had been on the job for about a year, In the Dark. A typically high-concept French idea, its title was meant to be taken literally. The player would wander through a pitch-dark environment, striking the occasional match from her limited supply, but otherwise relying entirely on sound cues for navigation. Bonnell and Raynal were far from bosom buddies, then or ever, but this idea struck a chord with the young programmer.

As Raynal saw it, the question that would make or break the idea was that of how to represent a contiguous environment with enough verisimilitude to give the player an embodied sense of really being there in the dark. Clearly, a conventional adventure-game presentation, with its pixel graphics and static views, wouldn’t do. Only one approach could get the job done: 3D polygonal graphics. Not coincidentally, 3D was much on Raynal’s mind when he took up Bonnell’s idea; he’d been spending his days of late porting an abstract 3D puzzle game known as Continuum from the Atari ST to MS-DOS.

I’ve had occasion to discuss the advantages and disadvantages of this burgeoning new approach to game-making in previous articles, so I won’t rehash that material here. Suffice to say that the interest so many European programmers had in 3D reflected not least a disparity in the computing resources available to them in comparison to their American counterparts. American companies in this period were employing larger and larger teams, who were filling handfuls of floppy disks — and soon CD-ROMs — with beautiful hand-drawn art and even digitized snippets of real-world video. European companies had nothing like the resources to compete with the Americans on those terms. But procedurally-generated 3D graphics offered a viable alternative. At this stage in the evolution of computer technology, they couldn’t possibly be as impressively photorealistic as hand-drawn pixel art or full-motion video, but they could offer far more flexible, interactive, immersive environments, with — especially when paired with a French eye for aesthetics — a certain more abstracted allure of their own.

This, then, was the road Raynal now started down. It was a tall order for a single programmer. Not only was he trying to create a functional 3D engine from scratch, but the realities of the European market demanded that he make it run on an 80286-class machine, hardware the Americans by now saw as outdated. Even Bonnell seemed to have no confidence in Raynal’s ability to bring his brainstorm to fruition. He allowed Raynal to work on it only on nights and weekends, demanding that he spend his days porting SimCity to the Commodore CDTV.

An artist named Didier Chanfray was the closest thing to a partner and confidant which Raynal had at Infogrames during his first year of working on the engine. It was Chanfray who provided the rudimentary graphics used to test it. And it was also Chanfray who, in September of 1991, saw the full engine in action for the first time. A character roamed freely around a room under the control of Raynal, able to turn about and bend his body and limbs at least semi-realistically. The scene could be viewed from several angles, and it could be lit — or not — by whatever light sources Raynal elected to place in the room. Even shadows appeared; that of the character rippled eerily over the furniture in the room as he moved from place to place. Chanfray had never seen anything like it. He fairly danced around Raynal’s desk, pronouncing it a miracle, magic, alchemy.

In the meantime, Bruno Bonnell had negotiated and signed a new licensing deal — not exactly a blockbuster, but something commensurate with a rebuilding Infogrames’s circumstances.


Something tentacled and other-worldly, it seems, got into the water at Infogrames from the start: Didier Chanfray provided this very Lovecraftian concept drawing for Raynal’s game long before the conscious decision was made to turn it into a Lovecraft pastiche. Raynal kept the sketch tacked on the wall beside his desk throughout the project as a reminder of the atmosphere he was going for.

The American horror writer H.P. Lovecraft, who died in 1937, well before the advent of the computer age, was nowhere near as well-known in 1991 as he is today, but his so-called “Cthulhu Mythos” of extra-dimensional alien beings, terrifying by virtue of their sheer indifference to humanity and its petty morality, had already made appearances in games. The very first work of ludic Lovecraftia would appear to be the 1979 computer game Kadath, an odd sort of parser-less text adventure. Two years later, at the height of the American tabletop-RPG craze, a small company called Chaosium published Call of Cthulhu, a game which subverted the power fantasy of tabletop Dungeons & Dragons in much the same way that Raynal’s project would soon be subverting that of so many computer games. Still, although Call of Cthulhu was well-supported by Chaosium and remained reasonably popular by the standards of its niche industry throughout the 1980s and beyond, its success didn’t lead to any Lovecraftian onslaught in the realm of digital games. The most notable early example of the breed is Infocom’s very effective 1987 interactive fiction The Lurking Horror. But, being all text at a time when text adventures were becoming hard sells, it didn’t make much commercial impact.

Now, though, Bonnell believed the time had come for a more up-to-date Lovecraftian computer game; he believed such a thing could do well, both in France and elsewhere.

Lovecraft had long had a strong following in France. From the moment his books were first translated into the language in 1954, they had sold in considerable numbers. Indeed, in 1991 H.P. Lovecraft was about as popular in France as he was anywhere — arguably more popular on a per-capita basis than in his native land. The game of Call of Cthulhu too had long since been translated into French, giving a potential digital implementation of it as much natural appeal there as in its homeland. So, Bonnell approached Chaosium about licensing their Call of Cthulhu rules for computers, and the American company agreed.

When viewed retrospectively, it seems a confusing deal to have made, one that really wasn’t necessary for what Infogrames would ultimately choose to do with Lovecraft. When Lovecraft died in obscurity and poverty, he left his literary estate in such a shambles that no one has ever definitively sorted out its confusing tangle of copyright claimants; his writing has been for all intents and purposes in the public domain ever since his death, despite numerous parties making claims to the contrary. Prior to publishing their Lovecraft tabletop RPG, Chaosium had nevertheless negotiated a deal with Arkham House, the publisher that has long been the most strident of Lovecraft’s copyright claimants. With that deal secured, Chaosium had promptly trademarked certain catchphrases, including “Call of Cthulhu” itself, in the context of games. Yet as it turned out Infogrames would use none of them; nor would they draw any plots directly from any of Lovecraft’s published stories. Like the countless makers of Lovecraftian games and stories that would follow them, they would instead draw from the author’s spirit and style of horror, whilst including just a few of his more indelible props, such as the forbidden book of occult lore known as the Necronomicon.

The first Lovecraftian game Infogrames would make would, of course, be the very game that Frédérick Raynal had now spent the last year or so prototyping during his free time. By the time news of his work reached Bonnell, most of Infogrames’s staff were already talking about it like the second coming. While the idea that had inspired it had been wonderfully innovative, it seemed absurd even to the original source of said idea to devote the best 3D engine anyone had ever seen to a game that literally wouldn’t let you see what it could do most of the time. It made perfect sense, on the other hand, to apply its creepy visual aesthetic to the Lovecraft license. The sense of dread and near-powerlessness that was so consciously designed into the tabletop RPG seemed a natural space for the computer game as well to occupy. It was true that it would have to be Call of Cthulhu in concept only: the kinetic, embodied, real-time engine Raynal had created wasn’t suitable for the turn-based rules of the tabletop RPG. For that matter, Raynal didn’t even like the Chaosium game all that much; he considered it too complicated to be fun.

Still, Bonnell, who couldn’t fail to recognize the potential of Raynal’s project, put whatever resources he could spare from his still-rebuilding company at the mild-mannered programmer’s disposal: four more artists to join Chanfray, a sound designer, and a second programmer who doubled as project manager. When the team’s first attempts at writing an authentic-feeling Lovecraftian scenario proved hopelessly inadequate, Bonnell hired for the task Hubert Chardot, a screenwriter from 20th Century Fox’s French division, a fellow who loved Lovecraft so much that he had turned his first trip to the United States into a tour of his dead hero’s New England haunts. One of Chardot’s first suggestions was to add the word “alone” to the title of the game. He pointed out, correctly, that it would convey the sense of existential loneliness that was such an integral part of Lovecraftian horror — even, one might say, the very thing that sets it apart from more conventional takes on horror.

You can choose to enter the mansion as either of two characters.

The game takes place in the 1920s, the era of Lovecraft himself and of most of his stories (and thus the default era as well for Chaosium’s Call of Cthulhu game). It begins as you arrive in the deserted Louisiana mansion known as Derceto, whose owner Jeremy Hartwood has recently hanged himself. You play either as Edward Carnby, a relic hunter on the trail of a valuable piano owned by the deceased, or as Emily Hartwood, the deceased’s niece, eager to clear up the strange rumors that have dogged her uncle’s reputation and to figure out what really went down on his final night of life. The direction in which the investigation leads you will surprise no one familiar with Lovecraft’s oeuvre or Chaosium’s RPG: occult practices, forbidden books, “things man was never meant to know,” etc. But, even as Chardot’s script treads over this ground that was well-worn already in the early 1990s, it does so with considerable flair, slowly revealing its horrifying backstory via the books and journals you find hidden about the mansion as you explore. (There is no in-game dialog and no real foreground story whatsoever, only monsters and traps to defeat or avoid.) Like most ludic adaptations of Lovecraft, the game differs markedly from its source material only in that there is a victory state; the protagonist isn’t absolutely guaranteed to die or become a gibbering lunatic at the end.

One of the in-game journals, which nails the spirit and style of Lovecraft perfectly. As I noted in an earlier article about the writer, the emotion he does better than any other is disgust.

Yet Chaosium wasn’t at all pleased when Infogrames sent them an early build of the game for their stamp of approval. It seems that the American company had believed they were licensing not just their trademarks to their French colleagues, nor even the idea of a Lovecraft game in the abstract, but rather the actual Call of Cthulhu rules, which they had expected to see faithfully implemented. And, indeed, this may have been Bonnell’s intention when he was making the deal — until Raynal’s 3D engine had changed everything. Chaosium, who had evidently been looking forward to an equivalent of sorts to the Gold Box line of licensed Dungeons & Dragons CRPGs, felt betrayed. After some tense negotiation, they agreed to let Alone in the Dark continue without the Call of Cthulhu name on the box; some editions would include a note saying the game had been “inspired by the works of H.P. Lovecraft,” while others wouldn’t even go that far. In return for Chaosium’s largess on this front, Infogrames agreed to make a more conventional adventure game that would make explicit use of the Call of Cthulhu trademarks.

Call of Cthulhu: Shadow of the Comet, the fruit of that negotiation, would prove a serviceable game, albeit one that still didn’t make much direct use of the tabletop rules. But, whatever its merits, it would come and go without leaving much of a mark on an industry filled to bursting with graphical adventures much like it in terms of implementation. Alone in the Dark, on the other hand, would soon be taking the world by storm — and Chaosium could have had their name on it, a form of advertisement which could hardly have failed to increase their commercial profile dramatically. Chalk it up as just one more poor decision in the life of a company that had a strange talent for surviving — Chaosium is still around to this day — without ever quite managing to become really successful.

Infogrames got their first preview of just what an impact Alone in the Dark was poised to make in the spring of 1992, when Dany Boolauck, a journalist from the French videogame magazine Tilt, arrived to write a rather typical industry puff piece, a set of capsule previews of some of the company’s current works-in-progress. He never got any further than Alone in the Dark. After just a few minutes with it, he declared it “the best game of the last five years!” and asked for permission to turn the capsule blurb about it into a feature-length article, complete with a fawning interview with Raynal. (He described him in thoroughly overwrought terms: as a reincarnation of The Little Prince from Antoine de Saint-Exupéry’s beloved novella of the same name.) In a “review” published in the summer of 1992, still a couple of months before Infogrames anticipated releasing the game, he gave it 19 of 20 stars, gushing over its “exceptional staging” and “almost perfect character movement,” calling it “a revolution in the field of play” that “people must buy!”

Bruno Bonnell was pleased with the positive press coverage, but less thrilled by Boolauck’s portrayal of Raynal as the game’s genius auteur. He called in his introverted young programmer, who seemed a bit befuddled by all the attention, and told him to scrub the words “a Frédérick Raynal creation” from the end credits. Alone in the Dark, he said, was an Infogrames creation, full stop. Raynal agreed, but a grievance began to fester in his heart.

Thanks to Bonnell’s policy of not advertising the individuals behind Infogrames’s games, Raynal’s name didn’t spread quite so far and wide as that of such other celebrated gaming auteurs as Éric Chahi, the mastermind of Another World, France’s standout game from the previous year. Nevertheless, upon its European release in September of 1992, Raynal’s game stood out on its own terms as something special — as an artistic creation that was not just fun or scary but important to its medium. As one would expect, the buzz started in France. “We review many games,” wrote one magazine there. “Some are terrible, some mediocre, some excellent. And occasionally there comes along the game that will revolutionize the world of microcomputers, one that causes sleepless nights, one that you cannot tear yourself away from, can only marvel at. We bid welcome now to the latest member of this exclusive club: Alone in the Dark.” By the end of 1992, the game was a hit not only in France but across most of Europe. Now for America.

Bonnell closed a deal with the American publisher Interplay for distribution of the game there. Interplay had also published Another World, which had turned into a big success Stateside, and the company’s head Brian Fargo was sure he saw similar potential in Alone in the Dark. He thus put the game through his company’s internal testing wringer, just as he had Another World; the French studios had their strengths, but such detail work didn’t tend to be among them. Raynal’s game became a much cleaner, much more polished experience thanks to Interplay’s QA team. Yet Bonnell still had big international ambitions for Infogrames, and he wasn’t willing to let such a remarkable game as this one share with Another World the fate of becoming known to American players simply as an Interplay title. Instead he convinced Fargo to accept a unique arrangement. Interplay and Infogrames each took a stake in a new shared American subsidiary known as I-Motion, under which imprint they published Alone in the Dark.

The game took North America by storm in early 1993, just as it had Europe a few months earlier. It was that rarest of things in games, a genuine paradigm shift; no one had ever seen one that played quite like this. Worldwide, it sold at least 400,000 copies, putting Infogrames on the map in the United States and other non-European countries in the process. Indeed, amidst the international avalanche of praise and punditry, perhaps the most gratifying press notice of all reached Frédérick Raynal’s ears from all the way off in Japan. Shigeru Miyamoto, the designer of Super Mario Bros. and many other iconic Nintendo classics, proclaimed Alone in the Dark to be, more so than any other game, the one he wished he could have come up with.


Arguably the creepiest visual in the game is the weird mannequin’s head of your own character. Its crudely painted expression rather smacks of Chucky the doll from the Child’s Play horror films.

Seen from the perspective of a modern player, however, the verdict on Alone in the Dark must be more mixed. Some historically important games transcend that status to remain vital experiences even today, still every bit as fun and playable as the day they were made. But others — and please forgive me the hoary old reviewer’s cliché! — haven’t aged as well. This game, alas, belongs to the latter category.

Today, in an era when 3D graphics have long since ceased to impress us simply for existing at all, those of Alone in the Dark are pretty painful to look at, all jagged pixels sticking out everywhere from grotesquely octagonal creatures. Textures simply don’t exist, leaving everything to be rendered out of broad swatches of single colors. And the engine isn’t even holistically 3D: the 3D characters move across pasted-on pre-rendered backgrounds, which looks decidedly awkward in many situations. (On the other hand, it could have been worse: Raynal first tried to build the backgrounds out of digitized photographs of a real spooky mansion, a truly unholy union that he finally had to give up on.) Needless to say, a comparison with the lovingly hand-drawn pixel art in the adventure games being put out by companies like LucasArts and Sierra during this period does the crude graphics found here no favors whatsoever. Some of the visuals verge on the unintentionally comical; one of the first monsters you meet was evidently meant to be a fierce dragon-like creature, but actually looks more like a sort of carnivorous chicken. (Shades of the dragon ducks from the original Atari Adventure…)

Dead again! Killed by… Prince during his Purple Rain period?

Then, too, the keyboard-only controls are clunky and unintuitive, and they aren’t made any less awkward by a fixed camera that’s constantly shifting about to new arbitrary locations as you move through the environment; some individual rooms have as many as nine separate camera angles. This is confusing as all get-out when you’re just trying to get a sense of the space, and quickly becomes infuriating when you’re being chased by a monster and really, really don’t have time to stop and adjust your thinking to a new perspective.

The more abstract design choices also leave something to be desired. Sudden deaths abound. The very first room of the game kills you when you step on a certain floorboard, and every book is either a source of backstory and clues or an instant game-ender; the only way to know which it is, is to save your game and open it. Some of the puzzles are clever, some less so, but even those that are otherwise worthy too often depend on you standing in just the right position; if you aren’t, you get no feedback whatsoever on what you’re doing wrong, and are thus likely to go off on some other track entirely, never realizing how close you were to the solution. This fiddliness and lack of attention to the else in the “if, then, else” dynamic of puzzle design is a clear sign of a game that never got sufficiently tested for playability and solubility. At times, the game’s uncommunicativeness verges on the passive-aggressive. You’ll quickly grow to loathe the weirdly stilted message, “There is a mechanism which can be triggered here,” which the game is constantly spitting out at you as you gaze upon the latest pixelated whatsit. Is it a button? A knob? A keyhole? Who knows… in the end, the only viable course of action is to try every object in your inventory on it, then go back and start trying all the other objects you had to leave lying around the house thanks to your character’s rather brutal inventory limit.

Fighting is a strange, bloodless pantomime.

Yes, one might be able to write some of the game’s issues off as an aesthetic choice — as merely more ways to make the environment feel unsettling. Franck de Girolami, the second programmer on the development team as well as its project leader, has acknowledged using the disorienting camera consciously for just that purpose: “We realized that the camera angles in which the player was the most helpless were the best to bring in a monster. Players would instantly run for a view in which they felt comfortable.” While one does have to admire the team’s absolute commitment to the core concept of the game, the line between aesthetic choice and poor implementation is, at best, blurred in cases like this one.

And yet the fact remains that it was almost entirely thanks to that same commitment to its core concept that Alone in the Dark became one of the most important games of its era. Not a patch on a contemporary like Ultima Underworld as a demonstration of the full power and flexibility of 3D graphics — to be fair, it ran on an 80286 processor with just 640 K of memory while its texture-mapped, fully 3D rival demanded at least an 80386 with 2 MB — it remained conceptually unlike anything that had come before in daring to cast you as an ordinary mortal, weak and scared and alone, for whom any aspirations toward glory quickly turn into nothing more than a desperate desire to just escape the mansion. For all that it threw the Call of Cthulhu rules completely overboard, it retained this most fundamental aspect of its inspiration, bringing Chaosium’s greatest innovation to a digital medium for the first time. It’s not always impossible to kill the monsters in Alone in the Dark — often it’s even necessary to do so — but, with weapons and ammunition scarce and your health bar all too short, doing so never fails to feel like the literal death struggle it ought to. When you do win a fight, you feel more relieved than triumphant. And you’re always left with that nagging doubt in the back of the mind as you count your depleted ammo and drag your battered self toward the next room: was it worth it?


The legacy of this brave and important game is as rich as that of any that was released in its year, running along at least three separate tracks. We’ll begin with the subsequent career of Frédérick Raynal, its original mastermind.

The seeds of that career were actually planted a couple of weeks before the release of Alone in the Dark, when Raynal and others from Infogrames brought a late build of it to the European Computer Trade Show in London. There he met the journalist Dany Boolauck once again, learning in the process that Boolauck had switched gigs: he had left his magazine and now worked for Delphine Software, one of Infogrames’s French competitors. Delphine had recently lost the services of their biggest star: Éric Chahi, the auteur behind the international hit Another World. As his first assignment in his own new job, Boolauck had been given the task of replacing Chahi with a similarly towering talent. Raynal struck him as the perfect choice; he rather resembled Chahi in many respects, what with his very French aesthetic sensibility, his undeniable technical gifts, and his obsessive commitment to his work. Boolauck called in Paul de Senneville, the well-known composer who had launched Delphine Software as a spinoff from his record label of the same name, to add his dulcet voice to the mix. “We wish to place you in a setting where you will be able to create, where you will not be bullied, where we can make you a star,” said the distinguished older gentleman. “We want to give free rein to the fabulous talent you showed in Alone in the Dark.” When Raynal returned to Lyon to a reprimand from Bruno Bonnell for letting his game’s planned release date slip by a week, the contrast between his old boss and the possible new one who was courting him was painted all too clearly.

Much to Raynal’s dismay, Bonnell was already pushing him and the rest of the team that had made the first Alone in the Dark to make a sequel as quickly as possible using the exact same engine. One Friday just before the new year, Bonnell threw his charges a party to celebrate what he now believed would go down in history as the year when his struggling company turned the corner, thanks not least to Raynal’s game. On the following Monday morning, Raynal knocked on Bonnell’s office door along with three other members of the newly christened Alone in the Dark 2 team, including his most longstanding partner Didier Chanfray. They were all quitting, going to work for Delphine, Raynal said quietly. Much to their surprise, Bonnell offered to match Delphine’s offer, the first overt sign he’d ever given that he understood how talented and valuable they really were. But his counteroffer only prompted Delphine to raise the stakes again. Just after New Year’s Day, Bonnell bowed out of the bidding in a huff: “You want to leave? Goodbye!”

A couple of weeks later, the videogame magazine Génération 4 held an awards ceremony for the previous year’s top titles at Disneyland Paris. Everyone who had been involved with Alone in the Dark, both those who still worked at Infogrames and those who didn’t, was invited. When, as expected, it took the prize for top adventure game, Bruno Bonnell walked onto the stage to accept the award on behalf of his company. The departure of Raynal and crew being the talk of the industry, the room held its collective breath to see what would happen next. “My name is Bruno Bonnell,” he said from behind the rostrum. “I’d like to thank God, my dog, my grandmother, and of course the whole team at Infogrames for a beautiful project.” And with that he stumped offstage again.

It hadn’t been a particularly gracious acceptance speech, but Raynal and his colleagues nonetheless had much to feel good about. Dany Boolauck and Paul de Senneville were true to their word: they set Raynal up with a little auteur’s studio all his own, known as Adeline Software. They even allowed him to run it from Lyon rather than joining the rest of Delphine in Paris.

Naturally, all of the Alone in the Dark technology, along with the name itself and the Chaosium license (whatever that was worth), stayed with Infogrames. Raynal and his colleagues were thus forced to develop a new engine in the style of the old and to devise a fresh game idea for it to execute. Instead of going dark again, they went light. Released in 1994, Little Big Adventure (known as Relentless: Twinsen’s Adventure in North America) was a poetic action-adventure set in a whimsical world of cartoon Impressionism, consciously conceived by Raynal as an antidote to the ultra-violent Doom mania that was sweeping the culture of gaming at the time. He followed it up in 1997 with Little Big Adventure 2 (known as Twinsen’s Odyssey in North America). Although both games were and remain lovely to look at, Raynal still struggled to find the right balance between the art and the science of game design; both games are as absurdly punishing to play as they are charming to watch, with a paucity of save points between the countless places where they demand pin-point maneuvering and split-second timing. This sort of thing was, alas, something of a theme with the French games industry for far too many years.

This, then, is one legacy of Alone in the Dark. Another followed on even more directly, taking the form of the two sequels which Infogrames published in 1993 and 1994. Both used the same engine, as Bruno Bonnell had demanded in the name of efficiency, and both continued the story of the first game, with Edward Carnby still in the role of protagonist. (Poor Emily Hartwood got tossed by the wayside.) But, although Hubert Chardot once again provided their scripts, much of the spirit of the first game got lost, as the development team began letting the player get away with much more head-to-head combat. Neither sequel garnered as many positive reviews or sales as the original game, and Infogrames left the property alone for quite some time thereafter. A few post-millennial attempts to revive the old magic, still without the involvement of Raynal, have likewise yielded mixed results at best.

But it’s with Alone in the Dark‘s third legacy, its most important by far, that we should close. For several years, few games — not even its own sequels — did much to build upon the nerve-wracking style of play it had pioneered. But then, in 1996, the Japanese company Capcom published a zombie nightmare known as Resident Evil for the Sony PlayStation console. “When I first played Resident Evil,” remembers Infogrames programmer Franck de Girolami, “I honestly thought it was plagiarism. I could recognize entire rooms from Alone in the Dark.” Nevertheless, Resident Evil sold in huge numbers on the consoles, reaching a mass market the likes of which Alone in the Dark, being available only on computers and the 3DO multimedia appliance, could never have dreamed. In doing so, it well and truly cemented the new genre that became known as survival-horror, which had gradually filtered its way up from the obscure works of a poverty-stricken writer to a niche tabletop RPG to a very successful computer game to a mainstream ludic blockbuster. Culture does move in mysterious ways sometimes, doesn’t it?

(Sources: the books La Saga des Jeux Vidéo by Daniel Ichbiah, Designers & Dragons: A History of the Roleplaying Game Industry, Volume 1 by Shannon Appelcline, and Alone in the Dark: The Official Strategy Guide by Johan Robson; Todd David Spaulding’s PhD thesis “H.P. Lovecraft & The French Connection: Translations, Pulps, and Literary History”; Computer Gaming World of February 1993; Amiga Format of June 1991; Edge of November 1994; Retro Gamer 98. Online sources include Adventure Europe‘s interview with Frédérick Raynal, Just Adventure‘s interview with Hubert Chardot, and the video of Frédérick Raynal’s Alone in the Dark postmortem at the 2012 Game Developers Conference. Note that many of the direct quotations in this article were translated by me into English from their French originals.

The original Alone in the Dark trilogy is available as a package download at GOG.com.)

 
 


Interplay Takes on Trek

Original-series Star Trek is the only version I’ve ever been able to bring myself to care about. And this Star Trek I once cared about a great deal.

Doubtless like many of you of a similar age, I grew up with this 1960s incarnation of the show — the incarnation which its creator Gene Roddenberry so famously pitched to the NBC television network as Wagon Train to the Stars, the one which during my childhood was the only Star Trek extant. Three or four Saturdays per year, a local UHF television station would run a Star Trek marathon, featuring nine or ten episodes back to back, interspersed with interviews and other behind-the-scenes segments. Strange as it now sounds even to me in this era when vintage media far more obscure than Star Trek is instantly accessible at any time, these marathons were major events in my young life. I particularly loved the give and take on the bridge of the starship Enterprise during episodes such as “Balance of Terror,” which were heavily inspired by the naval battles of World War II. Upon realizing this, I became quite the little warmonger for a while there, devouring every book and movie I could find on the subject. Even after it had slowly dawned on me that in the final reckoning the death and suffering brought on by war far outweigh any courage or glory it might engender, the fascination with history which had been thus awakened never died.

I loved the Star Trek movies of the 1980s as well. Young though I was, I recognized the poignancy inherent in watching the now middle-aged cast cram their increasingly substantial frames back into the confines of their Starfleet uniforms every couple of years. Yes, this made effortless fodder for the late-night comedians, but there was also a wry wisdom to these movies that one doesn’t usually find in such blockbuster fare, as the actors’ aging off-screen selves merged with their onscreen personas. Think, for example, of the scene in Star Trek II: The Wrath of Khan where McCoy comes to visit Kirk and present him with his first pair of reading glasses. Decades before I fully understood what that moment — not to mention an expanding middle-aged waistline! — means in real life, I could sense the gravitas of the scene. I credit this side of Star Trek with showing me that there is as much drama and interest in ordinary life as there is in fantastic adventures in outer space. It primed me for the evening I begrudgingly opened Ethan Frome for my English class, and proceeded to devour it over the course of the next several rapt, tear-streaked hours. My English teacher was right, I realized; books without any spaceships or dragons in them really could be pretty darn great. Some years later, I took my bachelor’s degree in literature.

It must have been about the time I was discovering Ethan Frome that Star Trek: The Next Generation debuted on television. Like most of my peers, I was hugely excited by the prospect, and tuned in eagerly to the first episode. Yet I was disappointed by what I saw. The new incarnation of the Enterprise seemed cold and antiseptic in comparison to the old ship’s trusty physicality. Nor did I care for the new crew, who struck me as equally bland and bloodless. Being smart enough even at this tender age to recognize that fictional personalities, like real ones, need time to ripen and deepen, I gave the show another chance — repeatedly, over the course of years. But it continued to do nothing for me. Instead of Wagon Train to the Stars, this version struck me as Bureaucrats in Space.

All of this, I’ll freely admit, may have more to do with the fact that The Next Generation came along after I had passed science fiction’s golden age of twelve than anything else. Nevertheless, it does much to explain why I’m the perfect audience for our subject of today: the two Star Trek adventure games which Interplay made in the early 1990s. Throwbacks to the distant past of the franchise even when they were brand new, they continue to stand out from the pack today for their retro sensibilities. Fortunately, these are sensibilities which I unabashedly share.



Star Trek hadn’t been well-served by commercial computer games prior to the 1990s. Corporate nepotism had placed its digital-game rights in the slightly clueless hands of the book publisher Simon & Schuster, which was owned, like the show’s parent studio Paramount Pictures, by the media conglomerate known as Gulf and Western. The result had been a series of games that occasionally flirted with mediocrity but more typically fell short of even that standard. Even as each new Star Trek film topped the box-office charts, and even after Star Trek: The Next Generation became the most successful first-run series in the history of syndicated television, Simon & Schuster’s games somehow managed not to become hits. At decade’s end, Paramount granted the rights to a game based on the film Star Trek V: The Final Frontier to the dedicated computer-game publisher Mindscape, but the end product proved little better than what had come before in terms of quality or commercial success. Still, the switch to Mindscape did show that an inkling of awareness of the money all these half-assed Star Trek games were leaving on the table was dawning at last upon Paramount.

As the new decade began, the silver anniversary of the original series’s first broadcast on September 8, 1966, was beginning to loom large. Paramount decided to celebrate the occasion with something of a media blitz, anchored by a two-hour television special that would air in 1991 as close as possible to the show’s exact 25th anniversary. For the first time on this occasion, Paramount decided to make digital games into a concerted part of their media strategy rather than an afterthought. They signed a contract with the Japanese company Konami to make a game, entitled simply Star Trek: 25th Anniversary, for the Nintendo Entertainment System, the heart of the videogame mass market, and for the Nintendo Game Boy, the hot new handheld videogame system. Rather than the Next Generation crew or even the original Enterprise crew in their most recent, most rotund incarnations, these games were to wind the clock all the way back to those heady early days of 1966, when Captain Kirk was still happy to appear on camera with his shirt off.

That deal still left a space for an anniversary title in the computer-game market. Said market was, it was true, much smaller than the one for Nintendo games, but it was notable for its older, more well-heeled buyers willing to pay more money for more ambitious games. Yet computer-game publishers proved more reluctant to sign on for the project than the broad popularity of the Star Trek brand in general might lead one to believe.

It didn’t require the benefit of hindsight to see that the Star Trek franchise, although it was indeed more popular than ever before, was going through a period of transition in 1990. The Next Generation had been on the air for three seasons now and was heading into a fourth; it was thus about to exceed the on-air longevity of the series that had inspired it. Meanwhile the cast of that older series were bowing to the realities of age at last; it had been announced that Star Trek VI: The Undiscovered Country, due for release in late 1991, was to be the last feature film in which they would star. A time when the Next Generation crew would become the default face of Star Trek, the original crew creaky anachronisms, was no longer impossible to imagine.

Given this passing of the torch that seemed to be in progress, most computer-game publishers were skeptical of Paramount’s plans for games featuring the original Enterprise and its crew in their youngest incarnations. They felt that this version of Star Trek was already all but dead in commercial terms, what with the success of all of the franchise’s more recent productions.

Brian Fargo of Interplay Entertainment was among the few who didn’t agree with this point of view. He pitched a computer game to Paramount that would share a name with Konami’s efforts, but would otherwise be a completely separate experience. Aided by his natural charm and the relative disinterest of most of Interplay’s competitors, he made the deal.

Disinterested competitors or no, it was quite a coup for his company, nowhere close to the largest or most prominent in its industry, to secure a license to make Star Trek games — especially given that the deal was made just months after Interplay had acquired the rights to another holy totem of nerd culture, The Lord of the Rings. While the Tolkien games would prove rather a disappointment, the Star Trek license would work out better all the way around.

Interplay signed an open-ended contract with Paramount which allowed them to make Star Trek games all the way until the year 2000, with some significant restrictions: they would be subject to the studio’s veto power over any and all of their aspects, and they could be set only in the time of Captain Kirk and company’s first five-year mission. With these restrictions in mind, Interplay set out to make a game that would be slavishly faithful to the original television series’s format. Instead of a single epic adventure, the game would consist of eight independent “episodes,” each roughly equivalent in plot complexity and geographic scope to those that had aired on television back in the day.

The structure of each episode would be the same: the Enterprise would be called upon by Starfleet to handle some new crisis at the episode’s beginning, whereupon the player would have to warp to the correct star system and engage in some action-oriented space combat, before beaming down to the real heart of the problem and sorting it all out in the guise of an adventure game. Interplay noted that the episodic format could make for a refreshing change from the norm in adventure games, being amenable to a more casual approach. Each episode would be designed to be completable in an evening; after finishing one of them, you could start on the episode that followed the next day, the next week, or the next month, without having to worry about all of the plot and puzzle threads you left dangling last time you played. From Fargo’s perspective, the episodic structure also had the advantage that each part of the game could be designed without much reference to or dependence on any of the others; this made things vastly easier from the standpoint of project management.

Fargo turned to a familiar source for the episodes’ scripts: Michael Stackpole, a member of the Arizona Flying Buffalo fraternity who had played a leading role on Interplay’s Wasteland CRPG and contributed to such other titles as Neuromancer. Stackpole had been busying himself recently with writing tie-in novels set in the universe of the BattleTech tabletop-game franchise. He thus thought that he knew what to expect from working with a licensed property, but he was unprepared for the degree of micromanagement that a bureaucratic giant like Paramount, stewarding one of the most valuable media properties of the age, was willing to engage in. He submitted scripts for fifteen episodes for a game that was anticipated to contain only eight, assuming that should surely cover all his bases; Interplay and Paramount could decide between themselves which eight they actually wanted to include.

To everyone’s shock, Paramount outright rejected all but a handful of them weeks later, usually for the most persnickety of reasons. Interplay’s frustration was still evident in a preview of the game published much later in Computer Gaming World magazine, which noted that “the film studio decided against plot elements derived from episodes which were already part of the Star Trek legend.” With Stackpole having returned to writing his novels, Fargo brought in Elizabeth Danforth, another Flying Buffalo alumnus who had worked with Interplay before, to write more episodes and shepherd them through the labyrinthine approval process.

All of this was happening during one of the most chaotic periods in Interplay’s history. Their distributor Mediagenic had just collapsed, defaulting on hundreds of thousands of dollars they had owed to Interplay and destroying the company’s precious pipeline to retail. The Lord of the Rings game, which was supposed to have been their savior, missed the Christmas 1990 buying season and, when it did finally ship early the following year, met with lukewarm reviews and disappointing sales. Only the strategy game Castles, an out-of-left-field hit from a third-party developer, kept them alive.

Amidst it all, the team making Star Trek: 25th Anniversary kept plugging away — but, inevitably, the game fell behind schedule. September of 1991 arrived, bringing with it the big television special and the Nintendo Entertainment System game, but Interplay’s own tie-in product remained far from complete. It didn’t ship until March of 1992, by which time all of the anniversary hoopla was in the past. Interplay’s game had all the trappings of an anticlimax; it really should have been known as Star Trek: 26th Anniversary, noted more than one commentator pointedly. For those inside the company, the story of the game was taking on some worrisome parallels to that of their Lord of the Rings title: a seeming surefire hit of a high-profile licensed game that arrived late and wound up underwhelming everyone.

They needn’t have worried. Star Trek: 25th Anniversary was a much more polished, more fully realized evocation of its source material than The Lord of the Rings had been, and it came at one of the Star Trek franchise’s high-water marks in popularity. Star Trek VI, which had hit theaters just three months before Interplay’s game, had become everything one could have hoped for from the original crew’s valedictory lap, garnering generally stellar reviews and impressive box-office receipts. Meanwhile The Next Generation was now in its fifth season on television and more popular than ever. The only shadow over proceedings was the death of Gene Roddenberry, the creator of Star Trek, on October 24, 1991. Yet even that event was more help than hindrance to the Interplay game’s commercial prospects, in that it created an appetite among wistful fans to look back to the franchise’s beginnings.

Interplay dedicated Star Trek: 25th Anniversary to Gene Roddenberry.

Indeed, Star Trek: 25th Anniversary thrived in this febrile atmosphere of contemporary success tinged with nostalgia. It became the biggest Interplay hit since Battle Chess, selling over 250,000 copies in all and doing much to set the company’s feet back on firm financial ground after the chaos of the previous couple of years.



The game continues to stand up fairly well today, with a few caveats. Undoubtedly its least satisfying aspect is the space-combat sequence that must be endured at the beginning of each episode. Perhaps not coincidentally, this is one of the few places where the game isn’t faithful to the spirit of Star Trek.

Science fiction’s two most successful media franchises take very different approaches to battles in outer space: while Star Trek portrays its combatants as lumbering naval vessels, jockeying for position in a slow-paced tactical game of cat and mouse, Star Wars looks to the skies of World War II for inspiration, opting for frenetic dogfights in space. But 25th Anniversary goes all-in for Star Wars instead of Star Trek in this respect; the Enterprise turns into Luke Skywalker’s X-Wing fighter, dodging and weaving and spinning on a dime in response to the joystick. The reason for the switch can be summed up in two words: Wing Commander. Origin Systems’s cinematic action game of outer-space dogfighting was taking the market by storm as Interplay was starting work on their own science-fiction game, and the company wanted to capitalize on their rival’s success. They described their game as “Sierra meets Wing Commander” at early trade-show presentations, and even made it possible to engage in randomized fights just for fun by visiting star systems other than those to which you’ve been directed, just in case the fighting you get to do in the episodes proper isn’t enough for you.

That was quite the stretch; the combat in 25th Anniversary really isn’t much fun as anything more than an occasional palate cleanser, and it’s hard to imagine anyone voluntarily deciding to look for more of it. Not only does this part of the game clash with its faithfulness to Star Trek in just about every other respect, but it doesn’t work even on its own terms. The controls are awkward, it’s hard to understand where your enemies are in relation to you, and it’s simply too hard — a point I’ll be returning to later. For now, suffice to say that Star Trek: 25th Anniversary ain’t no Wing Commander.

The worst part by far of Star Trek: 25th Anniversary.

Thankfully, the rest of the game — the “Sierra” in Interplay’s pithy formulation — is both more engaging and more faithful to the Star Trek of old. When you leave the Enterprise‘s bridge, the game turns into a point-and-click graphic adventure, marking the first time Interplay had dabbled in the format since Tass Times in Tonetown back in 1986. You control Kirk directly, but Spock, McCoy, and some poor expendable redshirt also come along, ready to offer their advice and use their special talents when needed — or, in the case of the redshirt, to take one for the team, dying so that none of the regulars have to do so.

The interface can be a little confusing at first; it’s not always clear when you should be “using” Spock or McCoy themselves on something and when you should be using their tricorders. But you start to get a feel for things after just a few minutes, and soon the interface fades into the background of what could stand on its own as a solid little graphic adventure — or, rather, eight solid little mini-adventures. Some of the puzzles can get a bit fiddly, but there are no outrageously unfair ones. The episodic nature of the game does much to make it manageable by limiting the possibility space you need to explore in order to solve any given puzzle; most of the episodes play out over just half a dozen or so locations.

Still, what elevates a fairly workmanlike adventure game to something far more memorable is the Star Trek connection. This is clearly a game made by and for fans of the source material. If you count yourself among them, you almost can’t help but be delighted. The writers do a great job of evoking the characters we know and love; McCoy lays into Spock like the old racist country doctor he is, Spock plays such a perfect straight man that one can’t help but suspect that he’s laughing up his sleeve behind his facade of “logic,” and Kirk still loves to egg them both on and enjoy the fireworks.

Star Trek: 25th Anniversary apes the look of its source material down to the title card that opens each episode.

The interactive episodes are true to the rhythms of their non-interactive antecedents; each one begins with a title card superimposed over a stately Enterprise soaring toward its latest adventure, and ends with some humorous banter on the bridge and a final command from Kirk of “Warp factor 4!” to send it on its way to the next. Even the visuals, presented in slightly pixelated low-res VGA, conjure up the low-rent sets of the show; more photo-realistic graphics, one suspects, would only ruin the effect. For the music, George “The Fat Man” Sanger and Dave Govett, whose work was everywhere during this period — they scored Wing Commander and Ultima Underworld as well, among many others — mix the familiar Star Trek theme with their own period-perfect motifs. The only things missing from their score in comparison to that of the original show are those oh-so-sixties orchestral stabs at dramatic moments. (There does come a point, Sanger and Govett must have decided, when nostalgia descends into outright cheese.)

It’s true that the episodes work more on the level of pastiche than that of earnest attempts at storytelling — another reason that enjoying this game probably does require you be a fan of vintage Star Trek. Most of the scripts read like a Mad Libs take on the original series, mixing and matching its most familiar tropes. The crew has to shut down another misguided computer (a la “A Taste of Armageddon”), engage in some gunboat diplomacy with the Romulans (“The Enterprise Incident”), and negotiate an earthly religious mythos transplanted to another planet (“Who Mourns for Adonais?”). Harry Mudd, the intergalactic con man whose antics featured in two episodes of the original series, makes a third appearance here. Even Carol Marcus, scientist and Kirk paramour, shows up to foreshadow the major role she’ll later play in the movie Star Trek II.

Star Trek: 25th Anniversary in its graphic-adventure mode. The gang’s all here, including the poor terrified red shirt hiding behind a pillar.

If none of the interactive episodes can challenge the likes of “The City on the Edge of Forever” for the crown of Best of Trek, they’re certainly far less embarrassing than most of what the series produced during its painfully bad third season. They encompass the full tonal palette of the show, from screwball comedy to philosophical profundity. The graphic-adventure format does force a shift in emphasis away from dialog and action to more cerebral activities — the Kirk on television never had to slow down to solve set-piece logic puzzles like some of the ones we see here — but that shift is entirely understandable.

Unfortunately, all of the good will the game engenders is undermined to a considerable extent by one resoundingly terrible design decision — a decision that’s ironically built upon a foundation of very good design choices. Each episode permits multiple solutions to most of the problems it places before you; this is, of course, a good thing. At the end of each episode, assuming you don’t get yourself killed, you receive an evaluation from Starfleet Command in the form of a percentile grade. You’re rewarded with a better grade if you’ve managed to keep the poor redshirt who beamed down with you alive — this game’s writers show far more compassion for the expendable crew members than the original series’s writers ever did! — and if you’ve accomplished things with a minimum of violence — i.e., if you’ve kept your metaphorical and sometimes literal phasers on “stun” rather than “kill.” All of this too is a good thing, seeming evidence of a progressive design sensibility that’s become ubiquitous today, when countless games let you finish each scenario with a bronze, silver, or gold star, allowing you to be exactly as completionist and perfectionist as you choose to be.

But now the bad part comes in. The final grades you receive on the episodes affect the performance of your crew during the remaining space-combat sequences, which themselves become steadily more difficult as you progress through the game. In fact, the final battle is so hard that you virtually have to have scored 100 percent on all of the preceding episodes to even have a chance in it. It turns out that the seeming easygoing attitude of the game, encouraging you to do better but letting you slide if you just want to move on through the episodes, has been a colossal lie, an ugly trap to get you 90 percent of the way to the finish line and then stop you cold. This is like a caricature of awful, retrograde game design — something even Sierra at their absolute nadir would have thought twice about. Either tell the player at the end of the episode that she just hasn’t done well enough and make her do it again, or honor your promise to let her continue with a less than stellar score. Don’t lie to her about it and then cackle about how you got her in the end.

Pro tip: this is not good enough to get you through the game.

Not only is this design decision terrible on its own terms, but it clashes with all of the implications of Interplay’s own characterization of Star Trek: 25th Anniversary as a more casual sort of adventure game than the norm, one that will let you play through a satisfying episode in a single relaxing evening. Interplay heard about this cognitive dissonance from their fans — heard so much about it that they begrudgingly issued an optional patch that let players skip past the combat sequences altogether by triggering a hot key. It wasn’t the most elegant solution, but it was better than nothing.

This discordant note aside, the worst complaint you could make about Star Trek: 25th Anniversary in 1992 is one that doesn’t apply anymore today: that it was just a bit short in light of its $40 street price. And yet, worthy effort though Interplay’s first Star Trek game is on the whole, they would comprehensively top it with their second.



Given 25th Anniversary‘s commercial success and the open-ended license Interplay had acquired from Paramount, a sequel was rather inevitable. There wasn’t much point in making bold changes to a formula that had worked so well. Indeed, when they made the sequel they elected to change nothing whatsoever on the surface, retaining the same engine, the same episodic structure, and even the same little-loved combat sequences. Yet when we peer beneath the surface we see the product of a development team willing to learn from their mistakes. As sometimes happens in game development, the fact that the necessary enabling technology was already in place in the form of an existing engine allowed design in the abstract to come even more to the fore in the sequel. The end result is a game that, while hardly a transformative leap over its predecessor, is less frustrating, more narratively ambitious, and even more fun to play.

Although Star Trek: Judgment Rites continues with the episodic structure of its predecessor, it adapts it to a format more typical of television shows of the 1990s than those of the 1960s. An overarching “season-long” story arc is woven through the otherwise discrete episodes, to come to a head in a big finale episode. This gives the game a feeling of unity that its predecessor lacks.

Even more welcome, however, is a new willingness within the individual episodes to move beyond pastiche and into some narratively intriguing spaces of their own. Virtually all of Judgment Rites‘s episodes, written this time by the in-house Interplay employees Scott Bennie and Mark O’Green in addition to the returning contractors Michael Stackpole and Elizabeth Danforth, mix things up rather than stick with the unbending 25th Anniversary formula of a space combat followed by Kirk, Spock, McCoy, and a semi-anonymous redshirt beaming down somewhere. Combat this time around is neither as frequent nor as predictably placed in the episodes, and the teams that beam down now vary considerably; Scotty, Uhura, and Sulu all get at least one chance of their own to come along and use their special talents.

My favorite episode in Judgment Rites also happens to be the longest and most complex in either of the games. In Bennie’s “No Man’s Land,” a team from the Enterprise beams down to a planet which is being forced to reenact a simulacrum of Earth’s World War I by Trelane, the childish but almost infinitely powerful demigod who was introduced in the original-series episode “The Squire of Gothos.” As his inclusion would indicate, “No Man’s Land” is very aware of Star Trek lore. It’s plainly meant partially as an homage to the original show’s occasional “time-travel” episodes, like “Tomorrow is Yesterday,” “A Piece of the Action,” or “Patterns of Force.” These were beloved by fans for giving the familiar crew the chance to act out a bit in an entirely different milieu. (They were beloved by the show’s perpetually cash-strapped producers for another reason: they let them raid their studio’s stash of stock sets, props, and costumes).

Yet “No Man’s Land” transcends homage to become a surprisingly moving meditation on the tragedy of a pointless war.

Another standout is Stackpole’s “Light and Darkness,” a pointed allegory about the folly of eugenics.

In addition to showing far more confidence in its storytelling, Judgment Rites also addresses the extreme difficulty of the space-combat sequences in its predecessor and the false promise of letting you continue after completing an episode with a less-than-perfect score. You now have a choice between no combat at all, easy combat, and hard combat. The middle setting is calibrated just about right. Combat at this level, while still a long way from the likes of Wing Commander, becomes an occasional amusing diversion that doesn’t overstay its welcome instead of an infuriating brick wall that kills the rhythm of the game. And, at this level, moving on from any given episode with a score of less than 100 percent is no longer a fool’s gambit.

Although a better game than its predecessor in almost every respect, Judgment Rites couldn’t muster the same sales. It didn’t ship until December of 1993 — i.e., almost two full years after 25th Anniversary — and by that time the engine was beginning to show its age. Nor did it help that Interplay themselves undercut its launch by releasing a “talkie” version of the first game on CD-ROM just a month later.

That said, it’s not hard to understand Interplay’s eagerness to get the talkie version onto the market. In what can only be described as another major coup, Interplay, working through Paramount, brought in the entirety of the original cast to voice their iconic roles. At a time when many CD-ROM-based games were still being voiced by their programmers, it promised to be quite a thrill indeed to listen to the likes of William Shatner, Leonard Nimoy, and DeForest Kelley in the roles that had made them famous.

The reality was perhaps a little less compelling than the promise. While no one would ever accuse any member of the show’s cast of being a master thespian in the abstract, they had been playing these roles for so long that doing so once more for a computer game ought to have presented little problem on the face of it. Yet they plainly struggled with this unfamiliar medium. Their voice acting runs the gamut from bored to confused, but almost always sounds like exactly what it is: actors in front of microphones reading lines on a page. It seems that none of them knew anything about the stories to which the lines related, which can only be construed as a failure on Interplay’s part — albeit one perhaps precipitated by the sharply limited amount of time during which they had the actors at their disposal. Over the course of a scant few days, the cast was asked to voice all of the dialog not for one but for two complete games; the voices for a CD-ROM version of Judgment Rites were recorded at the same time. And they had to do it all bereft of any dramatic context whatsoever.

Somewhat disappointing though the final result is, these sessions represent a melancholy milestone of their own in Trek history, marking the last time the entire cast of the original show was assembled for a new creative project. As such, the talkie versions of these games are the last gasps of an era.

Personally, though, I prefer the games without voices — not only because of the disappointing voice work but because Interplay chose to implement it in a really annoying way, with Kirk/Shatner saying each choice in every dialog menu before you choose one. Interplay, like most of their peers, was still scrambling to figure out what did and didn’t work in this new age of multimedia computing.

Despite holding a license to the original series for the balance of the decade, Interplay would never release another game set in this era of Star Trek after the talkie version of Judgment Rites shipped in March of 1994. The company did work intermittently on an ambitious 3D action-adventure featuring Kirk and the rest of the classic crew, tentatively entitled Secret of Vulcan Fury, near the end of the decade, but never came close to finishing it. Gamers and Trekkies were moving on, and the newer incarnations of the show were becoming, just as some had predicted they would, the default face of the franchise. Indeed, no Star Trek game since the two Interplay titles discussed in this article has revisited the original show. This fact only makes 25th Anniversary and especially Judgment Rites all the more special today.



That would make for a good conclusion to this article, but we do have one more thing to cover — for no article about Interplay’s takes on classic Trek could be complete without the media meme they spawned.

Like a fair number of other memes, this one involves William Shatner, for more than half a century now one of the odder — and more oddly endearing — characters on the media landscape. Back when he was a struggling young actor trying to make it in Hollywood, it was apparently drilled into him by his agents that he should never, ever turn down paying work of any kind. He has continued to live by this maxim to this day. Shatner will do absolutely anything if you pay him enough: pitch any product, sing-talk his way through fascinatingly terrible albums, “write” a new memoir every couple of years along with some of the worst science-fiction novels in history. He’s the ultimate cultural leveler, seeing no distinction between a featured role in a prestigious legal drama and one in a lowest-common-denominator sitcom based on someone’s Twitter feed.

And yet he manages to stay in the public’s good graces by doing it all with a wink and a nod that lets us know he’s in on the joke; when he goes on a talk show to plug his latest book, he can’t even be bothered to seriously pretend that he actually wrote it. He’s elevated crass commercialism to a sort of postmodern performance art. When the stars align, the kitschy becomes profound, and the terrible becomes wonderful. (“Why is this good?” writes a YouTube commenter in response to his even-better-than-the-original version of “Common People.” “It has no right to be this good.”) For this reason, as well as because he’s really, truly funny — one might say that he’s a far better comedian than he ever was an actor — he gets a pass on everything. At age 88 as of this writing, he remains the hippest octogenarian this side of Willie Nelson.

In keeping with his anything-for-a-buck career philosophy, Shatner is seldom eager to spend much time second-guessing — much less redoing — any of his performances. His reputation among media insiders as a prickly character with a taste for humiliation has long preceded him. It’s especially dangerous for anyone he perceives as below him on the totem pole to dare to correct him, challenge him, or just voice an opinion to him. Like a dog, he can smell insecurity, and, his eagerness to move on to the next gig notwithstanding, he’s taken a malicious delight in tormenting many a young assistant director. Craig Duman, the Interplay sound engineer who was given the task of recording Shatner’s lines for the CD-ROM versions of 25th Anniversary and Judgment Rites, can testify to this firsthand.

The problem began when Shatner was voicing the script for the first episode of Judgment Rites. Coming to the line, “Spock, sabotage the system,” he pronounced the word “sabotage” rather, shall we say, idiosyncratically: pronouncing the vowel of the last syllable like “bad” rather than “bod.” A timid-sounding Duman, all too obviously overawed to be in the same room as Captain Kirk, piped up to ask him to say the line again with the correct pronunciation — whereupon Shatner went off. “I don’t say sabotahge! You say sabotahge! I say sabotage!” (You say “potato,” I say “potahto?”) His concluding remark was deliciously divaish: “Please don’t tell me how to act. It sickens me.”


This incident would have remained an in-joke around Interplay’s offices had not an unknown employee from the sound studio they used leaked it to the worst possible person: morning-radio shock jock Howard Stern. Driving to work one morning, Brian Fargo was horrified to hear the outtake being broadcast across the country by this self-proclaimed “King of All Media.” Absent the “it sickens me,” the clip wouldn’t have had much going for it, but with it it was absolutely hilarious; Stern played it over and over again. Fargo was certain he had just witnessed the death of one of Interplay’s most important current projects.

He was lucky; it seems that Shatner wasn’t a regular Howard Stern listener, and didn’t hear about the leak until after both of the talkies had shipped. But the clip, being short enough to encapsulate in a sound file manageable even over a dial-up connection, became one of the most popular memes on the young World Wide Web. It also found a receptive audience within Hollywood, where plenty of people had had similar run-ins with Shatner’s prickly off-camera personality. It finally made its way into the 1999 comedy film Mystery Men, where Ben Stiller parrots, “Please don’t tell me how to act. It sickens me,” on one occasion, and Janeane Garofalo later inserts a pointed, “You say sabotahge! I say sabotage!”

Thanks to Howard Stern, Mystery Men, and the memetic magic of the Internet, this William Shatner outtake has reached a couple of orders of magnitude more people than ever played the game which spawned it; most of those who have engaged with the meme have no idea of its source. If it seems unfair that this of all things should be the most enduring legacy of Interplay’s loving re-creations of the Star Trek of yore, well, such is life in a world of postmodern media. As Shatner himself would attest, just reaching people, no matter how you have to do it, is an achievement of its own. And if you can make them laugh while you’re about it, so much the better.

(Sources: Computer Gaming World of December 1991, May 1992, March 1994, and May 1994; Questbusters of April 1992; Origin Systems’s internal newsletter Point of Origin of December 9 1991; the special video features included with the Star Trek: Judgment Rites Collector’s Edition. Online sources include Matt Barton’s interview with Brian Fargo and Fargo’s appearance on Angry Centaur Gaming’s International Podcast. Finally, some of this article is drawn from the collection of documents that Brian Fargo donated to the Strong Museum of Play.

Star Trek: 25th Anniversary and Judgment Rites are both available for purchase from GOG.com.)


Another World

The French creative aesthetic has always been a bit different from that of English-speaking nations. In their paintings, films, even furniture, the French often discard the stodgy literalism that is so characteristic of Anglo art in favor of something more attenuated, where impression becomes more important than objective reality. A French art film doesn’t come off as a complete non sequitur to Anglo eyes in the way that, say, a Bollywood or Egyptian production can. Yet the effect it creates is in its way much more disorienting: it seems on the surface to be something recognizable and predictable, but suddenly zigs where we expect it to zag. In particular, it may show disconcertingly little interest in the logic of plot, that central concern of Anglo film. What affects what and why is of far less interest to a filmmaker like, say, François Truffaut than the emotional affect of the whole.

Crude though such stereotypes may be, when the French discovered computer games they did nothing to disprove them. For a long time, saying a game was French was a shorthand way for an Anglo to say that it was, well, kind of weird, off-kilter in a way that made it hard to judge whether the game or the player was at fault. Vintage French games weren’t always the most polished or balanced of designs, yet they must still be lauded today for their willingness to paint in emotional colors more variegated than the trite primary ones of fight or flight, laugh or cry. Such was certainly the case with Éric Chahi’s Another World.


France blazed its own trail through the earliest years of the digital revolution. Most people there caught their first glimpse of the digital future not through a home computer but through a remarkable online service called Minitel, a network of dumb terminals that was operated by the French postal and telephone service. Millions of people installed one of the free terminals in their home, making Minitel the most widely used online service in the world during the 1980s, dwarfing even the likes of CompuServe in the United States. Those in France who craved the capabilities of a full-fledged computer, meanwhile, largely rejected the Sinclair Spectrums and Commodore 64s that were sweeping the rest of Europe in favor of less universal lines like the Amstrad CPC and the Oric-1. Apple as well, all but unheard of across most of Europe, established an early beachhead in France, thanks to the efforts of a hard-charging and very Gallic general manager named Jean-Louis Gassée, who would later play a major role in shepherding the Macintosh to popularity in the United States.

In the second half of the 1980s, French hardware did begin to converge, albeit slowly, with that in use in the rest of Europe. The Commodore Amiga and Atari ST, the leading gaming computers in Europe as a whole, were embraced to at least some extent in France as well. By 1992, 250,000 Amigas were in French homes. This figure might not have compared very well to the 2.5 million of them in Britain and Germany by that point, but it was more than enough to fuel a thriving little Amiga game-development community that was already several years old. “Our games didn’t have the excellent gameplay of original English-language games,” remembers French game designer Philippe Ulrich, “but their aesthetics were superior, which spawned the term ‘The French Touch’ — later reused by musicians such as Daft Punk and Air.”

Many Amiga and ST owners had been introduced to the indelibly French perspective on games as early as 1988. That was the year of Captain Blood, which cast the player in the role of a clone doomed to die unless he could pool his vital essences with those of five other clones scattered across the galaxy — an existential quest for identity to replace the conquer-the-galaxy themes of most science-fiction games. If that alone wasn’t weird enough, the gameplay consisted mostly of talking to aliens using a strange constructed language of hieroglyphs devised by the game’s developers.

Such avoidance of in-game text, whether done as a practical method of easing the problems of localization or just out of the long-established French ambivalence toward translation from their mother tongue, would become a hallmark of the games that followed, as would a willingness to tackle subject matter that no one else would touch. The French didn’t so much reject traditional videogame themes and genres as filter them through their own sensibilities. Often, this meant reflecting American culture back upon itself in ways that could be both unsettling and illuminating. North & South, for instance, turned the Civil War, that greatest tragedy of American history, into a manic slapstick satire. For any American kid raised on a diet of exceptionalism and solemn patriotism, this was deeply, deeply strange stuff.

The creator of Another World, perhaps the ultimate example of the French Touch in games, was, as all of us must be, a product of his environment. Éric Chahi had turned ten the year that Star Wars dropped, marking the emergence of a transnational culture of blockbuster media, and he was no more immune to its charms than were other little boys all over the world. Yet he viewed that very American film through a very French lens. He liked the rhythm and the look of the thing — the way the camera panned across an endless vista of peaceful space down into a scene of battle at the beginning; the riff on Triumph of the Will that is the medal ceremony at the end — much more than he cared about the plot. His most famous work would evince this same rather non-Anglo sense of aesthetic priorities, playing with the trappings of American sci-fi pop culture but skewing them in a distinctly French way.

But first, there would be other games. From the moment Chahi discovered computers several years after Star Wars, he was smitten. “During school holidays, I didn’t see much of the sun,” he says. “Programming quickly became an obsession, and I spent around seventeen hours a day in front of a computer screen.” The nascent French games industry may have been rather insular, but that just made it if anything even more wide-open for a young man like himself than were those of other countries. Chahi was soon seeing the games he wrote — from platformers to text adventures — published on France’s oddball collection of viable 8-bit platforms. His trump card as a developer was a second talent that set him apart from the other hotshot bedroom coders: he was also a superb artist, whether working in pixels or in more traditional materials. Although none of his quickie 8-bit games became big hits, his industry connections did bring him to the attention of a new company called Delphine Software in 1988.

Delphine Software was about as stereotypically French a development house as can be imagined. It was a spinoff of Delphine Records, whose cash cow was the bizarrely popular easy-listening pianist Richard Clayderman, a sort of modern-day European Liberace who would come to sell 150 million records by 2006. Paul de Senneville, the owner of Delphine Records, was himself a composer and musician. Artist that he was, he gave his new software arm virtually complete freedom to make whatever games they felt like making. Their Paris offices looked like a hip recording studio; Chahi remembers “red carpet at the entrance, gold discs everywhere, and many eccentric contemporary art pieces.”

Future Wars

He had been hired by Delphine on the basis of his artistic rather than his programming talent, to illustrate a point-and-click adventure game with the grandiose title of Les Voyageurs du Temps: La Menace (“The Time Travelers: The Menace”), later to be released in English under the punchier name of Future Wars. Inspired by the Sierra graphic adventures of the time, it was nevertheless all French: absolutely beautiful to look at — Chahi’s illustrations were nothing short of mouth-watering — but more problematic to play, with a weird interface, weirder plot, and puzzles that were weirdest of all. As such, it stands today as a template for another decade and change of similarly baffling French graphic adventures to come, from companies like Coktel Vision as well as Delphine themselves.

But the important thing from Chahi’s perspective was that the game became a hit all across Europe upon its release in mid-1989, entirely on the basis of his stunning work as its illustrator. He had finally broken through. Yet anyone who expected him to capitalize on that breakthrough in the usual way, by settling into a nice, steady career as Delphine’s illustrator in residence, didn’t understand his artist’s temperament. He decided he wanted to make a big, ambitious game of his own all by himself — a true auteur’s statement. “I felt that I had something very personal to communicate,” he says, “and in order to bring my vision to others I had to develop the title on my own.” Like Marcel Proust holed up in his famous cork-lined Paris apartment, scribbling frantically away on In Search of Lost Time, Chahi would spend the next two years in his parents’ basement, working sixteen, seventeen, eighteen hours per day on Another World. He began with just two fixed ideas: he wanted to make a “cinematic” science-fiction game, and he wanted to do it using polygonal graphics.

Articles like this one throw around terms like “polygonal graphics” an awful lot, and their meanings may not always be clear to everyday readers. So, let’s begin by asking what separated the type of graphics Chahi now proposed to make from those he had been making before.

The pictures that Chahi had created for Future Wars were what is often referred to as pixel graphics. To make them, the artist loads a paint program, such as the Amiga’s beloved Deluxe Paint, and manipulates the actual onscreen pixels to create a background scene. Animation is accomplished using sprites: additional, smaller pictures that are overlaid onto the background scene and moved around as needed. On many computers of the 1980s, including the Amiga on which Chahi was working, sprites were implemented in hardware for efficiency’s sake. On other computers, such as the IBM PC and the Atari ST, they had to be conjured up, rather less efficiently, in software. Either way, though, the basic concept is the same.

The artist who works with polygonal graphics, on the other hand, doesn’t directly manipulate onscreen pixels. Instead she defines her “pictures” mathematically. She builds scenes out of geometric polygons of three sides or more, defined as three or more connected points, or sets of X, Y, and Z coordinates in abstract space. At run time, the computer renders all this data into an image on the monitor screen, mapping it onto physical pixels from the perspective of a “camera” that’s anchored at some point in space and pointed in a defined direction. Give a system like this one enough polygons to render, and it can create scenes of amazing complexity.

Still, it does seem like a roundabout way of approaching things, doesn’t it? Why, you may be wondering, would anyone choose to use polygonal graphics instead of just painting scenes with a conventional paint program? Well, the potential benefits are actually enormous. Polygonal graphics are a far more flexible, dynamic form of computer graphics. Whereas in the case of a pixel-art background you’re stuck with the perspective and distance the artist chose to illustrate, you can view a polygonal scene in all sorts of different ways simply by telling the computer where in space the “camera” is hanging. A polygonal scene, in other words, is more like a virtual space than a conventional illustration — a space you can move through, and that can in turn move around you, just by changing a few numbers. And it has the additional advantage that, being defined only as a collection of anchoring points for the polygons that make it up rather than needing to explicitly describe the color of every single pixel, it usually takes up much less disk space as well.

With that knowledge to hand, you might be tempted to reverse the question of the previous paragraph, and ask why anyone wouldn’t want to use polygonal graphics. In fact, polygonal graphics of one form or another had been in use on computers since the 1960s, and were hardly unheard of in the games industry of the 1980s. They were most commonly found in vehicular simulators like subLOGIC’s Flight Simulator, which needed to provide a constantly changing out-the-cockpit view of their worlds. More famously in Europe, Elite, one of the biggest games of the decade, also built its intense space battles out of polygons.

The fact is, though, that polygonal graphics have some significant disadvantages to go along with their advantages, and these were magnified by the limited hardware of the era. Rendering a scene out of polygons was mathematically intensive in comparison to the pixel-graphic-backgrounds-and-sprites approach, pushing an 8-bit or even 16-bit CPU (like the Motorola 68000 in the Amiga) hard. It was for this reason that early versions of Flight Simulator and Elite and many other polygonal games rendered their worlds only as wire-frame graphics; there just wasn’t enough horsepower to draw in solid surfaces and still maintain a decent frame rate.

And there were other drawbacks. The individual polygons from which scenes were formed were all flat surfaces; there was no concept of smooth curvature in the mathematics that underlay them. But the natural world, of course, is made up of almost nothing but curves. The only way to compensate for this disparity was to use many small polygons, packed so closely together that their flat surfaces took on the appearance of curvature to the eye. Yet increasing the polygon count in this way increased the burden of rendering it all on the poor overtaxed CPUs of the day — a burden that quickly became untenable. In practice, then, polygonal graphics took on a distinctive angular, artificial appearance, whose sense of artificiality was only enhanced by the uniform blotches of color in which they were drawn.
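The cost of faking curvature can be made concrete with a little geometry. A circle of radius r approximated by n flat segments deviates from the true curve by r × (1 − cos(π/n)) at the midpoint of each segment — a standard result, sketched here in Python rather than anything from the era's actual renderers:

```python
# Back-of-the-envelope illustration of the curvature problem: a circle of
# radius r approximated by n flat segments bulges away from the true curve
# by r * (1 - cos(pi / n)) at the midpoint of each segment.
import math

def max_error(radius, n_segments):
    """Worst-case gap between an n-sided polygon and the circle it mimics."""
    return radius * (1.0 - math.cos(math.pi / n_segments))

# Because the error shrinks roughly with the square of the segment count,
# halving the visible error costs about 1.4x the polygons -- and every extra
# polygon is more work for a 1980s CPU on every single frame.
errors = {n: max_error(100.0, n) for n in (8, 16, 32, 64)}
```

For a circle of radius 100 pixels, eight segments leave a gap of well over 7 pixels; getting it below a single pixel takes more than 32 segments, which is why smooth-looking curves were simply out of reach at playable frame rates.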

These illustrations show how an object can be made to appear rounded by making it out of a sufficient number of flat polygons. The problem is that each additional polygon which must be rendered taxes the processor that much more.

For all these reasons, polygonal graphics were mostly confined to the sort of first-person-perspective games, like those aforementioned vehicular simulators and some British action-adventures, which couldn’t avoid using them. But Chahi would buck the trend by using them for his own third-person-perspective game. Their unique affordances and limitations would stamp Another World just as much as its creator’s own personality, giving the game’s environments the haunting, angular vagueness of a dream landscape. The effect is further enhanced by Chahi’s use of a muted, almost pastel palette of just 16 colors and an evocative, minimalist score by Jean-François Freitas — the only part of the game that wasn’t created by Chahi himself. Although you’re constantly threatened with death — and, indeed, will die over and over in the course of puzzling your way through the game — it all operates on the level of impression rather than reality.

According to some theories of visual art, the line between merely duplicating reality and conveying impressions of reality is the one that separates the draftsman from the artist. If so, Another World‘s visuals betray an aesthetic sophistication rarely seen in computer games of its era. While other games strained to portray violence with ever more realism, Another World went another way entirely, creating an affect that’s difficult to put into words — a quality which is itself another telltale sign of Art. Chahi:

Polygon techniques are great for animation, but the price you pay is the lack of detail. Because I couldn’t include much detail, I decided to work with the player’s imagination, creating suggestive content instead of being highly descriptive. That’s why, for example, the beast in the first scene is impressive even if it is only a big black shape. The visual style of Another World is really descended from the black-and-white comic-book style, where shape and volume are suggested in a very subtle way. By doing Another World, I learned a lot about suggestion. I learned that the medium is the player’s own imagination.

To make his suggestive rather than realistic graphics, Chahi spent much time first making tools, beginning with an editor written in a variant of BASIC. The editor’s output was then rendered in the game in assembly language for the sake of speed, with the logic of it all controlled using a custom script language of Chahi’s own devising. This approach would prove a godsend when it came time to port the game to platforms other than the Amiga; a would-be porter merely had to recreate the rendering engine on a new platform, making it capable of interpreting Chahi’s original polygonal-graphics data and scripts. Thus Another World was, in addition to being a game, actually a new cross-platform game engine as well, albeit one that would only be used for a single title.
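The portability payoff of Chahi's architecture comes from a clean split between data and machine-specific code, which can be sketched as follows. The opcodes and structure here are invented for illustration and bear no relation to Chahi's actual script format; the point is only the division of labor:

```python
# Hedged sketch (opcodes and structure invented, not Chahi's actual format)
# of the architecture described above: game logic lives in portable script
# data, while only a small render layer must be rewritten per platform.

# Portable data: a "script" as a list of (opcode, argument) commands.
script = [
    ("set_palette", 3),
    ("draw_polygon", [(10, 10), (50, 10), (30, 40)]),
    ("wait_frames", 2),
]

class Renderer:
    """Platform-specific layer: reimplement this per machine (Amiga, ST, PC...)."""
    def __init__(self):
        self.log = []
    def set_palette(self, index):
        self.log.append(f"palette {index}")
    def draw_polygon(self, points):
        self.log.append(f"poly with {len(points)} vertices")
    def wait_frames(self, n):
        self.log.append(f"wait {n}")

def run(script, renderer):
    """Portable interpreter: dispatch each scripted command to the renderer."""
    for opcode, arg in script:
        getattr(renderer, opcode)(arg)

r = Renderer()
run(script, r)
```

Port the small Renderer class and the entire game comes along for free — which is exactly why recreating the rendering engine was all a would-be porter of Another World had to do.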

Some of the graphics had their point of origin in the real world, having been captured using a long-established animation technique known as rotoscoping: tracing the outlines, frame by frame, of real people or objects filmed in motion, to form the basis of their animated equivalents. Regular readers of this blog may recall that Jordan Mechner used the same technique as far back as 1983 to create the characters in his cinematic karate game Karateka. Yet the differences between the two young developers’ approaches to the technique say much about the march of technology between 1983 and 1989.

Mechner shot his source footage on real film, then used a mechanical Moviola editing machine, a staple of conventional filmmakers for decades, to isolate and make prints of every third frame of the footage. He then traced these prints into his Apple II using an early drawing pad called a VersaWriter.

Chahi’s Amiga allowed a different approach. It had been developed during the brief heyday of laser-disc games in arcades. These often worked by overlaying interactive computer-generated graphics onto static video footage unspooling from the laser disc itself. Wishing to give their new computer the potential to play similar games in the home with the addition of an optional laser-disc player, the designers of the Amiga built into the machine’s graphics chips a way of overlaying the display onto other video; one color of the onscreen palette could be defined as transparent, allowing whatever video lay “below” it to peek through. The imagined laser-disc accessory would never appear due to issues of cost and practicality, but, in a classic example of an unanticipated technological side-effect, this capability combined with the Amiga’s excellent graphics in general made it a wonderful video-production workstation, able to blend digital titles and all sorts of special effects with the analog video sources that still dominated during the era. Indeed, the emerging field of “desktop video” became by far the Amiga’s most sustained and successful niche outside of games.

The same capability now simplified the process of rotoscoping dramatically for Chahi in comparison to what Mechner had been forced to do. He shot video footage of himself on an ordinary camcorder, then played it back on a VCR with single-frame stop capability. His Amiga was connected to the same television as the VCR. Chahi could thus trace the images directly from video into his Amiga, without having to fuss with prints at all.

It wasn’t until months into the development of Another World that a real game, and with it a story of sorts, began to emerge from this primordial soup of graphics technology. Chahi made a lengthy cut scene, rendered, like all of the ones that would follow, using the same graphics engine as the game’s interactive portions for the sake of aesthetic consistency. The entire scene, lasting some two and a half minutes, used just 70 K of disk space thanks to the magic of polygonal graphics. In it, the player’s avatar, a physicist named Lester Cheykin, shows up at his laboratory for a night of research, only to be sucked into his own experiment and literally plunged into another world; he emerges underwater, just a few meters above some vicious plant life eager to make a meal out of him. The player’s first task, then, is to hastily swim to the surface, and the game proper gets underway. The story that follows, such as it is, is one of more desperate escapes from the flora and fauna of this new world, including an intelligent race that don’t like Lester any more than their less intelligent counterparts. Importantly, neither the player nor Lester ever learns precisely where he is — another planet? another dimension? — or why the people that live there — we’ll just call them the “aliens” from now on for simplicity’s sake — want to kill him.

True to the spirit of the kid who found the look of Star Wars more interesting than the plot, the game is constructed with a filmmaker’s eye toward aesthetic composition rather than conventional narrative. After the opening cut scene, the whole game contains not one word devoted to dialog, exposition, or anything else until “The End” appears, excepting only grunts and muffled exclamations made in an alien language you can’t understand. All of Chahi’s efforts were poured into the visual set-pieces, which are consistently striking and surprising, often with multiple layers of action.

Chahi:

I wanted to create a truly immersive game in a very consistent, living universe with a movie feel. I never wanted to create an interactive movie itself. Instead I wanted to extract the essence of a movie — the rhythm and the drama — and place it into game form. To do this I decided to leave the screen free of the usual information aids like an energy bar, score counter, and other icons. Everything had to be in the universe, with no interruptions getting in the way.

Midway through the game, you encounter a friend, an alien who’s been imprisoned — for reasons that, needless to say, are never explained — by the same group who are out to get you. The two of you join forces, helping one another through the rest of the story. Your bond of friendship is masterfully conveyed without using words, relying on the same impressionistic visuals as everything else. The final scene, where the fellow Chahi came to call “Buddy” gently lifts an exhausted Lester onto the back of a strange winged creature and they fly away together, is one of the more transcendent in videogame history, a beautiful closing grace note that leaves you with a lump in your throat. Note the agonizingly slow pace of the snippet below, contrasted with the frenetic pace of the one above. When Chahi speaks about trying to capture the rhythm of a great movie, this is what he means.

For its creator, the ending had another special resonance. When implementing the final scene, two years after retiring into his parents’ basement, Chahi himself felt much like poor exhausted Lester, crawling toward the finish line.

But, you might ask, what has the player spent all of the time between the ominous opening cut scene and the transcendent final one actually doing? In some ways, that’s the least interesting aspect of Another World. The game is at bottom a platforming action-adventure, with a heavy emphasis on the action. Each scene is a challenge to be tackled in two phases: first, you have to figure out what Chahi wants you to do in order to get through its monsters, tricks, and traps; then, you have to execute it all with split-second precision. It’s not particularly easy. The idealized perfect player can make a perfect run through Another World, including watching all of the cut scenes, in half an hour. Imperfect real-world players, on the other hand, can expect to watch Lester die over and over as they slowly blunder their way through the game. At least you’re usually allowed to pick up pretty close to where you left off when Lester dies — because, trust me, he will die, and often.

When we begin to talk of influences and points of comparison for Another World inside the realm of games, one name inevitably leaps to mind first. I already mentioned Jordan Mechner in the context of his own work with rotoscoping, but that’s only the tip of an iceberg of similarities between Another World and his two famous games, Karateka and Prince of Persia. He was another young man with a cinematic eye, more interested in translating the “rhythm and drama” of film to an interactive medium than he was in making “interactive movies” in the sense that his industry at large tended to understand that term. Indeed, Chahi has named Karateka as perhaps the most important ludic influence on Another World, and if anything the parallels between the latter and Prince of Persia are even stronger: both were the virtually single-handed creations of their young auteurs; both largely eschew text in favor of visual storytelling; both clear their screen of score markers and other status indicators in the name of focusing on what’s really important; both are brutally difficult platformers; both can be, because of that brutal difficulty, almost more fun to watch someone else play than they are to play yourself, at least for those of us who aren’t connoisseurs of their try-and-try-again approach to game design.

Still, for all the similarities, nobody is ever likely to mistake Prince of Persia for Another World. Much of the difference must come down to — to engage in yet more crude national stereotyping — the fact that one game is indisputably American, the other very, very French. Mechner, who has vacillated between a career as a game-maker and a filmmaker throughout his life, wrote his movie scripts in the accessible, family-friendly tradition of Steven Spielberg, his favorite director, and brought the same sensibility to his games. But Chahi’s Another World has, as we’ve seen, the sensibility of an art film more so than a blockbuster. The two works together stand as a stark testament to the way that things which are so superficially similar in art can actually be so dramatically different.

A mentally and physically drained Éric Chahi crawled the final few feet into Delphine’s offices to deliver the finished Another World in late 1991. His final task was to paint the cover art for the box, a last step in the cementing of the game as a deeply personal expression in what was already becoming known as a rather impersonal medium. It was released in Europe before the end of the year, whereupon it became a major, immediate hit for reasons that, truth be told, probably had little to do with its more emotionally resonant qualities: in a market that thrived on novelty, it looked like absolutely nothing else. That alone was enough to drive sales, but in time at least some of the young videogame freaks who purchased it found in it something they’d never bargained for: the ineffable magic of a close encounter with real Art. Memories of those feelings continue to make it a perennial today whenever people of a certain age draw up lists of their favorite games.

Delphine had an established relationship with Interplay as their American publisher. The latter were certainly intrigued by Chahi’s creation, but seemed a little nonplussed by its odd texture. They thus lobbied him for permission to replace its evocative silences, which were only occasionally broken up by Jean-François Freitas’s haunting score, with a more conventional thumping videogame soundtrack. Chahi was decidedly opposed, to the extent of sending Interplay’s offices an “infinite fax” repeating the same sentence again and again: “Keep the original music!” Thankfully, they finally agreed to do so, although conflicts with a long-running daytime soap opera which was also known as Another World did force them to change the name of the game in the United States to the more gung-ho-sounding Out of This World. But on the positive side, they put the game through the rigorous testing process the airy-fairy artistes at Delphine couldn’t be bothered with, forcing Chahi to fix hundreds of major and minor bugs and unquestionably turning it into a far tighter, more polished experience.

I remember Out of This World’s 1992 arrival in the United States with unusual vividness. I was still an Amiga loyalist at the time, even as the platform’s star was all too obviously fading in my country. It will always remain imprinted on my memory as the last “showpiece” Amiga game I encountered, the last time I wanted to call others into the room and tell them to “look at this!” — the last of a long line of such showpieces that had begun with Defender of the Crown back in 1986. For me, then, it marked the end of an era in my life. Shortly thereafter, my once-beloved old Amiga got unceremoniously dumped into the closet, and I didn’t have much to do with computers at all for the next two or three years.

But Interplay, of course, wasn’t thinking of endings when the Amiga version of Out of This World was greeted with warm reviews in the few American magazines still covering Amiga games. Computer Gaming World called the now-iconic introductory cut scene “one of the most imaginative pieces of non-interactive storytelling ever associated with a computer game” — a description which might almost, come to think of it, be applied to the game as a whole, depending on how broad your definition of “interactive storytelling” is willing to be. Reviewers did note that the game was awfully short, however, prompting Interplay to cajole the exhausted Chahi into making one more scene for the much-anticipated MS-DOS port. This he duly did, diluting the concentrated experience that was the original version only moderately in the process.

The game was ported to many more platforms in the years that followed, including to consoles like the Super Nintendo and Sega Genesis, eventually even to iOS and Android in the form of a “20th Anniversary Edition.” Chahi estimates that it sold some 1 million copies in all during the 1990s alone. He made the mistake of authorizing Interplay to make a sequel called Heart of the Alien for the Sega CD game console in 1994, albeit with the typically artsy stipulation that it must be told from the point of view of Buddy. The results were so underwhelming that he regrets the decision to this day, and has resisted all further calls to make or authorize sequels. Instead he’s worked on other games over the years, but only intermittently, mixing his work in games with a range of other pursuits such as volcanology, photography, and painting. His ludography remains tiny — another trait, come to think of it, that he shares with Jordan Mechner — and he is still best known by far for Another World, which is perhaps just as well; it’s still his own personal favorite of his games. It remains today a touchstone for a certain school of indie game developers in particular, who continue to find inspiration in its artsy, affective simplicity.

In fact, Another World raises some interesting questions about the very nature of games. Is it possible for a game that’s actually not all that great at all in terms of mechanics and interactivity to nevertheless be a proverbial great game in some more holistic sense? The brilliant strategy-game designer Sid Meier has famously called a good game “a series of interesting decisions.” Another World resoundingly fails to meet this standard of ludic goodness. In it, you the player have virtually no real decisions to make at all; your task is rather to figure out the decisions which Éric Chahi has already made for Lester, and thereby to advance him to the next scene. Of course, the Sid Meier definition of gaming goodness can be used to criticize plenty of other games — even other entire game genres. Certainly most adventure games as well are largely exercises in figuring out the puzzle solutions the author has already set in place. Yet even they generally offer a modicum of flexibility, a certain scope for exploration in, if nothing else, the order in which you approach the puzzles. Another World, on the other hand, allows little more scope for exploration or improvisation than the famously straitjacketed Dragon’s Lair — which is, as it happens, another game Chahi has listed as an inspiration. Winning Dragon’s Lair entails nothing more nor less than making just the right pre-determined motions with the controller at just the right points in the course of watching a static video clip. In Another World, Lester is at least visibly responsive to your commands, but, again, anything but the exactly right commands, executed with perfect precision, just gets him killed and sends you back to the last checkpoint to try again.

So, for all that it’s lovely and moving to look at, does Another World really have any right to be a game at all? Might it not work better as an animated short? Or, to frame the question more positively, what is it about the interactivity of Another World that actually adds to the audiovisual experience? Éric Chahi, for his part, makes a case for his game using a very different criterion from that of Meier’s “interesting decisions”:

It’s true that Another World is difficult. When I played it a year ago, I discovered how frustrating it can be sometimes — and breathtaking at the same time. The trial-and-error doesn’t disturb me, though. Another World is a game of survival on a hostile world, and it really is about life and death. Death doesn’t mean the end of the game, but it is a part of the exploration, a part of the experience. That’s why the death sequences are so diversified. To solve many puzzles, I recognize that you have to die at least once, and this certainly isn’t the philosophy of today’s game design. It is a controversial point in Another World’s design because it truly serves the emotional side of things and the player’s attachment to the characters, but it sometimes has a detrimental effect on the gameplay. Because of this, Another World must be considered first as an intense emotional experience.

Personally, I’m skeptical of whether deliberately frustrating the player, even in the name of artistic affect, is ever a good design strategy, and I must confess that I remain in the camp of players who would rather watch Another World than try to struggle through it on their own. Yet there’s no question that Éric Chahi’s best-remembered game does indeed deserve to be remembered for its rare aesthetic sophistication, and for stimulating emotional responses that go way beyond the typical action-game palette of anger and fear. While there is certainly room for “interesting decisions” in games — and perhaps a few of them might not have gone amiss in Another World itself — games ought to be able to make us feel as well. This lesson of Another World is one every game designer can stand to profit from.

(Sources: the book Principles of Three-Dimensional Computer Animation: Modeling, Rendering, and Animating with 3D Computer Graphics by Michael O’Rourke; Computer Gaming World of August 1992; Game Developer of November 2011; Questbusters of June/July 1992; The One of October 1991 and October 1992; Zero of November 1991; Retro Gamer 24 and 158; Amiga Format 1992 annual; bonus materials included with the 20th Anniversary edition of Another World; an interview with Éric Chahi conducted for the film From Bedrooms to Billions: The Amiga Years; Chahi’s postmortem talk about the game at the 2011 Game Developers Conference; “How ‘French Touch’ Gave Early Videogames Art, Brains” from Wired; “The Eccentricities of Eric Chahi” from Eurogamer. The cut-scene and gameplay footage in the article is taken from a World of Longplays YouTube video.

Another World is available for purchase on GOG.com in a 20th Anniversary Edition with lots of bonus content.)


  1. More modern polygonal-graphics implementations do make use of something called splines to allow for curvature, but these weren’t practical to implement using 1980s and early 1990s computers. 

  2. Again, the state of the art in polygonal graphics is much different in this area today than it was in Another World’s time. Today textures are mapped onto polygonal surfaces to create a more realistic appearance, and scenes are illuminated by light sources that produce realistic shadings and shadows across the whole. But all of this was hopelessly far beyond what Chahi or anyone else of Another World’s era could hope to implement in a game which needed to be interactive and to run at a reasonable speed. 

 


The View from the Trenches (or, Some Deadly Sins of CRPG Design)

From the beginning of this project, I’ve worked to remove the nostalgia factor from my writing about old games, to evaluate each game strictly on its own merits and demerits. I like to think that this approach has made my blog a uniquely enlightening window into gaming history. Still, one thing my years as a digital antiquarian have taught me is that you tread on people’s nostalgia at your peril. Some of what I’ve written here over the years has certainly generated its share of heat as well as light, not so much among those of you who are regular readers and commenters — you remain the most polite, thoughtful, insightful, and just plain nice readers any writer could hope to have — as among the ones who fire off nasty emails from anonymous addresses, who post screeds on less polite sites to which I’m occasionally pointed, or who offer up their drive-by comments right here every once in a while.

A common theme of these responses is that I’m not worthy of writing about this stuff, whether because I wasn’t there at the time — actually, I was, but whatever — or because I’m just not man enough to take my lumps and power through the really evil, unfair games. This rhetoric of inclusion and exclusion is all too symptomatic of the uglier sides of gaming culture. Just why so many angry, intolerant personalities are so attracted to computer games is a fascinating question, but must remain a question for another day. For today I will just say that, even aside from their ugliness, I find such sentiments strange. As far as I know, there’s zero street cred to be gained in the wider culture from being good at playing weird old videogames — or for that matter from being good at playing videogames of any stripe. What an odd thing to construct a public persona around. I’ve made a job out of analyzing old games, and even I sometimes want to say, “Dude, they’re just old games! Really, truly, they’re not worth getting so worked up over.”

That said, there do remain some rays of light amidst all this heat. It’s true that my experience of these games today — of playing them in a window on this giant monitor screen of mine, or playing them on the go on a laptop — must be in some fairly fundamental ways different from the way the same games were experienced all those years ago. One thing that gets obviously lost is the tactile, analog side of the vintage experience: handling the physical maps and manuals and packages (I now reference that stuff as PDF files, which isn’t quite the same); drawing maps and taking notes using real pen and paper (I now keep programs open in separate windows on that aforementioned giant monitor for those purposes); listening to the chuck-a-chunk of disk drives loading in the next bit of text or scenery (replacing the joy of anticipation is the instant response of my modern supercomputer). When I allow myself to put on my own nostalgia hat, just for a little while, I recognize that all these things are intimately bound up with my own memories of playing games back in the day.

And I also recognize that the discrepancies between the way I play now and the way I played back then go even further. Some of the most treasured of vintage games weren’t so much single works to be played and completed as veritable lifestyle choices. Ultima IV, to name a classic example, was huge enough and complicated enough that a kid who got it for Christmas in 1985 might very well still be playing it by the time Ultima V arrived in 1988; rinse and repeat for the next few entries in the series. From my jaded perspective, I wouldn’t brand any of these massive CRPGs as overly well-designed in the sense of being reasonably soluble games to be completed in a reasonable amount of time, but then that wasn’t quite what most of the people who played them way back when were looking for in them. Actually solving the games became almost irrelevant for a kid who wanted to live in the world of Britannia.

I get that. I really do. No matter how deep a traveler in virtual time delves into the details of any era of history, there are some things he can never truly recapture. Were I to try, I would have to go away to spend a year or two disconnected from the Web and playing no other game — or at least no other CRPG — than the Ultima I planned to write about next. That, as I hope you can all appreciate, wouldn’t be a very good model for a blog like this one.

When I think in the abstract about this journey through gaming history I’ve been on for so long now, I realize that I’ve been trying to tell at least three intertwining stories.

One story is a critical design history of games. When I come to a game I judge worthy of taking the time to write about in depth — a judgment call that only becomes harder with every passing year, let me tell you — I play it and offer you my thoughts on it, trying to judge it not only in the context of our times but also in the context of its own times, and in the context of its peers.

A second story is that of the people who made these games, and how they went about doing so — the inevitable postmortems, as it were.

Doing these first two things is relatively easy. What’s harder is the third leg of the stool: what was it like to be a player of computer games all those years ago? Sometimes I stumble upon great anecdotes in this area. For instance, did you know about Clancy Shaffer?

In impersonal terms, Shaffer was one of the slightly dimmer stars among the constellation of adventure-game superfans — think Roe Adams III, Shay Addams, Computer Gaming World‘s indomitable Scorpia — who parlayed their love of the genre and their talent for solving games quickly into profitable sidelines if not full-on careers as columnists, commentators, play-testers, occasionally even design consultants; for his part, Shaffer contributed his long experience as a player to the much-loved Sir-Tech title Jagged Alliance.

Most of the many people who talked with Shaffer via post, via email, or via telephone assumed he was pretty much like them, an enthusiastic gamer and technology geek in his twenties or thirties. One of these folks, Rich Heimlich, has told of a time when a phone conversation turned to the future of computer technology in the longer view. “Frankly,” said Shaffer, “I’m not sure I’ll even be here to see it.” He was, he explained to his stunned interlocutor, 84 years old. He credited his hobby for the mental dexterity that caused so many to assume he was in his thirties at the oldest. Shaffer believed he had remained mentally sharp through puzzling his way through so many games, while he need only look at the schedule of upcoming releases in a magazine to have something to look forward to in life. Many of his friends who, like him, had retired twenty years ago were dead or senile, a situation Shaffer blamed on their having failed to find anything to do with themselves after leaving the working world behind.

Shaffer died in 2010 at age 99. Only after his passing, after reading his obituary, did Heimlich and other old computer-game buddies realize what an extraordinary life Shaffer had actually led, encompassing an education from Harvard University, a long career in construction and building management, 18 patents in construction engineering, an active leadership role in the Republican party, a Golden Gloves championship in heavyweight boxing, and a long and successful run as a yacht racer and sailor of the world’s oceans. And yes, he had also loved to play computer games, parlaying that passion into more than 500 published articles.

But great anecdotes like this one from the consumption side of the gaming equation are the exception rather than the rule, not because they aren’t out there in spades in theory — I’m sure there have been plenty of other fascinating characters like Clancy Shaffer who have also made a passion for games a part of their lives — but because they rarely get publicized. The story of the players of vintage computer games is that of a huge, diffuse mass of millions of people whose individual stories almost never stretch beyond their immediate families and friends.

The situation becomes especially fraught when we try to zero in on the nitty-gritty details of how games were played and judged in their day. Am I as completely out of line as some have accused me of being in harping so relentlessly on the real or alleged design problems of so many games that others consider to be classics? Or did people back in the day, at least some of them, also get frustrated and downright angry at betrayals of their trust in the form of illogical puzzles and boring busywork? I know that I certainly did, but I’m only one data point.

One would think that the magazines, that primary link between the people who made games and those who played them, would be the best way of finding out what players were really thinking. In truth, though, the magazines rarely provided skeptical coverage of the games industry. The companies whose games they were reviewing were of course the very same companies that were helping to pay their bills by buying advertising — an obvious conflict of interest if ever there was one. More abstractly but no less significantly, there was a sense among those who worked for the magazines and those who worked for the game publishers that they were all in this together, living as they all were off the same hobby. Criticizing individual games too harshly, much less entire genres, could damage that hobby, ultimately damaging the magazines as much as the publishers. Thus when the latest heavily hyped King’s Quest came down the pipe, littered with that series’s usual design flaws, there was little incentive for the magazines to note that this monarch had no clothes.

So, we must look elsewhere to find out what average players were really thinking. But where? Most of the day-to-day discussions among gamers back in the day took place over the telephone, on school playgrounds, on computer bulletin boards, or on the early commercial online services that preceded the World Wide Web. While Jason Scott has done great work snarfing up a tiny piece of the online world of the 1980s and early 1990s, most of it is lost, presumably forever. (In this sense at least, historians of later eras of gaming history will have an easier time of it, thanks to archive.org and the relative permanence of the Internet.) The problem of capturing gaming as gamers knew it thus remains one without a comprehensive solution. I must confess that this is one reason I’m always happy when you, my readers, share your experiences with this or that game in the comments section — even, or perhaps especially, when you disagree with my own judgments on a game.

Still, relying exclusively on first-hand accounts from decades later to capture what it was like to be a gamer in the old days can be problematic in the same way that it can be problematic to rely exclusively on interviews with game developers to capture how and why games were made all those years ago: memories can fade, personal agendas can intrude, and those rose-colored glasses of nostalgia can be hard to take off. Pretty soon we’re calling every game from our adolescence a masterpiece and dumping on the brain-dead games played by all those stupid kids today — and get off my lawn while you’re at it. The golden age of gaming, like the golden age of science fiction, will always be twelve or somewhere thereabouts. All that’s fine for hoisting a beer with the other old-timers, but it can be worse than useless for doing serious history.

Thankfully, every once in a while I stumble upon another sort of cracked window into this aspect of gaming’s past. As many of you know, I’ve spent a couple of weeks over the last couple of years trolling through the voluminous (and growing) game-history archives of the Strong Museum of Play. Most of this material, hugely valuable to me though it’s been and will doubtless continue to be, focuses on the game-making side of the equation. Some of the archives, though, contain letters from actual players, giving that unvarnished glimpse into their world that I so crave. Indeed, these letters are among my favorite things in the archives. They are, first of all, great fun. The ones from the youngsters are often absurdly cute; it’s amazing how many liked to draw pictures to accompany their missives.

But it’s when I turn to the letters from older writers that I’m gratified and, yes, made to feel a little validated when I read that people were in fact noticing that games weren’t always playing fair with them. I’d like to share a couple of the more interesting letters of this type with you today.

We’ll begin with a letter from one Wes Irby of Plano, Texas, describing what he does and especially what he doesn’t enjoy in CRPGs. At the time he sent it to the Questbusters adventure-game newsletter in October of 1988, Irby was a self-described “grizzled computer adventurer” of age 43. Shay Addams, Questbusters’s editor, found the letter worthy enough to spread around among publishers of CRPGs. (Perhaps tellingly, he didn’t choose to publish it in his newsletter.)

Irby titles his missive “Things I Hate in a Fantasy-Role-Playing Game.” Taken on its own, it serves very well as a companion piece to a similar article I once wrote about graphic adventures. But because I just can’t shut up, and because I can’t resist taking the opportunity to point out places where Irby is unusually prescient or insightful, I’ve inserted my own comments into the piece; they appear in italics in the text that follows. Otherwise, I’ve only cleaned up the punctuation and spelling a bit here and there. The rest is Irby’s original letter from 1988.


I hate rat killing!!! In Shard of Spring, I had to kill dozens of rats, snakes, kobolds, and bats before I could get back to the tower after a Wind Walk to safety. In Wizardry, the rats were Murphy’s ghosts, which I pummeled for hours when developing a new character. Ultima IV was perhaps the ultimate rat-killing game of all time; hour upon hour was spent in tedious little battles that I could not possibly lose and that offered little reward for victory. Give me a good battle to test my mettle, but don’t sentence me to rat killing!

Amen. The CRPG genre became the victim of an expectation which took hold early on that the games needed to be really, really long, needed to consume dozens if not hundreds of hours, in order for players to get their money’s worth. With disk space precious and memory space even more so on the computers of the era, developers had to pad out their games with a constant stream of cheap low-stakes random encounters to reach that goal. Amidst the other Interplay materials hosted at the Strong archive are several mentions of a version of Wasteland, prepared specially for testers in a hurry, in which the random encounters were left out entirely. That’s the version of Wasteland I’d like to play.

I hate being stuck!!! I enjoy the puzzles, riddles, and quests as a way to give some story line to the real heart of the game, which is killing bad guys. Just don’t give me any puzzles I can’t solve in a couple of hours. I solved Rubik’s Cube in about thirty hours, and that was nothing compared to some of the puzzles in The Destiny Knight. The last riddle in Knight of Diamonds delayed my completion (and purchase of the sequel) for nearly six months, until I made a call to Sir-Tech.

I haven’t discussed the issue of bad puzzle design in CRPGs to the same extent as I have the same issue in adventure games, but suffice to say that just about everything I’ve written in the one context applies equally in the other. Certainly riddles remain among the laziest — they require almost no programming effort to implement — and most problematic — they rely by definition on intuition and external cultural knowledge — forms of puzzle in either genre. Riddles aren’t puzzles at all really; the answer either pops into your head right away or it doesn’t, meaning the riddle turns into either a triviality or a brick wall. A good puzzle, by contrast, is one you can experiment with on your way to the correct solution. And as for the puzzles in The Bard’s Tale II: The Destiny Knight… much more on them a little later.

Perhaps the worst aspect of being stuck is the clue-book dilemma. Buying a clue book is demeaning. In addition, buying clue books could encourage impossible puzzles to boost the aftermarket for clue books. I am a reformed game pirate (that is how I got hooked), and I feel it is just as unfair for a company to charge me to finish the game I bought as it was for me to play the games (years ago) without paying for them. Multiple solutions, a la Might and Magic, are very nice. That game also had the desirable feature of allowing you to work on several things simultaneously so that being stuck on one didn’t bring the whole game to a standstill.

Here Irby brings up an idea I’ve also touched on once or twice: that the very worst examples of bad design can be read as not just good-faith disappointments but actual ethical lapses on the part of developers and publishers. Does selling consumers a game with puzzles that are insoluble except through hacking or the most tedious sort of brute-force approaches equate to breaching good faith by knowingly selling them a defective product? I tend to feel that it does.

As part of the same debate, the omnipresent clue books became a locus of much dark speculation and conspiracy theorizing back in the day. Did publishers, as Irby suggests, intentionally release games that couldn’t be solved without buying the clue book, thereby picking up additional sales? The profit margins on clue books, not incidentally, tended to be much higher than those enjoyed by the games themselves. Still, the answer is more complicated than the question might suggest. Based on my research into the industry of the time, I don’t believe that any publishers or developers made insoluble games with the articulated motive of driving clue-book sales. To the extent that there was an ulterior motive surrounding the subject of clue books, it was the hope that the clue books would allow publishers to make money off some of the people who pirated their games. (Rumors — almost certainly false, but telling by their very presence — occasionally swirled around the industry about this or that popular title whose clue-book sales had allegedly outstripped the number of copies of the actual game which had been sold.) Yet the fact remains that even the hope of using clue books as a way of getting money out of pirates required games difficult enough to cause many pirates to go out and buy the book. The human mind is a funny place, and the clue-book business likely did create certain almost unconscious pressures on game designers to design less soluble games.

I hate no-fault life insurance! If there is no penalty, there is no risk, there is no fear — translate that to no excitement. The adrenaline actually surged a few times during play of the Wizardry series when I encountered a group of monsters that might defeat me. In Bard’s Tale II, death was so painless that I committed suicide several times because it was the most expedient way to return to the Adventurer’s Guild.

When you take the risk of loss out of the game, it might as well be a crossword puzzle. The loss of possessions in Ultima IV and the loss of constitution in Might and Magic were tolerable compromises. The undead status in Phantasie was very nice. Your character was unharmed except for the fact that no further advancement was possible. Penalties can be too severe, of course. In Shard of Spring, loss of one battle means all characters are permanently lost. Too tough.

Here Irby hits on one of the most fraught debates in CRPG design, stretching from the days of the original Wizardry to today: what should be the penalty for failure? There’s no question that the fact that you couldn’t save in the dungeon was one of the defining aspects of Wizardry, the game that did more than any other to popularize the budding genre in the very early 1980s. Exultant stories of escaping the dreaded Total Party Loss by the skin of one’s teeth come up again and again when you read about the game. Andrew Greenberg and Bob Woodhead, the designers of Wizardry, took a hard-line stance on the issue, insisting that the lack of an in-dungeon save function was fundamental to an experience they had carefully crafted. They went so far as to issue legal threats against third-party utilities designed to mitigate the danger.

Over time, though, the mainstream CRPG industry moved toward the save-often, save-anywhere model, leaving Wizardry’s approach only to a hardcore sub-genre known as roguelikes. It seems clear that the change had some negative effects on encounter design; designers, assuming that players were indeed saving often and saving everywhere, felt they could afford to worry less about hitting players with impossible fights. Yet it also seems clear that many or most players, given the choice, would happily forgo the exhilaration of Wizardry’s narrow escapes in favor of avoiding the consequences of the disasters that couldn’t be escaped. The best solution, it seems to me, is to make limited or unlimited saving a player-selectable option. Failing that, it strikes me as better to err on the side of generosity; after all, hardcore players can still capture the exhilaration and anguish of an iron-man mode by simply imposing their own rules for when they allow themselves to save. All that said, the debate will doubtless continue to rage.

I hate being victimized. Loss of life, liberty, etc., in a situation I could have avoided through skillful play is quite different from a capricious, unavoidable loss. The Amulet of Skill in Knight of Diamonds was one such situation. It was not reasonable to expect me to fail to try the artifacts I found — a fact I soon remedied with my backup disk!!! The surprise attacks of the mages in Wizardry were another such example. Each of the Wizardry series seems to have one of these, but the worst was the teleportation trap on the top level of Wizardry III, which permanently encased my best party in stone.

Beyond rather putting the lie to some of Greenberg and Woodhead’s claims of having exhaustively balanced the Wizardry games, these criticisms again echo those I’ve made in the context of adventure games. Irby’s examples are the CRPG equivalents of the dreaded adventure-game Room of Sudden Death — except that in CRPGs like Wizardry with perma-death, their consequences are much more dire than just having to go back to your last save.

I hate extraordinary characters! If everyone is extraordinary then extraordinary becomes extra (extremely) ordinary and uninteresting. The characters in Ultima III and IV and Bard’s Tale I and II all had the maximum ratings for all stats before the end of the game. They lose their personalities that way.

This is one of Irby’s subtler complaints, but also I think one of his most insightful. Characters in CRPGs are made interesting, as he points out, through a combination of strengths and weaknesses. I spent considerable time in a recent article describing how the design standards of SSI’s “Gold Box” series of licensed Dungeons & Dragons CRPGs declined over time, but couldn’t find a place for the example of Pools of Darkness, the fourth and last game in the series that began with Pool of Radiance. Most of the fights in Pools of Darkness are effectively unwinnable if you don’t have “extraordinary” characters, in that they come down to quick-draw contests to find out whether your party or the monsters can fire off devastating area-effect magic first. Your entire party needs to have a maxed-out dexterity score of 18 to hope to consistently survive these battles. Pools of Darkness thus rewards cheaters and punishes honest players; it represents a cruel betrayal of players who had played through the entire series honestly to that point, without availing themselves of character editors or the like. CRPGs should strive not to make the extraordinary ordinary, and they should certainly not demand extraordinary characters that the player can only come by through cheating.

There are several more features which I find undesirable, but are not sufficiently irritating to put them in the “I hate” category. One such feature is the inability to save the game in certain places or situations. It is miserable to find yourself in a spot you can’t get out of (or don’t want to leave because of the difficulty in returning) at midnight (real time). I have continued through the wee hours on occasion, much to my regret the next day. At other times it has gotten so bad I have dozed off at the keyboard. The trek from the surface to the final set of riddles in Ultima IV takes nearly four hours. Without the ability to save along the way, this doesn’t make for good after-dinner entertainment. Some of the forays in the Phantasie series are also long and difficult, with no provision to save. This problem is compounded when you have an old machine like mine that locks up periodically. Depending on the weather and the phase of the moon, sometimes I can’t rely on sessions that average over half an hour.

There’s an interesting conflict here, one which I sense the usually insightful Irby may not have fully grasped, between his demand that death have consequences in CRPGs and his belief that he should be able to save anywhere. At the same time, though, it’s not an irreconcilable conflict. Roguelikes have traditionally made it possible to save anywhere by quitting the game, but they immediately delete the save when you start to play again, thus making it impossible to use later on as a fallback position.
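The roguelike scheme (save only when quitting, consume the save on resume) can be sketched in a few lines of Python. This is an illustrative reconstruction rather than any particular game's code; the save-file name and the state fields are hypothetical.

```python
import json
import os

SAVE_FILE = "quicksave.json"  # hypothetical save location


def save_and_quit(state):
    """Write the game state to disk. In a real roguelike this would be
    followed immediately by exiting the program, making quitting the
    only way to save."""
    with open(SAVE_FILE, "w") as f:
        json.dump(state, f)


def resume_or_start():
    """On launch, consume the save if one exists, deleting it at once so
    it can never be reloaded as a fallback after a disaster."""
    if os.path.exists(SAVE_FILE):
        with open(SAVE_FILE) as f:
            state = json.load(f)
        os.remove(SAVE_FILE)  # the crucial step: the save exists only once
        return state
    return {"hp": 10, "level": 1}  # otherwise, a fresh character


# A quit-and-resume cycle restores the old state exactly once;
# a second launch finds no save and starts a fresh character.
save_and_quit({"hp": 7, "level": 3})
print(resume_or_start())
print(resume_or_start())
```

The design's whole point is that the save file is a suspend mechanism, not a safety net: deleting it on load closes the loophole of "saving anywhere" while preserving the permanence of death.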

Still, it should always raise a red flag when a given game’s designers claim that something which just happens to have been the easier choice from a technical perspective was in fact a considered design decision. This skepticism should definitely be applied to Wizardry. Were the no-save dungeons that were such an integral part of the Wizardry experience really a considered design choice or a (happy?) accident arising from technical limitations? It’s very difficult to say this many years on. What is clear is that saving state in any sort of comprehensive way was a daunting challenge for 8-bit CRPGs spread over multiple disk sides. Wizardry and The Bard’s Tale didn’t really even bother to try; literally the only persistent data in these games and many others like them is the state of your characters, meaning not only that the dungeons are completely reset every time you enter them but that it’s possible to “win” them over and over again by killing the miraculously resurrected big baddie again and again. The 8-bit Ultima games did a little better, saving the state of the world map but not that of the cities or the dungeons. (I’ve nitpicked the extreme cruelty of Ultima IV’s ending, which Irby also references, enough on earlier occasions that I won’t belabor it any more here.) Only quite late in the day for the 8-bit CRPG did games like Wasteland work out ways to create truly, comprehensively persistent environments — in the case of Wasteland, by rewriting all of the data on each disk side on the fly as the player travels around the world (a very slow process, particularly in the case of the Commodore 64 and its legendarily slow disk drive).

Tedium is a killer. In Bard’s Tale there was one battle with 297 berserkers that always took fifteen or twenty minutes with the same results (this wasn’t rat-killing because the reward was significant and I could lose, maybe). The process of healing the party in the dungeon in Wizardry and the process of identifying discovered items in Shard of Spring are laborious. How boring it was in Ultima IV to stand around waiting for a pirate ship to happen along so I could capture it. The same can be said of sitting there holding down a key in Wasteland or Wrath of Denethenor while waiting for healing to occur. At least give me a wait command so I can read a book until something interesting happens.

I’m sort of ambivalent toward most aspects of mapping. A good map is satisfying and a good way to be sure nothing has been missed. Sometimes my son will use my maps (he hates mapping) in a game and find he is ready to go to the next level before his characters are. Mapping is a useful way to pace the game. The one irritating aspect of mapping is running off the edge of the paper. In Realms of Darkness mapping was very difficult because there was no “locater” or “direction” spell. More bothersome to me, though, was the fact that I never knew where to start on my paper. I had the same problem with Shard of Spring, but in retrospect that game didn’t require mapping.

Mapping is another area where the technical affordances of the earliest games had a major effect on their designs. The dungeon levels in most 8-bit CRPGs were laid out on grids of a consistent number of squares across and down; such a template minimized memory usage and simplified the programmer’s task enormously. Unrealistic though it was, it was also a blessing for mappers. Wizardry, a game that was oddly adept at turning its technical limitations into player positives, even included sheets of graph paper of exactly the right size in the box. Later games like Dungeon Master, whose levels sprawl everywhere, run badly afoul of the problem Irby describes above — that of maps “running off the edge of the paper.” In the case of Dungeon Master, it’s the one glaring flaw in what could otherwise serve as a masterclass in designing a challenging yet playable dungeon crawl.

I don’t like it when a program doesn’t take advantage of my second disk drive, and I would feel that way about my printer if I had one. I don’t like junk magic (spells you never use), and I don’t like being stuck forever with the names I pick on the spur of the moment. A name that struck my fancy one day may not on another.

Another problem similar to “junk magic” that only really began to surface around the time that Irby was writing this letter is junk skills. Wasteland is loaded with skills that are rarely or never useful, along with others that are essential, and there’s no way for the new player to identify which are which. It’s a more significant problem than junk magic usually is because you invest precious points into learning and advancing your skills; there’s a well-nigh irreversible opportunity cost to your choices. All of what we might call the second generation of Interplay CRPGs, which began with Wasteland, suffer at least somewhat from this syndrome. Like the sprawling dungeon levels in Dungeon Master, it’s an example of the higher ambitions and more sophisticated programming of later games impacting the end result in ways that are, at best, mixed in terms of playability.

I suppose you are wondering why I play these stupid games if there is so much about them I don’t like. Actually, there are more things I do like, particularly when compared to watching Gilligan’s Island or whatever the current TV fare is. I suppose it would be appropriate to mention a few of the things I do like.

In discussing the unavoidably anachronistic experience we have of old games today, we often note how many other games are at our fingertips — a luxury a kid who might hope to get one new game every birthday and Christmas most definitely didn’t enjoy. What we perhaps don’t address as much as we should is how much the entertainment landscape in general has changed. It can be a little tough even for those of us who lived through the 1980s to remember what a desert television was back then. I remember a television commercial — and from the following decade at that — in which a man checked into a hotel of the future, and was told that every movie ever made was available for viewing at the click of a remote control. Back then, this was outlandish science fiction. Today, it’s reality.

I like variety and surprises. Give me a cast of thousands over a fixed party anytime. Of course, the game designer has to force the need for multiple parties on me, or I will stick with the same group throughout because that is the best way to “win” the game. The Minotaur Temple in Phantasie I and the problems men had in Portsmouth in Might and Magic and the evil and good areas of Wizardry III were nice. More attractive are party changes for strategic reasons. What good are magic users in no-magic areas or a bard in a silent room? A rescue mission doesn’t need a thief and repetitive battles with many small opponents don’t require a fighter that deals heavy damage to one bad guy.

I like variety and surprises in the items found, the map, the specials encountered, in short in every aspect of the game. I like figuring out what things are and how they work. What a delight the thief’s dagger in Wizardry was! The maps in Wasteland are wonderful because any map may contain a map. The countryside contains towns and villages, the towns contain buildings, some buildings contain floors or secret passages. What fun!!!

I like missions and quests to pursue as I proceed. Some of these games are so large that intermediate goals are necessary to keep you on track. Might and Magic, Phantasie, and Bard’s Tale do a good job of creating a path with the “missions.” I like self-contained clues about the puzzles. In The Return of Heracles the sage was always there to provide an assist (for money, of course)  if you got stuck. The multiple solutions or sources of vital information in Might and Magic greatly enhanced the probability of completing the missions and kept the game moving.

I like the idea of recruiting new characters, as opposed to starting over from scratch. In Galactic Adventurers your crew could be augmented by recruiting survivors of a battle, provided they were less experienced than your leader. Charisma (little used in most games) could impact recruiting. Wasteland provides for recruiting of certain predetermined characters you encounter. These NPCs can be controlled almost like your characters and will advance with experience. Destiny Knight allows you to recruit (with a magic spell) any of the monsters you encounter, and requires that some specific characters be recruited to solve some of the puzzles, but these NPCs can’t be controlled and will not advance in level, so they are temporary members. They will occasionally turn on you, an interesting twist!!!

I like various skills, improved by practice or training for various characters. This makes the characters unique individuals, adding to the variety. This was implemented nicely in both Galactic Adventurers and Wasteland.

Eternal growth for my characters makes every session a little different and intriguing. If the characters “top out” too soon that aspect of the game loses its fascination. Wizardry was the best at providing continual growth opportunities because of the opportunity to change class and retain some of the abilities of the previous class. The Phantasie series seemed nicely balanced, with the end of the quest coming just before/as my characters topped out.

Speaking of eternal, I have never in all of my various adventures had a character retire because of age. Wizardry tried, but it never came into play because it was cheaper to heal at the foot of the stairs while identifying loot (same trip or short run to the dungeon for that purpose). Phantasie kept up with age, but it never affected play. I thought Might and Magic might, but I found the Fountain of Youth. The only FRPG I have played where you had to beat the clock is Tunnels of Doom, a simple hack-and-slash on my TI 99/4A that takes about ten hours for a game. Of course, it is quite different to spend ten hours and fail because the king died than it is to spend three months and fail by a few minutes. I like for time to be a factor to prevent me from being too conservative.

This matter of time affecting play really doesn’t fit into the “like” or the “don’t like” because I’ve never seen it effectively implemented. There are a couple of other items like that on my wish list. For example, training of new characters by older characters should take the place of slugging it out with Murphy’s ghost while the newcomers watch from the safety of the back row.

The placing of time limits on a game sounds to me like a very dangerous proposal. It was tried in 1989, the year after Irby wrote this letter, by The Magic Candle, a game that I haven’t played but that is quite well-regarded by the CRPG cognoscenti. That game was, however, kind enough to offer three difficulty levels, each with its own time limit, and the easiest level was generous enough that most players report that time never became a major factor. I don’t know of any game, even from this much crueler era of game design in general, that was cruel enough to let you play 100 hours or more and then tell you you’d lost because the evil wizard had finished conquering the world, thank you very much. Such an approach might have been more realistic than the alternative, where the evil wizard cackles and threatens occasionally but doesn’t seem to actually do much, but, as Sid Meier puts it, fun ought to trump realism every time in game design.

A very useful feature would be the ability to create my own macro consisting of a dozen or so keystrokes. Set up Control-1 through Control-9 and give me a simple way to specify the keystrokes to be executed when one is pressed.

Interestingly, this exact feature showed up in Interplay’s CRPGs very shortly after Irby wrote this letter, beginning with the MS-DOS version of Wasteland in March of 1989. And we do know that Interplay was one of the companies to which Shay Addams sent the letter. Is this a case of a single gamer’s correspondence being responsible for a significant feature in later games? The answer is likely lost forever to the vagaries of time and the inexactitude of memory.

A record of sorts of what has happened during the game would be nice. The chevron in Wizardry and the origin in Phantasie is the most I’ve ever seen done with this. How about a screen that told me I had 93 sessions, 4 divine interventions (restore backup), completed 12 quests, raised characters from the dead 47 times, and killed 23,472 monsters? Cute, huh?

Another crazily prescient proposal. These sorts of meta-textual status screens would become commonplace in CRPGs in later years. In this case, though, “later years” means much later. Thus, rather than speculating on whether he actively drove the genre’s future innovations, we can credit Irby this time merely with predicting them.

One last suggestion for the manufacturers: if you want that little card you put in each box back, offer me something I want. For example, give me a list of all the other nuts in my area code who have purchased this game and returned their little cards.

Enough of this, Wasteland is waiting.


With some exceptions — the last suggestion, for instance, would be a privacy violation that would make even the NSA raise an eyebrow — I agree with most of Irby’s positive suggestions, just as I do his complaints. It strikes me as I read through his letter that my own personal favorite among 8-bit CRPGs, Pool of Radiance, manages to avoid most of Irby’s pitfalls while implementing much from his list of desirable features — further confirmation of just what remarkable pieces of work that game and, to an only slightly lesser extent, its sequel Curse of the Azure Bonds really were. I hope Wes Irby got a chance to play them.

I have less to say about the second letter I’d like to share with you, and will thus present it without in-line commentary. This undated letter was sent directly to Interplay by its writer: Thomas G. Gutheil, an associate professor at the Harvard Medical School Department of Psychiatry, on whose letterhead it’s written. Its topic is The Bard’s Tale II: The Destiny Knight, a game I’ve written about only in passing but one with some serious design problems in the form of well-nigh insoluble puzzles. Self-serving though it may be, I present Gutheil’s letter to you today as one more proof that players did notice the things that were wrong with games back in the day — and that my perspective on them today therefore isn’t an entirely anachronistic one. More importantly, Gutheil’s speculations are still some of the most cogent I’ve ever seen on how bad puzzles make their way into games in the first place. For this reason alone, it’s eminently worthy of being preserved for posterity.


I am writing you a combination fan letter and critique in regard to the two volumes of The Bard’s Tale, of which I am a regular and fanatic user.

First, the good news: this is a TERRIFIC game, and I play it with addictive intensity, approximately an hour almost every day. The richness of the graphics, the cute depictions of the various characters, monsters, etc., and rich complexity and color of the mazes, tasks, issues, as well as the dry wit that pervades the program, make it a superb piece and probably the best maze-type adventure product on the market today. I congratulate you on this achievement.

Now, the bad news: the one thing I feel represents a defect in your program (and I only take your time to comment on it because it is so central) and one which is perhaps the only area where the Wizardry series (of which I am also an avid player and expert) is superior, is the notion of the so-called puzzles, a problem which becomes particularly noticeable in the “snares of death” in the second scenario. In all candor, speaking as an old puzzle taker and as a four-time grand master of the Boston Phoenix Puzzle Contest, I must say that these puzzles are simply too personal and idiosyncratic to be fair to the player. I would imagine you are doing a booming business in clue books since many of the puzzles are simply not accomplishable otherwise without hours of frustrating work, most of it highly speculative.

Permit me to try to clarify this point, since I am aware of the sensitive nature of these comments, given that I would imagine you regard the puzzles as being the “high art” of the game design. There should be an organic connection between the clues and the puzzles. For example, in Wizardry (sorry to plug the competition), there is a symbolic connection between the clue and its function. As one simplistic example, at the simplest level a bear statuette gets you through a gate guarded by a bear, a key opens a particular door, and a ship-in-a-bottle item gets you across an open expanse of water.

Let me try to contrast this with some of the situations in your scenarios. You may recall that in one of the scenarios the presence of a “winged one” in the party was necessary to get across a particular chasm. The Winged One introduces himself to the party as one of almost a thousand individual wandering creatures that come and offer to join the party, to be attacked, or to be left in peace. This level of dilution and the failure to separate out the Winged One in some way makes it practically unrecallable much later on when you need it, particularly since there are several levels of dungeon (and in real life perhaps many interposing days and weeks) between the time you meet the Winged One (who does not stand out among the other wandering characters in any particular way) and the time you actually need him. Even if (as I do) you keep notes, there would be no particular reason to record this creature out of all. Moreover, to have this added character stuck in your party for long periods of time, when you could instead have the many-times more effective demons, Kringles, and salamanders, etc., would seem strategically self-defeating and therefore counter-intuitive for the normal strategy of game play AS IT IS ACTUALLY PLAYED.

This is my point: in many ways your puzzles in the scenarios seem to have been designed by someone who is not playing the game in the usual sequence, but designed as it were from the viewpoint of the programmer, who looks at the scenario “from above” — that is, from omniscient knowledge. In many situations the maze fails to take into account the fact that parties will not necessarily explore the maze in the predictable direct sequence you have imagined. The flow of doors and corridors does not appropriately guide a player so that they will take the puzzles in a meaningful sequence. Thus, when one gets a second clue before a first clue, only confusion results, and it is rarely resolved as the play advances.

Every once in a while you do catch on, and that is when something like the rock-scissors-paper game is invoked in your second scenario. That’s generally playing fair, although not everyone has played that game or would recognize it in the somewhat cryptic form in which it is presented. Thus the player does not gain the satisfaction of use of intellect in problem solving; instead, it’s the frustration of playing “guess what I’m thinking” with the author.

Despite all of the above criticism, the excitement and the challenge of playing the game still make it uniquely attractive; as you have no doubt caught on, I write because I care. I have had to actively fight the temptation to simply hack my way through the “snares of death” by direct cribbing from the clue books, so that I could get on to the real interest of the game, which is working one’s way through the dungeons and encountering the different items, monsters, and challenges. I believe that this impatience with the idiosyncratic (thus fundamentally unfair) design of these puzzles represents an impediment, and I would be interested to know if others have commented on this. Note that it doesn’t take any more work for the programmer, but merely a shift of viewpoint to make the puzzles relevant and fair to the reader and also proof against being taken “out of order,” which largely confuses the meaning. A puzzle that is challenging and tricky is fair; a puzzle that is idiosyncratically cryptic may not be.

Thank you for your attention to this somewhat long-winded letter; it was important to me to write. Given how much I care for this game and how devoted I am to playing it and to awaiting future scenarios, I wanted to call your attention to this issue. You need not respond personally, but I would of course be interested in any of your thoughts on this.


I conclude this article as a whole by echoing Gutheil’s closing sentiments; your feedback is the best part of writing this blog. I hope you didn’t find my musings on the process of doing history too digressive, and most of all I hope you found Wes Irby and Thomas Gutheil’s all too rare views from the trenches as fascinating as I did.

 


Turning on, Booting up, and Jacking into Neuromancer

When a novel becomes notably successful, Hollywood generally comes calling to secure the film rights. Many an author naively assumes that the acquisition of film rights means an actual film will get made, and in fairly short order at that. And thus is many an author sorely disappointed. The sad fact is that the cost of acquiring the rights to even the biggest bestseller is a drop in the bucket in comparison to the cost of making a film out of it. Indeed, the cost is so trivial in terms of Hollywood budgets that many studios are willing to splash out for rights to books they never seriously envision doing anything productive with at all, simply to keep them out of the hands of rivals and protect their own properties in similar genres.

One could well imagine the much-discussed but never-made movie of William Gibson’s landmark cyberpunk novel Neuromancer falling into this standard pattern. Instead, though, its story is far, far more bizarre than the norm — and in its weird way far more entertaining.

Our story begins not with the power brokers of Hollywood, but rather with two young men at the very bottom of the Tinseltown social hierarchy. Ashley Tyler and Jeffrey Kinart were a pair of surfer dudes and cabana boys who worked the swimming pool of the exclusive Beverly Hills Hotel. Serving moguls and stars every day, they decided that the things they observed their charges doing really didn’t seem all that difficult. With a little luck and a little drive, even a couple of service workers like them could probably become players. Despite having no money, no education in filmmaking, and no real inroads with the people who tipped them to deliver poolside drinks, they hatched a plan in early 1985 to make a sequel to their favorite film of all time, the previous year’s strange postmodern action comedy The Adventures of Buckaroo Banzai Across the 8th Dimension.

The idea was highly problematic, not only for all of the reasons I’ve just listed but also because Buckaroo Banzai, while regarded as something of a cult classic today, had been a notorious flop in its own day, recouping barely a third of its production budget — hardly, in other words, likely sequel fodder. Nevertheless, Tyler and Kinart managed to recruit Earl Mac Rauch, the creator of the Buckaroo Banzai character and writer of the film’s screenplay, to join their little company-in-name-only, which they appropriately titled Cabana Boy Productions. As they made the rounds of the studios, the all-too-plainly clueless Tyler and Kinart didn’t manage to drum up much interest for their Buckaroo Banzai sequel, but the Hollywood establishment found their delusions of grandeur and surfer-boy personalities so intriguing that there was reportedly some talk of signing them to a deal — not to make a Buckaroo Banzai movie, but as the fodder for a television comedy, a sort of Beverly Hillbillies for the 1980s.

After some months, the cabana boys finally recognized that Buckaroo Banzai had little chance of getting resurrected, and set their sights instead on making a movie out of the hottest novel in science fiction: William Gibson’s Neuromancer. Rauch’s own career wasn’t exactly going gangbusters; in addition to Buckaroo Banzai, he also had on his resume New York, New York, mob-movie maestro Martin Scorsese’s equally notorious attempt to make a classic Hollywood musical. Thus he agreed to stick with the pair, promising to write the screenplay if they could secure the rights to Neuromancer. In the meantime, they continued to schmooze the guests at the Beverly Hills Hotel, making their revised pitch to any of them who would listen. Against all the odds, they stumbled upon one guest who took them very seriously indeed.

As was all too easy to tell by taking a glance at her rictus smile, Deborah Rosenberg was the wife of a plastic surgeon. Her husband, Victor Rosenberg, had been in private practice in New York City since 1970, serving the rich, the famous, and the would-be rich and famous. He also enjoyed a profitable sideline as a writer and commentator on his field for the supermarket tabloids, the glossy beauty magazines, and the bored-housewife talk-show circuit, where he was a regular on programs like Live with Regis and Kathie Lee, The Oprah Winfrey Show, and Donahue. When business took him and his wife to Beverly Hills in late 1985, Deborah was left to loiter by the pool while her husband attended a medical convention. It was thus that she made the acquaintance of Tyler and Kinart.

Smelling money, the cabana boys talked up their plans to her with their usual gusto despite her having nothing to do with the film industry. Unaccountably, Deborah Rosenberg thought the idea of making Neuromancer with them a smashing one, and managed to convince her husband to put up seed capital for the endeavor. Tyler, in a scenario that raises the specter of professional couch-crasher Kato Kaelin, actually followed the Rosenbergs back to New York and moved into their mansion as a permanent house guest while he and Deborah continued to work on their plans. There would be much speculation around both Hollywood and New York in the months to come about exactly what sort of relationship Deborah and Ashley actually had, and whether her husband a) was aware of Deborah’s possible extramarital shenanigans and b) cared if he was.

While the irony of Gibson’s book full of cosmetic surgeries and body modifications of all descriptions being adapted by a plastic surgeon would have been particularly rich, Victor took little active role in the project, seeming to regard it (and possibly Ashley?) primarily as a way to keep his high-maintenance wife occupied. He did, however, help her to incorporate Cabana Boy Productions properly in January of 1986, and a few weeks later, having confirmed that Neuromancer rather surprisingly remained un-optioned, offered William Gibson $100,000 for all non-print-media rights to the novel. Gibson was by all indications almost as naive as Deborah and her cabana boys; in his late thirties though he was, he had never earned more than the most menial of wages before finishing the science-fiction novel of the decade eighteen months before. He jumped at the offer with no further negotiation whatsoever, mumbling something about using the unexpected windfall to remodel his kitchen. The film rights to the hottest science-fiction novel in recent memory were now in the hands of two California surfer dudes and a plastic surgeon’s trophy wife. And then, just to make the situation that much more surreal, Timothy Leary showed up.

I should briefly introduce Leary for those of you who may not be that familiar with the psychologist whom President Nixon once called “the most dangerous man in America.” At the age of 42 in 1963, the heretofore respectable Leary was fired from his professorship at Harvard, allegedly for skipping lectures but really for administering psychedelic drugs to students without proper authorization. Ousted by the establishment, he joined the nascent counterculture as an elder statesman and cool hippie uncle. Whilst battling unsuccessfully to keep LSD and similar drugs legal — by 1968, they would be outlawed nationwide despite his best efforts — Leary traveled the country delivering “lectures” that came complete with a live backing band, light shows, and more pseudo-mystical mumbo jumbo than could be found anywhere this side of a Scientology convention. In his encounters with the straight mainstream press, he strained to be as outrageous and confrontational as possible. One of his favorite sayings became one of the most enduring of the entire Age of Aquarius: “Turn on, tune in, drop out.” Persecuted relentlessly by the establishment as the Judas who had betrayed their trust, Leary was repeatedly arrested for drug possession. This, of course, only endeared him that much more to the counterculture, who regarded each successive bust as another instance of his personal martyrdom for their cause. The Moody Blues wrote an oh-so-sixties anthem about him called “Legend of a Mind” and made it the centerpiece of their 1968 album In Search of the Lost Chord; likewise, the Beatles song “Come Together” was begun as a campaign anthem for Leary’s farcical candidacy for governor of California.

In January of 1970, Leary, the last person in the world on whom any judge was likely to be lenient, was sentenced to ten years imprisonment by the state of California for the possession of two marijuana cigarettes. With the aid of the terrorist group the Weather Underground, he escaped from prison that September and fled overseas, first to Algeria, then to Switzerland, where, now totally out of his depth in the criminal underworld, he wound up being kept under house arrest as a sort of prize pet by a high-living international arms dealer. When he was recaptured by Swiss authorities and extradited back to the United States in 1972, it thus came as something of a relief for him. He continued to write books in prison, but otherwise kept a lower profile as the last embers of the counterculture burned themselves out. His sentence was commuted by California Governor Jerry Brown in 1976, and he was released.

Free at last, he was somewhat at loose ends, being widely regarded as a creaky anachronism of a decade that already felt very long ago and far away; in the age of disco, cocaine was the wonder drug rather than LSD. But in 1983, when he played Infocom’s Suspended, he discovered a new passion that would come to dominate the last 13 years of his life. He wrote to Mike Berlyn, the author of the game, to tell him that Suspended had “changed his life,” that he had been “completely overwhelmed by the way the characters split reality into six pieces.” He had, he said, “not thought much of computers before then,” but Suspended “had made computers a reality” for him. Later that year, he visited Infocom with an idea for, as one employee of the company remembers it, “a personality that would sit on top of the operating system, observe what you did, and modify what the computer would do and how it would present information based on your personal history, what you’d done on the computer.” If such an idea seems insanely ambitious in the context of early 1980s technology, it perhaps points to some of the issues that would tend to keep Leary, who wasn’t a programmer and had no real technical understanding of how computers worked, at the margins of the industry. Certainly his flamboyance and tendency to talk always in superlatives made him a difficult fit with the more low-key personality of Infocom. Another employee remembers Leary as being “too self-centered to make a good partner. He wanted his name and his ideas on something, but he didn’t want us to tell him how to do it.”

Mind Mirror

His overtures to Infocom having come to naught, Leary moved on, but he didn’t forget about computers. Far from it. As the waves of hype about home computers rolled across the nation, Leary saw in them much the same revolutionary potential he had once seen in peace, love, and LSD — and he also saw in them, one suspects, a new vehicle to bring himself, an inveterate lover of the spotlight, back to a certain cultural relevance. Computers, he declared, were better than drugs: “the language of computers [gives] me the metaphor I was searching for twenty years ago.” He helpfully provided the media with a new go-to slogan to apply to his latest ideas, albeit one that would never quite catch on like the earlier one had: “Turn on, boot up, jack in.” “Who controls the pictures on the screen controls the future,” he said, “and computers let people control their own screen.”

In that spirit, he formed a small software developer of his own, which he dubbed Futique. Futique’s one tangible product became something he called Mind Mirror, published by Electronic Arts in 1986. It stands to this day as the single strangest product Electronic Arts has ever released. Billed as “part tool, part game, and part philosopher on a disk,” Mind Mirror was mostly incomprehensible — a vastly less intuitive Alter Ego with all the campy fun of that game’s terrible writing and dubious psychological insights leached out in favor of charts, graphs, and rambling manifestos. Electronic Arts found that Leary’s cultural cachet with the average computer user wasn’t as great as they might have hoped; despite their plastering his name and picture all over the box, Mind Mirror resoundingly flopped.

It was in the midst of much of this activity that Leary encountered William Gibson’s novel Neuromancer. Perhaps unsurprisingly given the oft-cited link between Gibson’s vision of an ecstatic virtual reality called the Matrix and his earlier drug experiences, Leary became an instant cyberpunk convert, embracing the new sub-genre with all of his characteristic enthusiasm. Gibson, he said, had written “the New Testament of the 21st century.” Having evidently decided that the surest route to profundity lay in placing the prefix “cyber-” in front of every possible word, he went on to describe Neuromancer as “an encyclopedic epic for the cyber-screen culture of the immediate future, and an inspiring cyber-theology for the Information Age.” He reached out to the man he had anointed as the cyber-prophet behind this new cyber-theology, sparking up an acquaintance if never quite a real friendship. It was likely through Gibson — the chain of events isn’t entirely clear — that Leary became acquainted with the unlikely management of Cabana Boy Productions and their plans for a Neuromancer film. He promptly jumped in with them.

Through happenstance and sheer determination, the cabana boys now had a real corporation with at least a modicum of real funding, the rights to a real bestselling novel, and a real professional screenwriter — and the real Timothy Leary, for whatever that was worth. They were almost starting to look like a credible operation — until, that is, one of them actually started to talk. Therein lay the rub.

Around the middle of 1986, Cabana Boy made a sizzle reel featuring most of their principal personalities to shop around the Hollywood studios. Even William Gibson, along with his agent and his publicist at Berkley Books, was convinced to show up and offer a few pleasantries. Almost everyone comes across as hopelessly vacuous in this, the only actual footage Cabana Boy would ever manage to produce. If these were the best bits, hand-selected to put Cabana Boy’s best foot forward, one shudders to think about what must have been left on the cutting-room floor.


Cabana Boy’s attempts to sell their proposed $20 million film to Hollywood were, according to one journalist, “a comedy of errors and naivete — but what they lack in experience they are making up for in showmanship.” Although they were still not taken all that seriously by anyone, their back story and their personalities were enough to secure brief write-ups in People and Us, and David Letterman, always on the lookout for endearing eccentrics to interview and/or make fun of on his late-night talk show, seriously considered having them on. “My bet,” concluded the journalist, “is that they’ll make a movie about Cabana Boy before Neuromancer ever gets off the ground.”

Shortly after the sizzle reel was made, Earl Mac Rauch split when he was offered the chance to work on a biopic about comedian John Belushi. No problem, said Deborah Rosenberg and Ashley Tyler, we’ll just write the Neuromancer script ourselves — this despite neither of them having ever written anything before, much less the screenplay to a proverbial “major motion picture.” At about the same time, Jeffrey Kinart had a falling-out with his old poolside partner — his absence from the promo video may have been a sign of the troubles to come — and left as well. Tyler himself split at the end of 1987, marking the exit of the last actual cabana boy from Cabana Boy, even as Deborah Rosenberg remained no closer to a production contract than she had been at the beginning of the endeavor. On the other hand, she had acquired two entertainment lawyers, a producer, a production designer, a bevy of “financial consultants,” offices in three cities for indeterminate purposes, and millions of dollars in debt. Still undaunted, on August 4, 1988, she registered her completed script, a document it would likely be fascinating but also kind of horrifying to read, with the United States Copyright Office.

While all this was going on, Timothy Leary had become obsessed with what may very well have been his real motivation for associating himself with Cabana Boy in the first place: turning Neuromancer into a computer game, or, as he preferred to call it, a “mind play” or “performance book.” Cabana Boy had, you’ll remember, picked up all electronic-media rights to the novel in addition to the film rights. Envisioning a Neuromancer game developed for the revolutionary new Commodore Amiga by his own company Futique, the fabulously well-connected Leary assembled a typically star-studded cast of characters to help him make it. It included David Byrne, lead singer of the rock band Talking Heads; Keith Haring, a trendy up-and-coming visual artist; Helmut Newton, a world-famous fashion photographer; Devo, the New Wave rock group; and none other than William Gibson’s personal literary hero William S. Burroughs to adapt his protege’s work to the computer.

This image created for Timothy Leary’s “mind play” of Neuromancer features the artist Keith Haring, who was to play the role of Case. Haring died of AIDS in 1990 at the age of just 31, but nevertheless left behind him a surprisingly rich legacy.

This image created for Timothy Leary’s “mind play” of Neuromancer features David Byrne of the band Talking Heads.

Leary sub-contracted the rights for a Neuromancer game from Cabana Boy, and was able to secure a tentative deal with Electronic Arts. But that fell through when Mind Mirror hit the market and bombed. Another tentative agreement, this time with Jim Levy’s artistically ambitious Activision, fell through when the much more practical-minded Bruce Davis took over control of that publisher in January of 1987. Neuromancer was a property that should have had huge draw with the computer-game demographic, but everyone, it seemed, was more than a little leery of Leary and his avant-garde aspirations. For some time, the game project didn’t make much more headway than the movie.

Neuromancer the game was saved by a very unusual friendship. While Leary was still associated with Electronic Arts, an unnamed someone at the publisher had introduced him to the head of one of their best development studios, Brian Fargo of Interplay, saying that he thought the two of them “will get along well.” “Timothy and his wife Barbara came down to my office, and sure enough we all hit it off great,” remembers Fargo. “Tim was fascinated by technology; he thought about it and talked about it all the time. So I was his go-to guy for questions about it.”

Being friends with the erstwhile most dangerous man in America was quite an eye-opening experience for the clean-cut former track star. Leary relished his stardom, somewhat faded though its luster may have been by the 1980s, and gloried in the access it gave him to the trendy jet-setting elite. Fargo remembers that Leary “would take me to all the hottest clubs in L.A. I got to go to the Playboy Mansion when I was 24 years old; I met O.J. and Nicole Simpson at his house, and Devo, and David Byrne from Talking Heads. It was a good time.”

His deals with Electronic Arts and Activision having fallen through, it was only natural for Leary to turn at last to his friend Brian Fargo to get his Neuromancer game made. Accepting the project, hot property though Neuromancer was among science-fiction fans, wasn’t without risk for Fargo. Interplay was a commercially-focused developer whose reputation rested largely on their Bard’s Tale series of traditional dungeon-crawling CRPGs; “mind plays” hadn’t exactly been in their bailiwick. Nor did they have a great deal of financial breathing room for artistic experimentation. Interplay, despite the huge success of the first Bard’s Tale game in particular, remained a small, fragile company that could ill-afford an expensive flop. In fact, they were about to embark on a major transition that would only amplify these concerns. Fargo, convinced that the main reason his company wasn’t making more money from The Bard’s Tale and their other games was the lousy 15 percent royalty they were getting from Electronic Arts — a deal which the latter company flatly refused to renegotiate — was moving more and more toward severing those ties and trying to go it alone as a publisher as well as a developer. Doing so would among other things mean giving up the possibility of making more Bard’s Tale games; that trademark would remain with Electronic Arts. Without that crutch to lean on, an independent Interplay would need to make all-new hits right out of the gate. And, as shown by the example of Mind Mirror, a Timothy Leary mind play wasn’t all that likely to become one.

Fargo must therefore have breathed a sigh of relief when Leary, perhaps growing tired of this project he’d been flogging for quite some time, perhaps made more willing to trust Fargo’s instincts by the fact that he considered him a friend, said he would be happy to step back into a mere “consulting” role. He did, however, arrange for William Gibson to join Fargo at his house one day to throw out ideas. Gibson was amiable enough, but ultimately just not all that interested, as he tacitly admitted: “I was offered a lot more opportunity for input than I felt capable of acting on. One thing that quickly became apparent to me was that I hadn’t the foggiest notion of the way an interactive computer game had to be constructed, the various levels of architecture involved. It was fascinating, but I felt I’d best keep my nose out of it and let talented professionals go about the actual business of making the game.” So, Fargo and his team, which would come to include programmer Troy A. Miles, artist Charles H.H. Weidman III, and writers and designers Bruce Balfour and Mike Stackpole, were largely left alone to make their game. While none of them was a William Gibson, much less a William S. Burroughs, they did have a much better idea of what made for a fun, commercially viable computer game than did anyone on the dream team Leary had assembled.

Three fifths of the team that wound up making Interplay’s Neuromancer: Troy Miles, Charles H.H. Weidman III, and Bruce Balfour.

And one member of Leary’s old team did agree to stay with the project. Brian Fargo:

My phone rang one night at close to one o’clock in the morning. It was Timothy, and he was all excited that he had gotten Devo to do the soundtrack. I said, “That’s great.” But however I said it, he didn’t think I sounded enthused enough, so he started yelling at me that he had worked so hard on this, and he should get more excitement out of me. Of course, I literally had just woken up.

So, next time I saw him, I said, “Tim, you can’t do that. It’s not fair. You can’t wake me up out of a dead sleep and tell me I’m not excited enough.” He said, “Brian, this is why we’re friends. I really appreciate the fact that you can tell me that. And you’re right.”

In the end, Devo didn’t quite provide a full soundtrack, but rather a chiptune version of “Some Things Never Change,” a track taken from their latest album Total Devo which plays over Neuromancer’s splash screen.

The opening of the game. Case, now recast as a hapless loser, not much better than a space janitor, wakes up face-down in a plate of “synth-spaghetti.”

As an adaptation of the novel, Neuromancer the game can only be considered a dismal failure. Like the novel, the game’s story begins in a sprawling Japanese metropolis of the future called Chiba City, stars a down-on-his-luck console cowboy named Case, and comes to revolve around a rogue artificial intelligence named Neuromancer. Otherwise, though, the plot of the game has very little resemblance to that of the novel. Considered in any other light than the commercial, the license is completely pointless; this could easily have been a generic cyberpunk adventure.

The game’s tone departs if anything even further from its source material than does its plot. Out of a sense of obligation, it occasionally manages to shoehorn in a few lines of Gibson’s prose, but rather than even trying to capture the noirish moodiness of the novel the game aims for considerably lower-hanging fruit. In what was almost becoming a default setting for adventure games by the late 1980s, Case is now a semi-incompetent loser whom the game can feel free to make fun of, inhabiting a science-fiction-comedy universe which has much more to do with Douglas Adams — or, to move the fruit just that much lower, Planetfall or Space Quest — than William Gibson. This approach tended to show up so much in adventure games for very practical reasons: it removed most of the burden from the designers of trying to craft really coherent, believable narratives out of the very limited suite of puzzle and game-play mechanics at their disposal. Being able to play everything for absurdist laughs just made design so much easier. Cop-out though it kind of was, it must be admitted that some of the most beloved classics of the adventure-game genre use exactly this approach. Still, it does have the effect of making Neuromancer the game read almost like a satire of Neuromancer the novel, which can hardly be ideal, at least from the standpoint of the licenser.

And yet, divorced from its source material and considered strictly as a computer game, Neuromancer succeeds rather brilliantly. It plays on three levels, only the first of which is open to you in the beginning. Those earliest stages confine you to “meat space,” where you walk around, talk with other characters, and solve simple puzzles. Once you manage to get your console back from the man to whom you pawned it, you’ll be able to enter the second level. Essentially a simulation of the online bulletin-board scene of the game’s own time, it has you logging onto various “databases,” where you can download new programs to run on your console, piece together clues and passwords, read forums and email, and hack banks and other entities. Only around the mid-way point of the game will you manage to get onto the Matrix proper, a true virtual-reality environment. Here you’ll have to engage in graphical combat with ever more potent forms of ICE (“Intrusion Countermeasures Electronics”) to penetrate ever more important databases.

Particularly at this stage, the game manifests a strong CRPG component; not only do you need to earn money to buy ever better consoles, software, and “skill chips” that conveniently slot right into Case’s brain, but as Case fights ICE on the Matrix his skills also improve with experience. It’s a heady brew, and a wonderfully varied and entertaining one. Despite the limitations of the Commodore 64, the platform on which it made its debut, Neuromancer is one of the most content-rich games of its era, with none of the endless random combats and assorted busywork that stretch the other contemporaneous CRPGs of Interplay and others to such interminable lengths. Neuromancer ends just about when you feel it ought to end, having provided that addictive CRPG rush of building up a character from a weakling to a powerhouse without ever having bored you in the process.
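For readers who like to see a game loop laid bare, the progression system just described (fight ICE, gain experience, spend the proceeds on better hardware) can be sketched as a toy model. To be clear, everything below is invented for illustration: the class names, the numbers, and the simple power-versus-strength rule are my own assumptions, not anything taken from Interplay’s actual code.

```python
# A hypothetical sketch of Neuromancer's CRPG loop: Case fights ICE,
# victories raise his skill and bank balance, and money buys better decks.
# All names and numbers here are invented for illustration only.

class Case:
    def __init__(self):
        self.skill = 1        # improves with experience, CRPG-style
        self.credits = 0      # spent on better consoles and software
        self.deck_rating = 1  # a better deck raises effective power

    def fight_ice(self, ice_strength):
        """One encounter with a piece of ICE; tougher ICE pays better."""
        power = self.skill + self.deck_rating
        if power >= ice_strength:
            self.skill += 1                    # experience from the win
            self.credits += ice_strength * 10  # loot the database
            return True
        return False  # repelled; come back with better gear

    def buy_deck(self, cost, rating):
        """Trade credits for a stronger console, if affordable."""
        if self.credits >= cost and rating > self.deck_rating:
            self.credits -= cost
            self.deck_rating = rating
            return True
        return False

case = Case()
assert case.fight_ice(2)       # skill 1 + deck 1 just beats strength-2 ICE
assert not case.fight_ice(99)  # the big databases need better hardware
assert case.buy_deck(cost=20, rating=3)
```

The point of the sketch is the feedback loop: each victory makes the next, harder piece of ICE beatable, which is exactly the “weakling to powerhouse” rush the game delivers without padding.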

Reading messages from the Scene… err, from Neuromancer’s version of the hacker underground.

One of the more eyebrow-raising aspects of Neuromancer is the obvious influence that the real underground world of the Scene had on its own online milieu. The lingo, the attitudes… all of it is drawn from pirate BBS culture, circa 1988. Ironically, it actually evokes the spirit of the Scene far better than it does anything from Gibson’s novel, serving in this respect as quite a lovely historical time capsule. At least some people at Interplay, it seems, were far more familiar with that illegal world than any upstanding citizen ought to have been. Neuromancer is merely one more chapter in the long shared history of legitimate software developers and pirates, who were always much more interconnected and even mutually dependent than the strident rhetoric of the Software Publishers Association might lead one to suspect. Richard Garriott’s Akalabeth was first discovered by his eventual publisher California Pacific via a pirated version someone brought into the office; Sid Meier ran one of the most prolific piracy rings in Baltimore before he became one of the most famous game designers in history… the anecdotes are endless. Just to blur the lines that much more, soon after Neuromancer some cracking groups would begin to go legitimate, becoming game makers in their own rights.

Like other Interplay games from this period, Neuromancer is also notable for how far it’s willing to push the barriers of content in what was still the games industry’s equivalent of pre-Hays Code Hollywood. There’s an online sex board you can visit, a happy-ending massage parlor, a whore wandering the streets. Still, while it’s not exactly a comedic revelation, I find the writing in Neuromancer makes it a more likable game than, say, Wasteland with its somewhat juvenile transgression for transgression’s sake. Neuromancer walks right up to that line on one or two occasions, but never quite crosses it in this critic’s opinion.

That said, it’s not of course a perfect game. The interface, especially in the meat-space portions, is a little clunky; it looks like a typical point-and-click adventure game, but its actual control scheme is far more baroque, which can lead to some cognitive dissonance when you first start to play. But that sorts itself out once you get into the swing of things. Neuromancer is by far my favorite Interplay game of the 1980s, boldly original but also thoroughly playable — and, it should be noted, rigorously fair. Take careful notes and do your due diligence, and you can feel confident of being able to solve this one.

About to do battle with an artificial intelligence, the most fearsome of the foes you’ll encounter in the Matrix.

Neuromancer was released on the Commodore 64 and the Apple II in late 1988 as one of Interplay’s first two self-published games. The other, fortunately for Interplay but perhaps unfortunately for Neuromancer’s commercial prospects, was an Amiga game called Battle Chess. Far less conceptually ambitious than Neuromancer, Battle Chess was nothing more or less than a rather everyday chess program, the like of which could be found by the dozens in the public domain, onto which Interplay had grafted “4 MB of animation” and “400 K of digitized sound” (yes, those figures were considered very impressive at the time). Specifically, when you moved a piece on the board, you got to watch it walk over to its new position, possibly killing other pieces in the process. And that was it, the entire gimmick. But, in those days when games were so frequently purchased as showpieces for one’s graphics and sound hardware, it was more than enough. Battle Chess became just the major hit Interplay needed to establish themselves as a publisher, but one that in the process sucked all of Neuromancer’s oxygen right out of the room. Despite the strength of the license, the latter game went comparatively neglected by Interplay, still a very small company with very limited resources, in the rush to capitalize on the Battle Chess sensation. Neuromancer was ported to MS-DOS and the Apple IIGS in 1989 and to the Amiga in 1990 — in my opinion this last is the definitive version — but was never a big promotional priority and never sold in more than middling numbers. Early talk of a sequel, to have been based on William Gibson’s second novel Count Zero, remained only that. Neuromancer is all but forgotten today, one of the lost gems of its era.

I always make it a special point to highlight games I consider to be genuine classics, the ones that still hold up very well today, and that goes double if they aren’t generally well-remembered. Neuromancer fits into both categories. So, please, feel free to download the Amiga version from right here, pick up an Amiga emulator if you don’t have one already, and have at it. This one really is worth it, folks.

I’ll of course have much more to say about the newly self-sufficient Interplay in future articles. But as for the other players in today’s little drama:

Timothy Leary remained committed to using computers to “express the panoramas of your own brain” right up until he died in 1996, although without ever managing to bring any of his various projects, which increasingly hewed to Matrix-like three-dimensional virtual realities drawn from William Gibson, into anything more than the most experimental of forms.

William Gibson himself… well, I covered him in my last article, didn’t I?

Deborah Rosenberg soldiered on for quite some time alone with the cabana-boy-less Cabana Boy; per contractual stipulation, the Neuromancer game box says it’s “soon to be a major motion picture from Cabana Boy Productions.” And, indeed, she at last managed to sign an actual contract with Tri-Star Pictures on June 2, 1989, to further develop her screenplay, at which point Tri-Star would, “at its discretion,” “produce the movie.” But apparently Tri-Star took discretion to be the better part of valor in the end; nothing else was ever heard of the deal. Cabana Boy was officially dissolved on March 24, 1993. There followed years of litigation between the Rosenbergs and the Internal Revenue Service; it seems the former had illegally deducted all of the money they’d poured into the venture, more than $2 million all told, from their tax returns. (It’s largely thanks to the paper trail left behind by the tax-court case, which wasn’t finally settled until 2000, that we know as much about the details of Cabana Boy as we do.) Deborah Rosenberg has presumably gone back to being simply the wife of a plastic surgeon to the stars, whatever that entails, her producing and screenwriting aspirations nipped in the bud and tucked back away wherever it was they came from.

Since Cabana Boy’s collapse, Neuromancer has continued to kick around Hollywood without ever managing to get made. As of this writing, it appears to still be at least ostensibly a living project. Whether it really matters at this point, when most of the novel’s once-revolutionary ideas have become conventional wisdom, is a question for those still bearing the torch to debate.

Earl Mac Rauch did write the screenplay for Wired, the John Belushi biopic, but it attracted jeers and walk-outs at the 1989 Cannes Film Festival, and was a critical and financial disaster. Having collected three strikes in the form of New York, New York, Buckaroo Banzai, and now Wired, Rauch was out. He vanished into obscurity, although I understand he has resurfaced in recent years to write some Buckaroo Banzai graphic novels.

And as for our two cabana boys, Ashley Tyler and Jeffrey Kinart… who knows? Perhaps they’re patrolling some pool somewhere to this day, regaling the guests with glories that were or glories that may, with a minor financial contribution, yet be.

(Sources: Computer Gaming World of September 1988; The Games Machine of October 1988; Aboriginal Science Fiction of October 1986; AmigaWorld of May 1988; Compute! of October 1991; The One of February 1989; Starlog of July 1984; Spin of April 1987. Online sources include the sordid details of the Cabana Boy tax case, from the United States Tax Court archive, and a blog post by Alison Rhonemus on some of the contents of Timothy Leary’s papers, which are now held at the New York Public Library. I also made use of the Get Lamp interview archives which Jason Scott so kindly shared with me. Finally, my huge thanks to Brian Fargo for taking time from his busy schedule to discuss his memories of Interplay’s early days with me.)

 

Posted by on November 11, 2016 in Digital Antiquaria, Interactive Fiction

 
