

Trinity Postscript: Selling Tragedy

Like A Mind Forever Voyaging, Trinity seemed destined to become a casualty of an industry that just wasn’t equipped to appreciate what it was trying to do. Traditional game-review metrics like “fun” or “value for money” only cheapened it, while reviewers lacked the vocabulary to even begin to really address its themes. Most were content to simply mention, in passing and often with an obvious unease, that those themes were present. In Computer Gaming World, for instance, Scorpia said that it was “not for the squeamish,” that it would require of the player “some unpleasant actions,” and that it was “overall a serious game, not a light-hearted one,” before moving on to the firmer ground of puzzle hints. And that was downright thoughtful in comparison to Shay Addams’s review for Questbusters, which tried in a weird and clunky way to be funny in all the ways that Trinity doesn’t: “It blowed up real good!” runs its tagline, and the review goes on to ask whether they’ll be eating “fission chips” in Kensington Gardens after the missiles drop. (Okay, that one’s dumb enough to be worth a giggle…) But the review’s most important point is that Trinity is “mainly a game” again after the first Interactive Fiction Plus title, A Mind Forever Voyaging, had so disappointed: “The puzzles are back!”

Even Infocom themselves weren’t entirely sure how to sell or even how to talk about Trinity. The company’s creative management had been unstintingly supportive of Brian Moriarty while he was making the game, but “marketing,” as he said later, “was a little more concerned/disturbed. They didn’t quite know what to make of it.” The matrix of genres didn’t have a slot for “Historical Tragedy.” In the end they slapped a “Fantasy” label on it, although it doesn’t take a long look at Trinity and the previous games to wear that label — the Zork and Enchanter series — to realize that one of these things is not quite like the others.

Moriarty admits to “a few tiffs” with marketing over Trinity, but he was a reasonable guy who also understood that Infocom needed to sell their games and that, while the occasional highbrow press from the likes of The New York Times Book Review had been nice and all, the traditional adventure-game market was the only place they had yet succeeded in consistently doing that. Thus in interviews and other promotions for Trinity he did an uncomfortable dance, trying to talk seriously about the game and the reasons he wrote it while also trying not to scare away people just looking for a fun text adventure. The triangulations can be a bit excruciating: “It isn’t a gloomy game, but it does have a dark undertone to it. It’s not like it’s the end of the world.” (Actually, it is.) Or: “It’s kind of a dark game, but it’s also, I like to think, kind of a fun game too.” (With a ringing endorsement like “I like to think it’s kind of a fun game,” how could anyone resist?)

Trinity‘s commercial saving grace proved to be a stroke of serendipity having nothing to do with any of its literary qualities. The previous year Commodore had released what would prove to be their last 8-bit computer, the Commodore 128. Despite selling quite well, the machine had attracted very little software support. The cause, ironically, was also the reason it had done so well in comparison to the Plus/4, Commodore’s previous 8-bit machine. The 128, you see, came equipped with a “64 Mode” in which it was 99.9 percent compatible with the Commodore 64. Forced to choose between a modest if growing 128 user base and the massive 64 user base through which they could also rope in all those 128 users, almost all publishers, with too many incompatible machines to support already, made the obvious choice.

Infocom’s Interactive Fiction Plus system was, however, almost unique in the entertainment-software industry in running on the 128 in its seldom-used (at least for games) native mode. And all those new 128 owners were positively drooling for a game that actually took advantage of the capabilities of their shiny new machines. A Mind Forever Voyaging and Trinity arrived simultaneously on the Commodore 128 when the Interactive Fiction Plus interpreter was ported to that platform in mid-1986. But the puzzleless A Mind Forever Voyaging was a bit too outré for most gamers’ tastes. Plus it was older, and thus not getting the press or the shelf space that Trinity was. Trinity, on the other hand, fit the bill of “game I can use to show off my 128” just well enough, even for 128 users who might otherwise have had little interest in an all-text adventure game. Infocom’s sales were normally quite evenly distributed across the large range of machines they supported, but Trinity‘s were decidedly lopsided in favor of the Commodore 128. Those users’ numbers were enough to push Trinity to the vicinity of 40,000 in sales, not a blockbuster — especially by the standards of Infocom’s glory years — but enough to handily outdo not just A Mind Forever Voyaging but even more traditional recent games like Spellbreaker. Like the Cold War Trinity chronicles, it could have been much, much worse.

 


 



Trinity

During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was, to the extent he was known at all, considered a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the National Association of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against whom most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan had taken office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives.  Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against nature,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily lead to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became one of Infocom’s last substantial hits. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine that allowed twice the content of the normal system at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.


The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually if not inordinately long development cycle for an Infocom game, stretching from Moriarty’s first foray into Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim determination that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, I would guess that a majority of hardcore Infocom fans would likely name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored — when they’re explored at all — in all that thoughtful of a way. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me at an odd disconnect from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything that the 18 games preceding it, A Mind Forever Voyaging excepted, had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all really that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity‘s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a wandering passive observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As far as atomic history goes, I find I’ve amassed a library almost as extensive as Trinity‘s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)

 
 


From Wingleader to Wing Commander

No one at Origin had much time to bask in the rapturous reception accorded to Wingleader at the 1990 Summer Consumer Electronics Show. Their end-of-September deadline for shipping the game was now barely three months away, and there remained a daunting amount of work to be done.

At the beginning of July, executive producer Dallas Snell called the troops together to tell them that crunch time was beginning in earnest; everyone would need to work at least 55 hours per week from now on. Most of the people on the project only smiled bemusedly at the alleged news flash. They were already working those kinds of hours, and knew all too well that a 55-hour work week would probably seem like a part-timer’s schedule before all was said and done.

Dallas Snell

At the beginning of August, Snell unceremoniously booted Chris Roberts, the project’s founder, from his role as co-producer, leaving him with only the title of director. Manifesting a tendency anyone familiar with his more recent projects will immediately recognize, Roberts had been causing chaos on the team by approving seemingly every suggested addition or enhancement that crossed his desk. Snell, the brutal pragmatist in this company full of dreamers, appointed himself as Warren Spector’s new co-producer. His first action was to place a freeze on new features in favor of getting the game that currently existed finished and out the door. Snell:

The individuals in Product Development are an extremely passionate group of people, and I love that. Everyone is here because, for the most part, they love what they’re doing. This is what they want to do with their lives, and they’re very intense about it and very sensitive to your messing around with what they’re trying to accomplish. They don’t live for getting it done on time or having it make money. They live to see this effect or that effect, their visions, accomplished.

It’s always a continual antagonistic relationship between the executive producer and the development teams. I’m always the ice man, the ogre, or something. It’s not fun, but it gets the products done and out. I guess that’s why I have the room with the view. Anyway, at the end of the project, all of Product Development asked me not to get that involved again.

One problem complicating Origin’s life enormously was the open architecture of MS-DOS, this brave new world they’d leaped into the previous year. Back in the Apple II days, they’d been able to write their games for a relatively static set of hardware requirements, give or take an Apple IIGS running in fast mode or a Mockingboard sound card. The world of MS-DOS, by contrast, encompassed a bewildering array of potential hardware configurations: different processors, different graphics and sound cards, different mice and game controllers, different amounts and types of memory, different floppy-disk formats, different hard-disk capacities. For a game like Wingleader, surfing the bleeding edge of all this technology but trying at the same time to offer at least a modicum of playability on older setups, all of this variance was the stuff of nightmares. Origin’s testing department was working 80-hour weeks by the end, and, as we’ll soon see, the final result would still leave plenty to be desired from a quality-control perspective.

As the clock was ticking down toward release, Origin’s legal team delivered the news that it probably wouldn’t be a good idea after all to call the game Wingleader — already the company’s second choice for a name — thanks to a number of existing trademarks on the similar “Wingman.” With little time to devote to yet another naming debate, Origin went with their consensus third choice of Wing Commander, which had lost only narrowly to Wingleader in the last vote. This name finally stuck. Indeed, today it’s hard to imagine Wing Commander under any other name.

The game was finished in a mad frenzy that stretched right up to the end; the “installation guide” telling how to get it running was written and typeset from scratch in literally the last five hours before the whole project had to be packed into a box and shipped off for duplication. That accomplished, everyone donned their new Wing Commander baseball caps and headed out to the front lawn for Origin’s traditional ship-day beer bash. There Robert Garriott climbed onto a picnic table to announce that all of Chris Roberts’s efforts in creating by far the most elaborate multimedia production Origin had ever released had been enough to secure him, at long last, an actual job at the company. “As of 5 P.M. this afternoon,” said Garriott, “Chris is Origin’s Director of New Technologies. Congratulations, Chris, and welcome to the Origin team.” The welcome was, everyone had to agree, more than a little belated.

We’ll turn back to Roberts’s later career at Origin in future articles. At this point, though, this history of the original Wing Commander must become the story of the people who played it rather than that of the people who created it. And, make no mistake, play it people did. Gamers rushed to embrace what had ever since that Summer CES show been the most anticipated title in the industry. Roberts has claimed that Wing Commander sold 100,000 copies in its first month, a figure that would stand as ridiculous if applied to just about any other computer game of the era, but which might just be ridiculous enough to be true in the case of Wing Commander. While hard sales figures for the game or the franchise it would spawn have never to my knowledge been made public, I can feel confident enough in saying that sales of the first Wing Commander soared into the many, many hundreds of thousands of units. The curse of Ultima was broken; Origin now had a game which had not just become a hit in spite of Ultima‘s long shadow but which threatened to do the unthinkable — to overshadow Ultima in their product catalog. Certainly all indications are that Wing Commander massively outsold Ultima VI, possibly by a factor of two to one or more. It would take a few years, until the release of Doom in 1993, for any other name to begin to challenge that of Wing Commander as the most consistent money spinner in American computer gaming.

But why should that have been? Why should this particular game of all others have become such a sensation? Part of the reason must be serendipitous timing. During the 1990s as in no decade before or since, the latest developments in hardware would drive sales of games that could show them off to best effect, and Wing Commander set the stage for this trend. Released at a time when 80386-based machines with expanded memory, sound cards, and VGA graphics were just beginning to enter American homes in numbers, Wing Commander took advantage of all those things like no other game on the market. It benefited enormously from this singularity among those who already owned the latest hardware setups, while causing yet many more jealous gamers who hadn’t heretofore seen a need to upgrade to invest in hot machines of their own — the kind of virtuous circle to warm any capitalist’s heart.

Yet there was also something more going on with Wing Commander than just a cool-looking game for showing off the latest hardware, else it would have suffered the fate of the slightly later bestseller Myst: that of being widely purchased, but very rarely actually played seriously. Unlike the coolly cerebral Myst, Wing Commander was a crowd-pleaser from top to bottom, with huge appeal, even beyond its spectacular audiovisuals, to anyone who had ever thrilled to the likes of a Star Wars film. It was, in other words, computerized entertainment for the mainstream rather than for a select cognoscenti. Just as all but the most incorrigible snobs could have a good time at a Star Wars showing, few gamers of any stripe could resist the call of Wing Commander. In an era when the lines of genre were being drawn more and more indelibly, one of the most remarkable aspects of Wing Commander‘s reception is the number of genre lines it was able to cross. Whether they normally preferred strategy games or flight simulators, CRPGs or adventures, everybody wanted to play Wing Commander.

At a glance, Chris Roberts’s gung-ho action movie of a game would seem to be rather unsuited for the readership of Computer Gaming World, a magazine that had been born out of the ashes of the tabletop-wargaming culture of the 1970s and was still beholden most of all to computer games in the old slow-paced, strategic grognard tradition. Yet the magazine and its readers loved Wing Commander. In fact, they loved Wing Commander as they had never loved any other game before. After reaching the number-one position in Computer Gaming World‘s readers’ poll in February of 1991, it remained there for an unprecedented eleven straight months, attaining already in its second month on top the highest aggregate score ever recorded for a game. When it was finally replaced at number one in January of 1992, the replacement was none other than the new Wing Commander II. Wing Commander I then remained planted right there behind its successor at number two until April, when the magazine’s editors, needing to make room for other games, felt compelled to “retire” it to their Hall of Fame.

In other places, the huge genre-blurring success of Wing Commander prompted an identity crisis. Shay Addams, adventure-game solver extraordinaire, publisher of the Questbusters newsletter and the Quest for Clues series of books, received so many requests to cover Wing Commander that he reported he had been “on the verge of scheduling a brief look” at it. But in the end, he had decided a little petulantly, it “is just a shoot-em-up-in-space game in which the skills necessary are vastly different from those required for completing a quest. (Then again, there is always the possibility of publishing Simulationbusters.)” The parenthetical may have sounded like a joke, but Addams apparently meant it seriously – or, at least, came to mean it seriously. The following year, he started publishing a sister newsletter to Questbusters called Simulations!. It’s hard to imagine him making such a decision absent the phenomenon that was Wing Commander.

So, there was obviously much more to Wing Commander than a glorified tech demo. If we hope to understand what its secret sauce might have been, we need to look at the game itself again, this time from the perspective of a player rather than a developer.

One possibility can be excised immediately. The “space combat simulation” part of the game — i.e., the game part of the game — is fun today and was graphically spectacular back in 1990, but it’s possessed of neither huge complexity nor the sort of tactical or strategic interest that would seem to be required of a title that hoped to spend eleven months at the top of the Computer Gaming World readers’ charts. Better graphics and embodied approach aside, it’s a fairly commonsense evolution of Elite‘s combat engine, complete with inertia and sounds in the vacuum of space and all the other space-fantasy trappings of Star Wars. If we hope to find the real heart of the game’s appeal, it isn’t here that we should look, but rather to the game’s fiction — to the movie Origin Systems built around Chris Roberts’s little shoot-em-up-in-space game.

Wing Commander casts you as an unnamed young pilot, square-jawed and patriotic, who has just been assigned to the strike carrier Tiger’s Claw, out on the front lines of humanity’s war against the vicious Kilrathi, a race of space-faring felines. (Cat lovers should approach this game with caution!) Over the course of the game, you fly a variety of missions in a variety of star systems, affecting the course of the wider war as you do so in very simple, hard-branching ways. Each mission is introduced via a briefing scene, and concluded, if you make it back alive, with a debriefing. (If you don’t make it back alive, you at least get the rare pleasure of watching your own funeral.) Between missions, you can chat with your fellow pilots and a friendly bartender in the Tiger’s Claw‘s officers lounge, play on a simulator in the lounge that serves as the game’s training mode, and keep track of your kill count along with that of the other pilots on the squadron blackboard. As you fly missions and your kill count piles up, you rise through the Tiger’s Claw‘s hierarchy from an untested rookie to the steely-eyed veteran on whom everyone else in your squadron depends. You also get the chance to fly several models of space-borne fighters, each with its own flight characteristics and weapons loadouts.

A mission briefing.

The inspirations for Wing Commander as a piece of fiction aren’t hard to find in either the game itself or the many interviews Chris Roberts has given about it over the years. Leaving aside the obvious influence of Star Wars on the game’s cinematic visuals, Wing Commander fits most comfortably into the largely book-bound sub-genre of so-called “military science fiction.” A tradition which has Robert Heinlein’s 1959 novel Starship Troopers as its arguable urtext, military science fiction is less interested in the exploration of strange new worlds, etc., than it is in the exploration of possible futures of warfare in space.

There isn’t much doubt where Wing Commander‘s historical inspiration lies.

Because worldbuilding is hard and extrapolating the nitty-gritty details of future modes of warfare is even harder, much military science fiction is built out of thinly veiled stand-ins for the military and political history of our own little planet. So, for example, David Weber’s long-running Honor Harrington series transports the Napoleonic Wars into space, while Joe Haldeman’s The Forever War — probably the sub-genre’s best claim to a work of real, lasting literary merit — is based largely on the author’s own experiences in Vietnam. Hewing to this tradition, Wing Commander presents a space-borne version of the grand carrier battles which took place in the Pacific during World War II — entirely unique events in the history of human warfare and, as this author can well attest, sheer catnip to any young fellow with a love of ships and airplanes and heroic deeds and things that go boom. Wing Commander shares this historical inspiration with another of its obvious fictional inspirations, the fun if terminally cheesy 1978 television series Battlestar Galactica. (Come to think of it, much the same description can be applied to Wing Commander.)

Sparkling conversationalists these folks aren’t.

Wing Commander is also like Battlestar Galactica in another respect: it’s not so much interested in constructing a detailed technological and tactical framework for its vision of futuristic warfare — leave that stuff to the books! — as it is in choosing whatever thing seems coolest at any given juncture. We know nothing really about how or why any of the stuff in the game works, just that it’s our job to go out and blow stuff up with it. Nowhere is that failing, if failing it be, more evident than in the very name of the game. “Wing Commander” is a rank in the Royal Air Force and the air forces of many Commonwealth nations, denoting an officer in charge of several squadrons of aircraft. It’s certainly not an appropriate designation for the role you play here, that of a rookie fighter pilot who commands only a single wingman. This Wing Commander is called Wing Commander strictly because it sounds cool.

In time, Origin’s decision to start hiring people to serve specifically in the role of writer would have a profound effect on the company’s games, but few would accuse this game, one of Origin’s first with an actual, dedicated “lead writer,” of being deathless fiction. To be fair to Jeff George, it does appear that he spent the majority of his time drawing up the game’s 40 missions, serving in a role that would probably be dubbed “scenario designer” or “level designer” today rather than “writer.” And it’s not as if Chris Roberts’s original brief gave him a whole lot to work with. This is, after all, a game where you’re going to war against a bunch of anthropomorphic house cats. (Our cat told me she thought about conquering the galaxy once or twice, but she wasn’t sure she could fit it into the three hours per day she spends awake.) The Kilrathi are kind of… well, there’s just no getting around it, is there? The whole Kilrathi thing is pretty stupid, although it does allow your fellow pilots to pile on epithets like “fur balls,” “fleabags,” and, my personal favorite, “Killie-cats.”

Said fellow pilots are themselves a collection of ethnic stereotypes so over-the-top that they would verge on the offensive if it weren’t so obvious that Origin just didn’t have a clue. Spirit is Japanese, so of course she suffixes every name with “-san” or “-sama” even when speaking English, right? And Angel is French, so of course she says “bonjour” a lot, right? Right?

My second favorite Wing Commander picture comes from the manual rather than the game proper. Our cat would look precisely this bitchy if I shoved her into a spacesuit.

Despite Chris Roberts’s obvious and oft-stated desire to put you into an interactive movie, there’s little coherent narrative arc to Wing Commander, even by action-movie standards. Every two to four missions, the Tiger’s Claw jumps to some other star system and some vague allusion is made to the latest offensive or defensive operation, but there’s nothing to really hang your hat on in terms of a clear unfolding narrative of the war. A couple of cut scenes do show good or bad events taking place elsewhere, based on your performance in battle — who knew one fighter pilot could have so much effect on the course of a war? — but, again, there’s just not enough detail to give a sense of the strategic situation. One has to suspect that Origin didn’t know what was really going on any better than the rest of us.

My favorite Wing Commander pictures, bar none. What I love best about these and the picture above is the ears on the helmets. And what I love best about the ears on the helmets is that there’s no apparent attempt to be cheeky or funny in placing them there. (One thing this game is totally devoid of is deliberate humor. Luckily, there’s plenty of non-deliberate humor to enjoy.) Someone at Origin said, “Well, they’re cats, so they have to have space in their helmets for their ears, right?” and everyone just nodded solemnly and went with it. If you ask me, nothing illustrates Wing Commander‘s charming naivete better than this.

In its day, Wing Commander was hugely impressive as a technological tour de force, but it’s not hard to spot the places where it really suffered from the compressed development schedule. There’s at least one place, for example, where your fellow pilots talk about an event that hasn’t actually happened yet, presumably due to last minute juggling of the mission order. More serious are the many and varied glitches that occur during combat, from sound drop-outs to the occasional complete lock-up. Most bizarrely of all to our modern sensibilities, Origin didn’t take the time to account for the speed of the computer running the game. Wing Commander simply runs flat-out all the time, as fast as the hosting computer can manage. This delivered a speed that was just about perfect on a top-of-the-line 80386-based machine of 1990, but that made it effectively unplayable on the next generation of 80486-based machines that started becoming popular just a couple of years later; this game was definitely not built with any eye to posterity. Wing Commander would wind up driving the development of so-called “slowdown” programs that throttled back later hardware to keep games like this one playable.

Still, even today Wing Commander remains a weirdly hard nut to crack in this respect. For some reason, presumably involving subtle differences between real and emulated hardware, it’s impossible to find an entirely satisfactory speed setting for the game in the DOSBox emulator. A setting which seems perfect when flying in open space slows down to a crawl in a dogfight; a setting which delivers a good frame rate in a dogfight is absurdly fast when fewer other ships surround you. The only apparent solution to the problem is to adjust the DOSBox speed settings on the fly as you’re trying not to get shot out of space by the Kilrathi — or, perhaps more practically, to just find something close to a happy medium and live with it. One quickly notices when reading about Wing Commander the wide variety of opinions about its overall difficulty, from those who say it’s too easy to those who say it’s way too hard to those who say it’s just right. I wonder whether this disparity is down to the fact that, thanks to the lack of built-in throttling, everyone is playing a slightly different version of the game.
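(For anyone who wants to experiment along these lines, the relevant knobs live in the [cpu] section of DOSBox’s dosbox.conf file. The numbers below are only an illustrative starting point for this sort of tuning, not an official or known-good recommendation, and the right value will vary from machine to machine; Ctrl+F11 and Ctrl+F12 nudge the cycle count down and up on the fly while the game is running.

[cpu]
core=normal
cputype=auto
# A fixed cycle count keeps the game from simply running flat-out on the host.
# The figure below is a guess to be tuned by ear, not a verified setting.
cycles=fixed 12000
# Step size used by the Ctrl+F12 (faster) and Ctrl+F11 (slower) hotkeys.
cycleup=500
cycledown=500

Whatever fixed value you settle on will still be a compromise between open space and crowded dogfights, for the reasons described above.)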

The only thing worse than being a cat lover in this game is being a pacifist. And everyone knows cats don’t like water, Shotglass… sheesh.

It becomes clear pretty quickly that the missions are only of a few broad types, encompassing patrols, seek-and-destroy missions, and escort missions (the worst!), but the context provided by the briefings keeps things more interesting than they might otherwise be, as do the variety of spacecraft you get to fly and fight against. The mission design is pretty good, although the difficulty does ebb and spike a bit more than it ideally might. In particular, one mission found right in the middle of the game — the second Kurosawa mission, for those who know the game already — is notorious for being all but impossible. Chris Roberts has bragged that the missions in the finished game “were exactly the ones that Jeff George designed on paper — we didn’t need to do any balancing at all!” In truth, I’m not sure the lack of balancing isn’t a bug rather than a feature.

Um, yes. I’m standing here, aren’t I? Should this really be a judgment call?

Roberts’s decision to allow you to take your lumps and go on even when you fail at a mission was groundbreaking at the time. Yet, having made this very progressive decision, he then proceeded to implement it in the most regressive way imaginable. When you fail in Wing Commander, the war as a whole goes badly, thanks again to that outsize effect you have upon it, and you get punished by being forced to fly against even more overwhelming odds in inferior fighters. Imagine, then, what it’s like to play Wing Commander honestly, without recourse to save games, as a brand new player. Still trying to get your bearings as a rookie pilot, you don’t perform terribly well in the first two or three missions. In response, your commanding officer delivers a constant drumbeat of negative feedback, while the missions just keep getting harder and harder at what feels like an almost exponential pace, ensuring that you continue to suck every time you fly. By the time you’ve failed at 30 missions and your ineptitude has led to the Tiger’s Claw being chased out of the sector with its (striped?) tail between its legs, you might just need therapy to recover from the experience.

What ought to happen, of course, is that failing at the early missions should see you assigned to easier rather than harder ones — no matter the excuse; Origin could make something up on the fly, as they so obviously did for so much of the game’s fiction — that give you a chance to practice your skills. Experienced, hardcore players could still have their fun by trying to complete the game in as few missions as possible, while newcomers wouldn’t have to feel like battered spouses. Or, if such an elegant solution wasn’t possible, Origin could at least have given us player-selectable difficulty levels.

As it is, the only practical way to play as a newcomer is to ignore all of Origin’s exhortations to play honestly and just keep reloading until you successfully complete each mission; only in this way can you keep the escalating difficulty manageable. (The one place where I would recommend that you take your lumps and continue is in the aforementioned second Kurosawa mission. Losing here will throw you briefly off-track, but the missions that follow aren’t too difficult, and it’s easier to play your way to victory through them than to try to beat Mission Impossible.) This approach, it should be noted, drove Chris Roberts crazy; he considered it nothing less than a betrayal of the entire premise around which he’d designed his game. Yet he had only himself to blame. Like much in Wing Commander, the discrepancy between the game Roberts wants to have designed and the one he’s actually designed speaks to the lack of time to play it extensively before its release, and thereby to shake all these problems out.

And yet. And yet…

Having complained at such length about Wing Commander, I find myself at something of an impasse, in that my overall verdict on the game is nowhere near as negative as these complaints would imply. It’s not even a case of Wing Commander being, like, say, most of the Ultima games, a groundbreaking work in its day that’s a hard sell today. No, Wing Commander is a game I continue to genuinely enjoy despite all its obvious problems.

In writing about all these old games over the years, I’ve noticed that those titles I’d broadly brand as classics and gladly recommend to contemporary players tend to fall into two categories. There are games like, say, The Secret of Monkey Island that know exactly what they’re trying to do and proceed to do it all almost perfectly, making all the right choices; it’s hard to imagine how to improve these games in any but the tiniest of ways within the context of the technology available to their developers. And then there are games like Wing Commander that are riddled with flaws, yet still manage to be hugely engaging, hugely fun, almost in spite of themselves. Who knows, perhaps trying to correct all the problems I’ve spent so many words detailing would kill something ineffably important in the game. Certainly the many sequels and spinoffs to the original Wing Commander correct many of the failings I’ve described in this article, yet I’m not sure any of them manage to be a comprehensively better game. Like so many creative endeavors, game design isn’t a zero-sum game. Much as I loathe the lazy critic’s cliche “more than the sum of its parts,” it feels hard to avoid it here.

It’s true that many of my specific criticisms have an upside to serve as a counterpoint. The fiction may be giddy and ridiculous, but it winds up being fun precisely because it’s so giddy and ridiculous. This isn’t a self-conscious homage to comic-book storytelling of the sort we see so often in more recent games from this Age of Irony of ours. No, this game really does think this stuff it’s got to share with you is the coolest stuff in the world, and it can’t wait to get on with it; it lacks any form of guile just as much as it does any self-awareness. In this as in so many other senses, Wing Commander exudes the personality of its creator, helps you to understand why it was that everyone at Origin Systems so liked to have this high-strung, enthusiastic kid around them. There’s an innocence about the game that leaves one feeling happy that Chris Roberts was steered away from his original plans for a “gritty” story full of moral ambivalence; one senses that he wouldn’t have been able to do that anywhere near as well as he does this. Even the Kilrathi enemies, silly as they are, take some of the sting out of war; speciesist though the sentiment may be, at least it isn’t people you’re killing out there. Darned if the fiction doesn’t win me over in the end with its sheer exuberance, all bright primary emotions to match the bright primary colors of the VGA palette. Sometimes you’re cheering along with it, sometimes you’re laughing at it, but you’re always having a good time. The whole thing is just too gosh-darned earnest to annoy me like most bad writing does.

Even the rogue’s gallery of ethnic stereotypes that is your fellow pilots doesn’t grate as much as it might. Indeed, Origin’s decision to include lots of strong, capable women and people of color among the pilots should be applauded. Whatever else you can say about Wing Commander, its heart is almost always in the right place.

Winning a Golden Sun for “surviving the destruction of my ship.” I’m not sure, though, that “sacrificing my vessel” was really an act of bravery, under the circumstances. Oh, well, I’ll take whatever hardware they care to give me.

One thing Wing Commander understands very well is the value of positive reinforcement — the importance of, as Sid Meier puts it, making sure the player is always the star of the show. In that spirit, the kill count of even the most average player will always advance much faster on the squadron’s leader board than that of anyone else in the squadron. As you play through the missions, you’re given promotions and occasionally medals, the latter delivered amidst the deafening applause of your peers in a scene lifted straight from the end of the first Star Wars film (which was in turn aping the Nuremberg Rally shown in Triumph of the Will, but no need to think too much about that in this giddy context). You know at some level that you’re being manipulated, just as you know the story is ridiculous, but you don’t really care. Isn’t this feeling of achievement a substantial part of the reason that we play games?

Another thing Wing Commander understands — or perhaps stumbled into accidentally thanks to the compressed development schedule — is the value of brevity. Thanks to the tree structure that makes it impossible to play all 40 missions on any given run-through, a typical Wing Commander career spans no more than 25 or 30 missions, most of which can be completed in half an hour or so, especially if you use the handy auto-pilot function to skip past all the point-to-point flying and just get to the places where the shooting starts. (Personally, I prefer the more organic feel of doing all the flying myself, but I suspect I’m a weirdo in this as in so many other respects.) The relative shortness of the campaign means that the game never threatens to run the flight engine’s rather limited box of tricks into the ground. It winds up leaving you wanting more rather than trying your patience. For all these reasons, and even with all its obvious problems, technical and otherwise, Wing Commander remains good fun today.

Which doesn't of course mean that any self-respecting digital antiquarian can afford to neglect its importance to gaming history. The first blockbuster of the 1990s, and the most commercially dominant franchise in computer gaming until Doom arrived in 1993 to shake everything up yet again, Wing Commander can be read as either cause or symptom of the changing times. There was a sense even in 1990 that Wing Commander‘s arrival, coming so appropriately at the beginning of a new decade, marked a watershed moment, and time has only strengthened that impression. Chris Crawford, this medium’s eternal curmudgeon — every creative field needs one of them to serve as a corrective to the hype-merchants — has accused Wing Commander of nothing less than ruining the culture of gaming for all time. By raising the bar so high on ludic audiovisuals, runs his argument, Wing Commander dramatically raised the financial investment necessary to produce a competitive game. This in turn made publishers, reluctant to risk all that capital on anything but a sure bet, more conservative in the sorts of projects they were willing to approve, causing more experimental games with only niche appeal to disappear from the market. “It became a hit-driven industry,” Crawford says. “The whole marketing strategy, economics, and everything changed, in my opinion, much for the worse.”

There’s some truth to this assertion, but it’s also true that publishers had been growing more conservative and budgets had been creeping upward for years before Wing Commander. By 1990, Infocom’s literary peak was years in the past, as were Activision’s experimental period and Electronic Arts’s speculations on whether computers could make you cry. In this sense, then, Wing Commander can be seen as just one more point on a trend line, not the dramatic break which Crawford would claim it to be. Had it not come along when it did to raise the audiovisual bar, something else would have.

Where Wing Commander does feel like a cleaner break with the past is in its popularizing of the use of narrative in a traditionally non-narrative-driven genre. This, I would assert, is the real source of the game’s appeal, then and now. The shock and awe of seeing the graphics and hearing the sound and music for the first time inevitably faded even back in the day, and today of course the whole thing looks garish and a little kitschy with those absurdly big pixels. And certainly the space-combat game alone wasn’t enough to sustain obsessive devotion back in the day, while today the speed issues can at times make it more than a little exasperating to actually play Wing Commander at all. But the appeal of, to borrow from Infocom’s old catch-phrase, waking up inside a story — waking up inside a Star Wars movie, if you like — and being swept along on a rollicking, semi-interactive ride is, it would seem, eternal. It may not have been the reason most people bought Wing Commander in the early 1990s — that had everything to do with those aforementioned spectacular audiovisuals — but it was the reason they kept playing it, the reason it remained the best single computer game in the country according to Computer Gaming World‘s readers for all those months. Come for the graphics and sound, stay for the story. The ironic aspect of all this is that, as I’ve already noted, Wing Commander‘s story barely qualified as a story at all by the standards of conventional fiction. Yet, underwhelming though it was on its own merits, it worked more than well enough in providing structure and motivation for the individual missions.

The clearest historical antecedent to Wing Commander must be the interactive movies of Cinemaware, which had struggled to combine cinematic storytelling with modes of play that departed from traditional adventure-game norms throughout the second half of the 1980s, albeit with somewhat mixed success. John Cutter, a designer at Cinemaware, has described how Bob Jacob, the company’s founder and president, reacted to his first glimpse of Wing Commander: “I don’t think I’ve ever seen him look so sad.” With his company beginning to fall apart around him, Jacob had good reason to feel sad. He least of all would have imagined Origin Systems — they of the aesthetically indifferent CRPG epics — as the company that would carry the flag of cinematic computer gaming forward into the new decade, but the proof was right there on the screen in front of him.

There are two accounts, both of them true in their way, to explain how the adventure game, a genre that in the early 1990s was perhaps the most vibrant and popular in computer gaming, ended the decade an irrelevancy to gamers and publishers alike. One explanation, which I’ve gone into a number of times already on this blog, focuses on a lack of innovation and, most of all, a lack of good design practices among far too many adventure developers; these failings left the genre identified primarily with unfun pixel hunts and illogical puzzles in the minds of far too many players. But another, more positive take on the subject says that adventure games never really went away at all: their best attributes were rather merged into other genres. Did adventure games disappear or did they take over the world? As in so many cases, the answer depends on your perspective. If you focus on the traditional mechanics of adventure games — exploring landscapes and solving puzzles, usually non-violently — as their defining attributes, the genre did indeed go from thriving to all but dying in the course of about five years. If, on the other hand, you choose to see adventure games more broadly as games where you wake up inside a story, it can sometimes seem like almost every game out there today has become, whatever else it is, an adventure game.

Wing Commander was the first great proof that many more players than just adventure-game fans love story. Players love the way a story can make them feel a part of something bigger as they play, and, more prosaically but no less importantly, they love the structure it can give to their play. One of the dominant themes of games in the 1990s would be the injection of story into genres which had never had much use for it before: the unfolding narrative of discovery built into the grand-strategy game X-Com, the campaign modes of the real-time-strategy pioneers Warcraft and Starcraft, the plot that gave meaning to all the shooting in Half-Life. All of these are among the most beloved titles of the decade, spawning franchises that remain more than viable to this day. One has to assume this isn’t a coincidence. “The games I made were always about narrative because I felt that was missing for me,” says Chris Roberts. “I wanted that sense of story and progression. I felt like I wasn’t getting that in games. That was one of my bigger drives when I was making games, was to get that, that I felt like I really wanted and liked from other media.” Clearly many others agreed.

(Sources: the books Wing Commander I and II: The Ultimate Strategy Guide by Mike Harrison and Game Design Theory and Practice by Richard Rouse III; Retro Gamer 59 and 123; Questbusters of July 1989, August 1990, and April 1991; Computer Gaming World of September 1989 and November 1992; Amiga Computing of December 1988. Online sources include documents hosted at the Wing Commander Combat Information Center, US Gamer‘s profile of Chris Roberts, The Escapist‘s history of Wing Commander, Paul Dean’s interview with Chris Roberts, and Matt Barton’s interview with George “The Fat Man” Sanger. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Wing Commander I and II can be purchased in a package together with all of their expansion packs from GOG.com.)

 
 


The Freedom to Associate

In 1854, an Austrian priest and physics teacher named Gregor Mendel sought and received permission from his abbot to plant a two-acre garden of pea plants on the grounds of the monastery at which he lived. Over the course of the next seven years, he bred together thousands upon thousands of the plants under carefully controlled circumstances, recording in a journal the appearance of every single offspring that resulted, as defined by seven characteristics: plant height, pod shape and color, seed shape and color, and flower position and color. In the end, he collected enough data to formulate the basis of the modern science of genetics, in the form of a theory of dominant and recessive traits passed down in pairs from generation to generation. He presented his paper on the subject, “Experiments on Plant Hybridization,” before the Natural History Society of Brünn in 1865, and saw it published in a poorly circulated scientific journal the following year.

And then came… nothing. For various reasons — perhaps due partly to the paper’s unassuming title, perhaps due partly to the fact that Mendel was hardly a known figure in the world of biology, undoubtedly due largely to the poor circulation of the journal in which it was published — few noticed it at all, and those who did dismissed it seemingly without grasping its import. Most notably, Charles Darwin, whose On the Origin of Species had been published while Mendel was in the midst of his own experiments, seems never to have been aware of the paper at all, thereby missing this key gear in the mechanism of evolution. Mendel was promoted to abbot of his monastery shortly after the publication of his paper, and the increased responsibilities of his new post ended his career as a scientist. He died in 1884, remembered as a quiet man of religion who had for a time been a gentleman dabbler in the science of botany.

But then, at the turn of the century, the German botanist Carl Correns stumbled upon Mendel’s work while conducting his own investigations into floral genetics, becoming in the process the first to grasp its true significance. To his huge credit, he advanced Mendel’s name as the real originator of the set of theories which he, along with one or two other scientists working independently, was beginning to rediscover. Correns effectively shamed those other scientists as well into acknowledging that Mendel had figured it all out decades before any of them even came close. It was truly a selfless act; today the name of Carl Correns is unknown except in esoteric scientific circles, while Gregor Mendel’s has been done the ultimate honor of becoming an adjective (“Mendelian”) and a noun (“Mendelism”) locatable in any good dictionary.

Vannevar Bush


So, all’s well that ends well, right? Well, maybe, but maybe not. Some 30 years after the rediscovery of Mendel’s work, an American named Vannevar Bush, dean of MIT’s School of Engineering, came to see the 35 years that had passed between the publication of Mendel’s theory and the affirmation of its importance as a troubling symptom of the modern condition. Once upon a time, all knowledge had been regarded as of a piece, and it had been possible for a great mind to hold within itself huge swathes of this collective knowledge of humanity, everything informing everything else. Think of that classic example of a Renaissance man, Leonardo da Vinci, who was simultaneously a musician, a physicist, a mathematician, an anatomist, a botanist, a geologist, a cartographer, an alchemist, an astronomer, an engineer, and an inventor. Most of all, of course, he was a great visual artist, but he used everything else he was carrying around in that giant brain of his to create paintings and drawings as technically meticulous as they were artistically sublime.

By Bush’s time, however, the world had long since entered the Age of the Specialist. As the sheer quantity of information in every field exploded, those who wished to do worthwhile work in any given field — even those people gifted with giant brains — were increasingly being forced to dedicate their intellectual lives entirely to that field and only that field, just to keep up. The intellectual elite were in danger of becoming a race of mole people, closeted one-dimensionals fixated always on the details of their ever more specialized trades, never on the bigger picture. And even then, the amount of information surrounding them was so vast, and existing systems for indexing and keeping track of it all so feeble, that they could miss really important stuff within their own specialties; witness the way the biologists of the late nineteenth century had missed Gregor Mendel’s work, and the 35-year head start it had cost the new science of genetics. “Mendel’s work was lost,” Bush would later write, “because of the crudity with which information is transmitted between men.” How many other major scientific advances were lying lost in the flood of articles being published every year, a flood that had increased by an order of magnitude just since Mendel’s time? “In this are thoughts,” wrote Bush, “certainly not often as great as Mendel’s, but important to our progress. Many of them become lost; many others are repeated over and over.” “This sort of catastrophe is undoubtedly being repeated all around us,” he believed, “as truly significant attainments become lost in the sea of the inconsequential.”

Bush’s musings were swept aside for a time by the rush of historical events. As the prospect of another world war loomed, he became President Franklin Delano Roosevelt’s foremost advisor on matters involving science and engineering. During the war, he shepherded through countless major advances in the technologies of attack and defense, culminating in the most fearsome weapon the world had ever known: the atomic bomb. It was actually this last that caused Bush to return to the seemingly unrelated topic of information management, a problem he now saw in a more urgent light than ever. Clearly the world was entering a new era, one with far less tolerance for the human folly, born of so much context-less mole-person ideology, that had spawned the current war.

Practical man that he was, Bush decided there was nothing for it but to roll up his sleeves and make a concrete proposal describing how humanity could solve the needle-in-a-haystack problem of the modern information explosion. Doing so must entail grappling with something as fundamental as “how creative men think, and what can be done to help them think. It is a problem of how the great mass of material shall be handled so that the individual can draw from it what he needs — instantly, correctly, and with utter freedom.”

As revolutionary manifestos go, Vannevar Bush’s “As We May Think” is very unusual in terms of both the man that wrote it and the audience that read it. Bush was no Karl Marx, toiling away in discontented obscurity and poverty. On the contrary, he was a wealthy upper-class patrician who was, as a member of the White House inner circle, about as fabulously well-connected as it was possible for a man to be. His article appeared first in the July 1945 edition of the Atlantic Monthly, hardly a bastion of radical thought. Soon after, it was republished in somewhat abridged form by Life, the most popular magazine on the planet. Thereby did this visionary document reach literally millions of readers.

With the atomic bomb still a state secret, Bush couldn’t refer directly to his real reasons for wanting so urgently to write down his ideas now. Yet the dawning of the atomic age nevertheless haunts his article.

It is the physicists who have been thrown most violently off stride, who have left academic pursuits for the making of strange destructive gadgets, who have had to devise new methods for their unanticipated assignments. They have done their part on the devices that made it possible to turn back the enemy, have worked in combined effort with the physicists of our allies. They have felt within themselves the stir of achievement. They have been part of a great team. Now, as peace approaches, one asks where they will find objectives worthy of their best.

Seen in one light, Bush’s essay is similar to many of those that would follow from other Manhattan Project alumni during the uncertain interstitial period between the end of World War II and the onset of the Cold War. Bush was like many of his colleagues in feeling the need to advance a utopian agenda to counter the apocalyptic potential of the weapon they had wrought, in needing to see the ultimate evil that was the atomic bomb in almost paradoxical terms as a potential force for good that would finally shake the world awake.

Bush was true to his engineer’s heart, however, in basing his utopian vision on technology rather than politics. The world was drowning in information, making the act of information synthesis — intradisciplinary and interdisciplinary alike — ever more difficult.

The difficulty seems to be, not so much that we publish unduly in view of the extent and variety of present-day interests, but rather that publication has been extended far beyond our present ability to make real use of the record. The summation of human experience is being expanded at a prodigious rate, and the means we use for threading through the consequent maze to the momentarily important item is the same as was used in the days of square-rigged ships.

Our ineptitude in getting at the record is largely caused by the artificiality of systems of indexing. When data of any sort are placed in storage, they are filed alphabetically or numerically, and information is found (when it is) by tracing it down from subclass to subclass. It can be in only one place, unless duplicates are used; one has to have rules as to which path will locate it, and the rules are cumbersome. Having found one item, moreover, one has to emerge from the system and reenter on a new path.

The human mind does not work that way. It operates by association. With one item in its grasp, it snaps instantly to the next that is suggested by the association of thoughts, in accordance with some intricate web of trails carried by the cells of the brain. It has other characteristics, of course; trails that are not frequently followed are prone to fade, items are not fully permanent, memory is transitory. Yet the speed of action, the intricacy of trails, the detail of mental pictures, is awe-inspiring beyond all else in nature.

Man cannot hope fully to duplicate this mental process artificially, but he certainly ought to be able to learn from it. In minor ways he may even improve it, for his records have relative permanency. The first idea, however, to be drawn from the analogy concerns selection. Selection by association, rather than indexing, may yet be mechanized. One cannot hope thus to equal the speed and flexibility with which the mind follows an associative trail, but it should be possible to beat the mind decisively in regard to the permanence and clarity of the items resurrected from storage.

Bush was not among the vanishingly small number of people who were working in the nascent field of digital computing in 1945. His “memex,” the invention he proposed to let an individual free-associate all of the information in her personal library, was more steampunk than cyberpunk, all whirring gears, snickering levers, and whooshing microfilm strips. But really, those things are just details; he got all of the important stuff right. I want to quote some more from “As We May Think,” and somewhat at length at that, because… well, because its vision of the future is just that important. This is how the memex should work:

When the user is building a trail, he names it, inserts the name in his code book, and taps it out on his keyboard. Before him are the two items to be joined, projected onto adjacent viewing positions. At the bottom of each there are a number of blank code spaces, and a pointer is set to indicate one of these on each item. The user taps a single key, and the items are permanently joined. In each code space appears the code word. Out of view, but also in the code space, is inserted a set of dots for photocell viewing; and on each item these dots by their positions designate the index number of the other item.

Thereafter, at any time, when one of these items is in view, the other can be instantly recalled merely by tapping a button below the corresponding code space. Moreover, when numerous items have been thus joined together to form a trail, they can be reviewed in turn, rapidly or slowly, by deflecting a lever like that used for turning the pages of a book. It is exactly as though the physical items had been gathered together from widely separated sources and bound together to form a new book. It is more than this, for any item can be joined into numerous trails.

The owner of the memex, let us say, is interested in the origin and properties of the bow and arrow. Specifically he is studying why the short Turkish bow was apparently superior to the English long bow in the skirmishes of the Crusades. He has dozens of possibly pertinent books and articles in his memex. First he runs through an encyclopedia, finds an interesting but sketchy article, leaves it projected. Next, in a history, he finds another pertinent item, and ties the two together. Thus he goes, building a trail of many items. Occasionally he inserts a comment of his own, either linking it into the main trail or joining it by a side trail to a particular item. When it becomes evident that the elastic properties of available materials had a great deal to do with the bow, he branches off on a side trail which takes him through textbooks on elasticity and tables of physical constants. He inserts a page of longhand analysis of his own. Thus he builds a trail of his interest through the maze of materials available to him.

And his trails do not fade. Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own memex, there to be linked into the more general trail.

Wholly new forms of encyclopedias will appear, ready-made with a mesh of associative trails running through them, ready to be dropped into the memex and there amplified. The lawyer has at his touch the associated opinions and decisions of his whole experience, and of the experience of friends and authorities. The patent attorney has on call the millions of issued patents, with familiar trails to every point of his client’s interest. The physician, puzzled by a patient’s reactions, strikes the trail established in studying an earlier similar case, and runs rapidly through analogous case histories, with side references to the classics for the pertinent anatomy and histology. The chemist, struggling with the synthesis of an organic compound, has all the chemical literature before him in his laboratory, with trails following the analogies of compounds, and side trails to their physical and chemical behavior.

The historian, with a vast chronological account of a people, parallels it with a skip trail which stops only on the salient items, and can follow at any time contemporary trails which lead him all over civilization at a particular epoch. There is a new profession of trail blazers, those who find delight in the task of establishing useful trails through the enormous mass of the common record. The inheritance from the master becomes, not only his additions to the world’s record, but for his disciples the entire scaffolding by which they were erected.

Ted Nelson


There is no record of what all those millions of Atlantic Monthly and Life readers made of Bush’s ideas in 1945 — or for that matter if they made anything of them at all. In the decades that followed, however, the article became a touchstone of the burgeoning semi-underground world of creative computing. Among its discoverers was Ted Nelson, who is, depending on whom you talk to, either one of the greatest visionaries in the history of computing or one of the greatest crackpots — or, quite possibly, both. Born in 1937 to a Hollywood director and his actress wife, then raised by his wealthy and indulgent grandparents following the inevitable Hollywood divorce, Nelson would see his life largely defined by what Gary Wolf, in his classic profile for Wired magazine, called his “aversion to finishing.” As in, finishing anything at all, or just the concept of finishing in the abstract. Well into middle age, he would be diagnosed with attention-deficit disorder, an alleged malady he came to celebrate as his “hummingbird mind.” This condition perhaps explains why he was so eager to find a way of forging permanent, retraceable associations among all the information floating around inside and outside his brain.

Nelson coined the terms “hypertext” and “hypermedia” at some point during the early 1960s, when he was a graduate student at Harvard. (Typically, he got a score of Incomplete in the course for which he invented them, not to mention an Incomplete on his PhD as a whole.) While they’re widely used all but interchangeably today, in Nelson’s original formulation the former term was reserved for purely textual works, the latter for those incorporating other forms of media, like images and sound. But today we’ll just go with the modern flow, call them all hypertexts, and leave it at that. In his scheme, then, hypertexts were texts capable of being “zipped” together with other hypertexts, memex-like, wherever the reader or writer wished to preserve associations between them. He presented his new buzzwords to the world at a conference of the Association for Computing Machinery in 1965, to little impact. Nelson, possessed of a loudly declamatory style of discourse and all the rabble-rousing fervor of a street-corner anarchist, would never be taken all that seriously by the academic establishment.

Instead, it being the 1960s and all, he went underground, embracing computing’s burgeoning counterculture. His eventual testament, one of the few things he ever did manage to complete — after a fashion, at any rate — was a massive 1200-page tome called Computer Lib/Dream Machines, self-published in 1974, just in time for the heyday of the Altair and the Homebrew Computer Club, whose members embraced Nelson as something of a patron saint. As the name would indicate, Computer Lib/Dream Machines was actually two separate books, bound back to back. Theoretically, Computer Lib was the more grounded volume, full of practical advice about gaining access to and using computers, while Dream Machines was full of the really out-there ideas. In practice, though, they were often hard to distinguish. Indeed, it was hard to even find anything in the books, which were published as mimeographed facsimile copies filled with jotted marginalia and cartoons drafted in Nelson’s shaky hand, with no table of contents or page numbers and no discernible organizing principle beyond the stream of consciousness of Nelson’s hummingbird mind. (I trust that the irony of a book concerned with finding new organizing principles for information itself being such an impenetrable morass is too obvious to be worth belaboring further.) Nelson followed Computer Lib/Dream Machines with 1981’s Literary Machines, a text written in a similar style that dwelt, when it could be bothered, at even greater length on the idea of hypertext.

The most consistently central theme of Nelson’s books, to whatever extent one could be discerned, was an elaboration of the hypertext concept he called Xanadu, after the pleasure palace in Samuel Taylor Coleridge’s poem “Kubla Khan.” The product of an opium-fueled hallucination, the 54-line poem is a mere fragment of a much longer work Coleridge had intended to write. Problem was, in the course of writing down the first part of his waking dream he was interrupted; by the time he returned to his desk he had simply forgotten the rest.

So, Nelson’s Xanadu was intended to preserve information that would otherwise be lost, which goal it would achieve through associative linking on a global scale. Beyond that, it was almost impossible to say precisely what Xanadu was or wasn’t. Certainly it sounds much like the World Wide Web to modern ears, but Nelson insists adamantly that the web is a mere bad implementation of the merest shadow of his full idea. Xanadu has been under allegedly active development since the late 1960s, making it the most long-lived single project in the history of computer programming, and by far history’s most legendary piece of vaporware. As of this writing, the sum total of all those years of work is a set of web pages written in Nelson’s inimitable declamatory style, littered with angry screeds against the World Wide Web, along with some online samples that either don’t work quite right or are simply too paradigm-shattering for my poor mind to grasp.

In my own years on this planet, I’ve come to reserve my greatest respect for people who finish things, a judgment which perhaps makes me less than the ideal critic of Ted Nelson’s work. Nevertheless, even I can recognize that Nelson deserves huge credit for transporting Bush’s ideas to their natural habitat of digital computers, for inventing the term “hypertext,” for defining an approach to links (or “zips”) in a digital space, and, last but far from least, for making the crucial leap from Vannevar Bush’s concept of the single-user memex machine to an interconnected global network of hyperlinks.

But of course ideas, of which both Bush and Nelson had so many, are not finished implementations. During the 1960s, 1970s, and early 1980s, there were various efforts — in addition, that is, to the quixotic effort that was Xanadu — to wrestle at least some of the concepts put forward by these two visionaries into concrete existence. Yet it wouldn’t be until 1987 that a corporation with real financial resources and real commercial savvy would at last place a reasonably complete implementation of hypertext before the public. And it all started with a frustrated programmer looking for a project.

Steve Jobs and Bill Atkinson


Had he never had anything to do with hypertext, Bill Atkinson’s place in the history of computing would still be assured. Coming to Apple Computer in 1978, when the company was only about eighteen months removed from that famous Cupertino garage, Atkinson was instrumental in convincing Steve Jobs to visit the Xerox Palo Alto Research Center, thereby setting in motion the chain of events that would lead to the Macintosh. A brilliant programmer by anybody’s measure, he eventually wound up on the Lisa team. He wrote the routines to draw pixels onto the Lisa’s screen — routines on which, what with the Lisa being a fundamentally graphical machine whose every display was bitmapped, every other program depended. Jobs was so impressed by Atkinson’s work on what he named LisaGraf that he recruited him to port his routines over to the nascent Macintosh. Atkinson’s routines, now dubbed QuickDraw, would remain at the core of MacOS for the next fifteen years. But Atkinson’s contribution to the Mac went yet further: after QuickDraw, he proceeded to design and program MacPaint, one of the two applications included with the finished machine, and one that’s still justifiably regarded as a little marvel of intuitive user-interface design.

Atkinson’s work on the Mac was so essential to the machine’s success that shortly after its release he became just the fourth person to be named an Apple Fellow — an honor that carried with it, implicitly if not explicitly, a degree of autonomy for the recipient in the choosing of future projects. The first project that Atkinson chose for himself was something he called the Magic Slate, based on a gadget called the Dynabook that had been proposed years ago by Xerox PARC alum (and Atkinson’s fellow Apple Fellow) Alan Kay: a small, thin, inexpensive handheld computer controlled via a touch screen. It was, as anyone who has ever seen an iPhone or iPad will attest, a prescient project indeed, but also one that simply wasn’t realizable using mid-1980s computer technology. Having been convinced of this at last by his skeptical managers after some months of flailing,  Atkinson wondered if he might not be able to create the next best thing in the form of a sort of software version of the Magic Slate, running on the Macintosh desktop.

In a way, the Magic Slate had always had as much to do with the ideas of Bush and Nelson as it did with those of Kay. Atkinson had envisioned its interface as a network of “pages” which the user navigated among by tapping links therein — a hypertext in its own right. Now he transported the same concept to the Macintosh desktop, whilst making his metaphorical pages into metaphorical stacks of index cards. He called the end result, the product of many months of design and programming, “Wildcard.” Later, when the trademark “Wildcard” proved to be tied up by another company, it turned into “HyperCard” — a much better name anyway in my book.

By the time he had HyperCard in some sort of reasonably usable shape, Atkinson was all but convinced that he would have to either sell the thing to some outside software publisher or start his own company to market it. With Steve Jobs now long gone and with him much of the old Jobsian spirit of changing the world through better computing, Apple was heavily focused on turning the Macintosh into a practical business machine. The new, more sober mood in Cupertino — not to mention Apple’s more buttoned-down public image — would seem to indicate that they were hardly up for another wide-eyed “revolutionary” product. It was Alan Kay, still kicking around Cupertino puttering with this and that, who convinced Atkinson to give CEO John Sculley a chance before he took HyperCard elsewhere. Kay brokered a meeting between Sculley and Atkinson, in which the latter was able to personally demonstrate to the former what he’d been working on all these months. Much to Atkinson’s surprise, Sculley loved HyperCard. Apparently at least some of the old Jobsian fervor was still alive and well after all inside Apple’s executive suite.

At its most basic, a HyperCard stack to modern eyes resembles nothing so much as a PowerPoint presentation, albeit one which can be navigated non-linearly by tapping links on the slides themselves. Just as in PowerPoint, the HyperCard designer could drag and drop various forms of media onto a card. Taken even at this fairly superficial level, HyperCard was already a full-fledged hypertext-authoring (and hypertext-reading) tool — by no means the first specimen of its kind, but the first with the requisite combination of friendliness, practicality, and attractiveness to make it an appealing environment for the everyday computer user. One of Atkinson’s favorite early demo stacks had many cards with pictures of people wearing hats. If you clicked on a hat, you were sent to another card showing someone else wearing a hat. Ditto for other articles of fashion. It may sound banal, but this really was revolutionary, organization by association in action. Indeed, one might say that HyperCard was Vannevar Bush’s memex, fully realized at last.

But the system showed itself to have much, much more to offer when the author started to dig into HyperTalk, the included scripting language. All sorts of logic, simple or complex, could be accomplished by linking scripts to clicks on the surface of the cards. At this level, HyperCard became an almost magical tool for some types of game development, as we’ll see in future articles. It was also a natural fit for many other applications: information kiosks, interactive tutorials, educational software, expert systems, reference libraries, etc.
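To make the idea concrete, here is a minimal sketch of the sort of HyperTalk handler a stack author might attach to a button; the card and field names are invented for this illustration rather than drawn from any real stack. Clicking the button runs the script, which can record a bit of data and then follow an associative link elsewhere in the stack:

    on mouseUp
      -- runs when the user clicks the button this handler is attached to
      put the short date into card field "Last Visited"
      -- follow an associative link to another card in the same stack
      go to card "Impressionist Landscapes"
    end mouseUp

Handlers like this could be attached not just to buttons but to fields, cards, backgrounds, and the stack itself, which is what let HyperCard grow from a clickable card-viewer into a genuine development environment.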

HyperCard in action


John Sculley himself premiered HyperCard at the August 1987 MacWorld show. Showing unusual largess in his determination to get HyperCard into the hands of as many people as possible as quickly as possible, he announced that henceforward all new Macs would ship with a free copy of the system, while existing owners could buy copies for their machines for just $49. He called HyperCard the most important product Apple had released during his tenure there. Considering that Sculley had also been present for the launch of the original Macintosh, this was certainly saying something. And yet he wasn’t clearly in the wrong either. As important as the Macintosh, the realization in practical commercial form of the computer-interface paradigms pioneered at Xerox PARC during the 1970s, has been to our digital lives of today, the concept of associative indexing — hyperlinking — has proved at least as significant. But then, the two do go together like strawberries and cream, the point-and-click paradigm providing the perfect way to intuitively navigate through a labyrinth of hyperlinks. It was no coincidence that an enjoyable implementation of hypertext appeared first on the Macintosh; the latter almost seemed a prerequisite for the former.

The full revolutionary nature of the concept of hypertext was far from easy to get across in advertising copy, but Apple gave it a surprisingly good go, paying due homage to Vannevar Bush in the process.


In the wake of that MacWorld presentation, a towering tide of HyperCard hype rolled from one side of the computer industry to the other, out into the mainstream media, and then back again, over and over. Hypertext’s time had finally come. In 1985, it was an esoteric fringe concept known only to academics and a handful of hackers, being treated at real length and depth in print only in Ted Nelson’s own sprawling, well-nigh impenetrable tomes. Four years later, every bookstore in the land sported a shelf positively groaning with trendy paperbacks advertising hypertext this and hypertext that. By then the curmudgeons had also begun to come out in force, always a sure sign that an idea has truly reached critical mass. Presentations showed up in conference catalogs with snarky titles like “Hypertext: Will It Cook Me Breakfast Too?”

The curmudgeons had plenty of rabid enthusiasm to push back against. HyperCard, even more so than the Macintosh itself, had a way of turning the most sober-minded computing veterans into starry-eyed fanatics. Jan Lewis, a longtime business-computing analyst, declared that “HyperCard is going to revolutionize the way computing is done, and possibly the way human thought is done.” Throwing caution to the wind, she abandoned her post at InfoWorld to found HyperAge, the first magazine dedicated to the revolution. “There’s a tremendous demand,” she said. “If you look at the online services, the bulletin boards, the various ad hoc meetings, user groups — there is literally a HyperCulture developing, almost a cult.” To judge from her own impassioned statements, she should know. She recruited Ted Nelson himself — one of the HyperCard holy trinity of Bush, Nelson, and Atkinson — to write a monthly column.

HyperCard effectively amounted to an entirely new computing platform that just happened to run atop the older platform that was the Macintosh. As Lewis noted, user-created HyperCard stacks — this new platform’s word for “programs” or “software” — were soon being traded all over the telecommunications networks. The first commercial publisher to jump into the HyperCard game was, somewhat surprisingly, Mediagenic.1 Bruce Davis, Mediagenic’s CEO, has hardly gone down into history as a paradigm of progressive thought in the realms of computer games and software in general, but he defied his modern reputation in this one area at least by pushing quickly and aggressively into “stackware.” One of the first examples of same that Mediagenic published was Focal Point, a collection of business and personal-productivity tools written by one Danny Goodman, who was soon to publish a massive bible called The Complete HyperCard Handbook, thus securing for himself the mantle of the new ecosystem’s go-to programming guru. Focal Point was a fine demonstration that just about any sort of software could be created by the sufficiently motivated HyperCard programmer. But it was another early Mediagenic release, City to City, that was more indicative of the system’s real potential. It was a travel guide to most major American cities — an effortlessly browsable and searchable guide to “the best food, lodgings, and other necessities” to be found in each of the metropolises in its database.

City to City


Other publishers — large, small, and just starting out — followed Mediagenic’s lead, releasing a bevy of fascinating products. The people behind The Whole Earth Catalog — themselves the inspiration for Ted Nelson’s efforts in self-publication — converted their current edition into a HyperCard stack filling a staggering 80 floppy disks. A tiny company called Voyager combined HyperCard with a laser-disc player — a very common combination among ambitious early HyperCard developers — to offer an interactive version of the National Gallery of Art which could be explored using such associative search terms as “Impressionist landscapes with boats.” Culture 1.0 let you explore its namesake through “3700 years of Western history — over 200 graphics, 2000 hypertext links, and 90 essays covering topics from the Black Plague to Impressionism,” all on just 7 floppy disks. Mission: The Moon, from the newly launched interactive arm of ABC News, gathered together details of every single Mercury, Gemini, and Apollo mission, including videos of each mission hosted on a companion laser disc. A professor of music converted his entire Music Appreciation 101 course into a stack. The American Heritage Dictionary appeared as stackware. And lots of what we might call “middlestackware” appeared to help budding programmers with their own creations: HyperComposer for writing music in HyperCard, Take One for adding animations to cards.

Just two factors were missing from HyperCard to allow hypertext to reach its full potential. One was a storage medium capable of holding lots of data, to allow for truly rich multimedia experiences, combining the lavish amounts of video, still pictures, music, sound, and of course text that the system clearly cried out for. Thankfully, that problem was about to be remedied via a new technology which we’ll be examining in my very next article.

The other problem was a little thornier, and would take a little longer to solve. For all its wonders, a HyperCard stack was still confined to the single Macintosh on which it ran; there was no provision for linking between stacks running on entirely separate computers. In other words, one might think of a HyperCard stack as equivalent to a single web site running locally off a single computer’s hard drive, without the ability to field external links alongside its internal links. Thus the really key component of Ted Nelson’s Xanadu dream, that of a networked hypertext environment potentially spanning the entire globe, remained unrealized. In 1990, Bill Nisen, the developer of a hypertext system called Guide that slightly predated HyperCard but wasn’t as practical or usable, stated the problem thus:

The one thing that is precluding the wide acceptance of hypertext and hypermedia is adequate broadcast mechanisms. We need to find ways in which we can broadcast the results of hypermedia authoring. We’re looking to in the future the ubiquitous availability of local-area networks and low-cost digital-transmission facilities. Once we can put the results of this authoring into the hands of more users, we’re going to see this industry really explode.

Already at the time Nisen made that statement, a British researcher named Tim Berners-Lee had started to experiment with something he called the Hypertext Transfer Protocol. The first real web site, the beginning of the World Wide Web, would go online in 1991. It would take a few more years even from that point, but a shared hypertextual space of a scope and scale the likes of which few could imagine was on the way. The world already had its memex in the form of HyperCard. Now — and although this equivalency would scandalize Ted Nelson — it was about to get its Xanadu.

Associative indexing permeates our lives so thoroughly today that, as with so many truly fundamental paradigm shifts, the full scope of the change it has wrought can be difficult to fully appreciate. A century ago, education was still largely an exercise in retention: names, dates, Latin verb cognates. Today’s educational institutions  — at least the more enlightened ones — recognize that it’s more important to teach their pupils how to think than it is to fill their heads with facts; facts, after all, are now cheap and easy to acquire when you need them. That such a revolution in the way we think about thought happened in just a couple of decades strikes me as incredible. That I happened to be present to witness it strikes me as amazing.

What I’ve witnessed has been a revolution in humanity’s relationship to information itself that’s every bit as significant as any political revolution in history. Some Singularity proponents will tell you that it marks the first step on the road to a vast worldwide consciousness. But even if you choose not to go that far, the ideas of Vannevar Bush and Ted Nelson are still with you every time you bring up Google. We live in a world in which much of the sum total of human knowledge is available over an electronic connection found in almost every modern home. This is wondrous. Yet what’s still more wondrous is the way that we can find almost any obscure fact, passage, opinion, or idea we like from within that mass, thanks to selection by association. Mama, we’re all cyborgs now.

(Sources: the books Hackers: Heroes of the Computer Revolution and Insanely Great: The Life and Times of the Macintosh, the Computer That Changed Everything by Steven Levy; Computer Lib/Dream Machines and Literary Machines by Ted Nelson; From Memex to Hypertext: Vannevar Bush and the Mind’s Machine, edited by James M. Nyce and Paul Kahn; The New Media Reader, edited by Noah Wardrip-Fruin and Nick Montfort; Multimedia and Hypertext: The Internet and Beyond by Jakob Nielsen; The Making of the Atomic Bomb by Richard Rhodes. Also the June 1995 Wired magazine profile of Ted Nelson; Andy Hertzfeld’s website Folklore; and the Computer Chronicles television episodes entitled “HyperCard,” “MacWorld Special 1988,” “HyperCard Update,” and “Hypertext.”)


  1. Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article. 

 


 


Moving to California

The work week of May 1, 1989, started off much like any other inside the beleaguered latter-day Infocom. In the cavernous 18,000 square feet of their office space at 125 CambridgePark Drive — its sheer size was an ever-present reminder of more optimistic times, when Infocom had thought themselves poised to become the next Lotus — the shrunken staff of just 26 souls puttered through another Monday, pausing now and again to chat about the weekend just passed. The old days when CambridgePark would buzz during off-hours with parties and socializing and passionate programmers and testers burning the midnight oil were now a memory of the past. Changing life circumstances — the majority of the remaining staff were now married, many with small children — had done as much as the generalized malaise now afflicting the place to put an end to all that. CambridgePark now felt much like any other office, albeit a peculiarly empty one, and one over which hung an almost palpable sense of impending doom. Still, when the axe finally fell it came as a shock. It always does.

A memo went out early that week asking everyone to attend a meeting on Thursday, May 4, “to discuss the next generation of internal products.” More ominously, the memo said that the 3:00 P.M. meeting would “go as late as necessary.” And evidently management expected that to mean quite late, for they would be “ordering out for dinner.”

The axe fell over the course of that long afternoon and evening. Infocom would be “moving” to California, where it was to be reconstituted and re-imagined as a more closely coupled subsidiary of Mediagenic,1 under a “general manager” named Rob Sears. Just 11 of the 26 current employees were offered positions at this new version of Infocom. Exactly whose name was and wasn’t on that list of job offers is neither necessary nor appropriate to discuss here. Suffice to say that those Mediagenic decided were desirable to retain often weren’t the pivotal creative voices you might expect, and that only 5 of the 11 accepted the offer anyway. Only one long-serving employee from Infocom’s glory days would end up making the move: Duncan Blanchard, a longstanding interpreter programmer and the last leader of the old Micro Group before it was assimilated into the Systems Group in 1987. For the other old-timers, it was all over. Another six weeks or so to finish a few final projects and tidy up the place, and that would be that.

Bob Bates, working on his licensed Abyss game from suburban Maryland, had planned to fly up to Cambridge for one of his regular design meetings on Monday, May 8. But Infocom’s new Mediagenic-installed head Joe Ybarra called him early in the week of May 1, saying he really needed him to come up this same week if at all possible. When Bates arrived on Friday, May 5, to a curiously subdued CambridgePark, he was ushered immediately into Ybarra’s office. Infocom was moving to California without most of its current employees, Ybarra informed him, and his Abyss project was being cancelled. Nor would Infocom be requiring Bates’s services again; his development contract was officially terminated as of today. When a shell-shocked Bates returned home on the red eye that same rainy night, he found that his roof was leaking buckets. It had turned into that sort of week for everyone.

Steve Meretzky had been scheduled to attend the Computer Game Developers’ Conference that very weekend in Sunnyvale, California. He was still allowed to fly out on Infocom’s dime, but replaced the company’s name on his badge with “Make Me an Offer!” It was at this event that word of the fate of Infocom, which everyone knew had long been troubled but which still remained one of the most respected names in computer games, was first spread within the industry.

News of Infocom’s fate first reached the world at large via an announcement in the May 22, 1989, issue of the Boston Globe Magazine. The understated headline has become oddly iconic among fans: “Computer-Games Firm Moving to California.” A “new consumer preference for games with graphics and sound,” went the workmanlike report, was responsible for Infocom’s travails, along with Nintendo and “the aging of Infocom’s traditional audience, composed of early computer users who spent evenings and weekends hunched over a terminal drawing maps in text-only games that took 20 to 50 hours to solve.”

When word reached the trade press, Mediagenic held tightly to the story that this was simply a move, not a shutdown. Rob Sears made the counter-intuitive claim that Mediagenic was doing what they were doing “not so much to close Infocom down as to ensure it survives.” “The Great Underground Empire, curiously enough, has not been shut down,” insisted Joe Ybarra. “What’s happened is we’re in the process of relocating it to the West Coast.” At the same time, though, Ybarra did have to quietly admit that none of the Imps who had built the Great Underground Empire would remain a part of it. He could only offer some unconvincingly vague suggestions that some of the former Imps might “do projects” at some point as outside contractors. Certainly anyone wedded to the idea of Infocom as a maker first and foremost of text adventures was given little reason for hope.

You’ll probably see a shift in direction that’s commensurate with which way the market is headed. If you look at all the successful products, they’re graphics- and sound-intensive. Products as a whole are pushing more toward role-playing than toward our classic adventure game. I think we’ll be building more hybrids that share elements of all these different genres. In particular, one of the areas I find most exciting is getting into more interactive graphics, the idea of doing things that are object-oriented… a cross between Manhole and the HyperCard environment and our traditional object-oriented ZIL environment.

(In case Ybarra’s comments don’t make it clear, know that “object-oriented” was one of the sexiest buzzwords of the period, to be applied to anything and everything possible.)

The personnel inside CambridgePark continued to perform their duties in desultory fashion during those final weeks following the meeting that informed them of their fate. There was still plenty to do; Infocom had still not delivered finalized versions of their four most recent works of graphical interactive fiction for MS-DOS, the most important platform in the industry. Yet there was, understandably, little enthusiasm for doing it. Employees spent a lot of time picking out free games from the collection around the office, bidding on the office furniture and computers, and indulging their black humor via vehicles like a lunchtime “slideshow history of Infocom” entitled “Cornerstone through Tombstone.” And then the last day came, and the lights inside CambridgePark were extinguished forever — or at least until the next corporate tenant arrived.

By the time of that final closure, the people of the former Infocom had begun a considerable amount of back-channel sniping at Mediagenic. Not coincidentally, Mediagenic’s own take on recent events also became less sanguine. Sources from Infocom claimed that Mediagenic had pulled the plug just as the money spigots were about to open, just before the all-important MS-DOS versions of their graphical interactive fictions finally hit the market; as it was, these versions would all be released by Mediagenic as un-promoted afterthoughts within weeks of the closure. Mediagenic, for whom Infocom’s slow progress on their MS-DOS interpreter had been a huge frustration and a significant factor in their decision to finally wash their hands of CambridgePark altogether, replied that “the consolidation might not have become necessary if the IBM SKUs could have been released initially.” Likewise, Joe Ybarra’s characterization of the fundamental failings of Infocom’s games grew more pointed: “We cannot continue, in the marketplace, living off products that take eight hours to play well and up to 200 hours to complete.”

The view that prevails universally today, of the decision of May 4, 1989, as a definitive ending rather than a move or consolidation, was already taking hold. Mediagenic stopped giving even lip service to Infocom as an ongoing operation of its own in the spring of 1990, when Rob Sears left and the remaining handful of personnel who had worked under him were either let go or absorbed into the parent company. From then on, Infocom would be a mere label under which Mediagenic would release some of their more narrative-oriented games.

In the long run, the people who had made up the old Infocom would all be just fine. After all, they were one hell of an impressive group, with credentials and talents that made them eminently employable. For those stalwarts in positions of business or creative leadership, who had been forced to bear up under the ever more crushing burden of Infocom’s troubled finances since 1985, the final, sharply definitive ending to it all felt like something of a relief as soon as the shock and pain of the initial announcement had faded.

The majority of the old Infocom staff exited the games industry at the same time that they exited Infocom, never to return. The limited or nonexistent applicability of the skills of some of Infocom’s most essential employees to the games being made by other companies — like, for instance, those of editor, producer, and all-around unsung hero Jon Palace — says much about just how unique Infocom really was. For others, though, the decision to get out of games had more to do with their fatigue with such an eternally tormented and tormenting industry than it did with job opportunities or a lack thereof inside it. Put simply, there are easier ways to make a living than by making computer games, and masterful programmers like Tim Anderson, Dave Lebling, and Stu Galley reckoned they were ready for more ordinary jobs. They and many others like them went on to live happy lives, building good, enjoyable careers that needn’t consume them. But there were also some gluttons for punishment who hadn’t yet burnt out on games. Marc Blank, Steve Meretzky, Mike Berlyn, Brian Moriarty, Mike Dornbrook, and Bob Bates would all be stubborn and passionate enough to remain in the industry. We’ll thus be meeting at least some of them again in future articles.

Seen purely as a business proposition, Infocom had been a colossal, unadulterated failure. Whether as independent company or Mediagenic subsidiary, Infocom never enjoyed a single profitable year after 1983, and its final ledger shows it to be millions in the red over the course of its relatively brief lifetime. But very few of those who had worked there thought of Infocom as a failure in the aftermath of its death — not even those former employees whose jobs had entailed fretting about the endless cavalcade of quarterly and yearly losses.

For some former employees, including many who might have had little to no interest in the company’s actual products, Infocom remains forever in their memories as just a really fun office to work in — indeed, the best they could ever imagine. Plenty of these people would be shocked to learn of the aura of awed respect and love that still surrounds the very name of Infocom in the minds of fans today; they never realized they were creating timeless games. Others, of course, including virtually everyone who played a major creative role in making the games, did realize, at least after the fact, that they had done something very special indeed. Some former employees peaceably accept the bad decisions and missed opportunities that so frustrate fans, chalking them up to karma, fate, or just plain old learning experiences. Others, thankfully a minority, still curse the names of either or both of Al Vezza and Bruce Davis, the two great villains of the story, and are intermittently tormented by thoughts of what might have been.

What might have been… it’s a fraught question, isn’t it? Yet it’s a question that we as humans, confronted with something as special and noble as Infocom that seems so self-evidently to have died too soon, can hardly resist asking. The historian in me knows to be very leery of setting off down that road. Still, just this once, coming as we are to the end of the most detailed story I’ve ever told on this blog, maybe we can indulge in a little bit of counter-factualizing.

It seems to me that the first and perhaps most important thing we need to do to come to grips with the might-have-beens that surround Infocom is to separate the company itself from the medium of the text adventure. Such a separation can be weirdly difficult to actually accomplish. Infocom didn’t create the text adventure, nor did the company’s end mark the medium’s end — far from it, as years of articles that are hopefully still to come right here on this site will underline — but the name of Infocom would always remain all but synonymous with the form. Jason Scott has told how, when he was making his Get Lamp documentary about the life and times of the text adventure, he was constantly asked by friends how his “Infocom movie” was coming. At a certain point, he just gave up on correcting them.

Given this close connection, it can be jarring to consider that few to none of the people working at Infocom, even among those who weren’t on Team Cornerstone, thought of their company as an exclusive maker of text adventures. The story of how Infocom first came to make text adventures almost accidentally — that of needing a product to bootstrap their operation, and pulling good old MIT Zork down off the shelf as the fastest way to make one — has of course been well-documented, here and in plenty of other places. But even after they had become identified as makers of the world’s most sophisticated text adventures, they were very reluctant to settle for that niche. A research project into cross-platform graphics was begun already in 1983, at the same time that they were running all those iconic “anti-graphics” advertisements; said advertisements were merely clever promotions, not the expression of an absolute corporate philosophy. In 1984, Mike Berlyn and Marc Blank poured considerable time and effort into another innovative research project that came to naught in the end, a multi-player MUD-like environment to be hosted by the online service CompuServe. The following year brought the multi-player computerized board game Fooblitzky, Infocom’s first graphical product and one of the oddest they ever released. In short, Infocom always had ambitions beyond the text adventure, but those ambitions were consistently crippled by the lack of money for game development that plagued the company beginning as early as 1983, when Cornerstone first began to suck all the oxygen out of the room.

The counter-factual scenario most likely to yield an Infocom that survives beyond the 1980s is, as fan wisdom has long attested, one in which they never start down the Cornerstone wormhole. Yet the same best-case scenario is also possessed of a trait that fans may be less eager to acknowledge: in it, the money not spent on Cornerstone isn’t spent on making ever more elaborate text adventures, but rather on embracing new genres, new paradigms of play. Infocom could quite likely have survived if they’d avoided Cornerstone and made smart business decisions, and the world of gaming would doubtless have been a better place for their tradition of literacy, thoughtfulness, and innovation. But unfortunately, those same smart business decisions would likely have had to entail branching out from the text adventure early, and eventually moving on from it completely. Dave Lebling:

I think in terms of continuing to produce the kind of thing we had been producing — i.e., text adventures with lots of cool technology to make them more realistic, lots of plot value, etc. — we could have gone on forever. I’m less sure whether the market would have continued to buy those. We had big arguments about this even before the Mediagenic/Activision acquisition. If you’ve spent several thousand dollars for a computer with a color screen and a video card and you want to display lots of pretty pictures, are you going to settle for a text adventure?

In my opinion, that was sort of a minority taste, just like reading is somewhat of a minority taste. People would much rather look at pictures than read as a rule. There’s a subculture of people who love to read, who are passionate about reading, passionate about books, but it’s not the majority of the public. The same thing is true in computers. There are people who like pictures and action and so forth, and there are people who like reading. And again, they are a minority.

So, I don’t think Infocom could have continued to go on from strength to strength the way we seemed to have been doing initially; we would have plateaued out. I think we eventually would have had to branch out into other kinds of games ourselves. The advantage would have been that we would have decided what to do, rather than some other company.

For proof of Lebling’s assertions, we need only look to what happened in the broader computer-game industry of our own timeline during the mid- to late-1980s. In 1984, at the height of the bookware frenzy, at least a dozen publishers in the United States alone could lay claim to major initiatives in the realm of text adventures, a medium that, being in most people’s minds the ultimate anti-action game, seemed the perfect fit for post-Great Videogame Crash electronic entertainment. Every single one of those initiatives, excepting only the games Infocom released that year, disappointed to one degree or another. To imagine that a counter-factual Infocom — even one with the resources to improve their technology, to offer even bigger and better games than the ones we know, to include pictures and interface conveniences years before the Infocom of our own timeline — could have continued to buck the trend for very long seems a stretch. And indeed, many of Infocom’s financial travails, which began already in 1985 when a subtle but worrisome sales slowdown on the part of many of their games first became evident alongside the obvious disaster that was Cornerstone, had far more to do with the wider market for text adventures than they did with Cornerstone. Put another way: if their games business had continued to explode as it had in 1983 and 1984, Infocom could have weathered the storm of Cornerstone’s failure bruised but solvent. It was a perfect storm, a combination of their slackening games business and the fiasco that was Cornerstone, that cast them into Mediagenic’s arms in 1986.

So, to understand the reasons for Infocom’s collapse we need to ask why it was that the bookware boom, during which they were the shining example to be emulated by all those other publishers, so comprehensively failed to meet expectations. I think there are two reasons really, involving two D-words I tend to dwell on a lot around here: Demographics and Design.

Simply put, the games industry of the mid- to late-1980s wasn’t populated by enough readers to sustain a vibrant culture of commercial text adventures. The overwhelming computer-game demographic by 1985 was teenage boys, who have never been known as a terribly thoughtful group. The dominance enjoyed by text adventures during the earlier years of the decade owed much to the fact that computer gaming was a much more exclusive hobby during that period, practiced only by those with a restless bent of mind and the financial resources to invest thousands of dollars in an object as ultimately useless as an early microcomputer for the home. Mike Dornbrook and others involved with Infocom near the beginning have often mentioned their wonder at the sheer number of doctors and lawyers on their mailing lists. The demographics of gaming began to change with the arrival of the inexpensive Commodore 64 as a major market force in 1983. Within the next year or two, it remade the entire industry in its image — and most definitely not to the text adventure’s benefit.

At the same time that this demographic shift was underway, Infocom and the various bookware bandwagon jumpers were allowing themselves to become confused about the reasons for the text adventure’s ascendancy even among the relatively cerebral home-computer constituency of the early 1980s. Companies making text adventures in those early days can be divided into two groups: those like Sierra who were working in text because nothing else was practical at the time, and those like Infocom who saw the text adventure as a worthy new ludic and/or literary form unto itself. Sierra got away from text adventures just as soon as they could, and went on to become one of the biggest and most important game publishers of the 1990s. Infocom stuck with the form, and we know what happened to them. There is, I think, a lesson to be found therein. Infocom craved a sort of player who never existed in the numbers they believed, even in the early years, and who came to make up a smaller and smaller percentage of the gaming public as time went by. By 1987, some of Infocom’s experiments were aimed at a computer-game customer who was all but nonexistent: a fan of New Yorker-style verbal wit in the case of Nord and Bert Couldn’t Make Head or Tail of It, or a romance-novel fan in the case of Plundered Hearts.

A tantalizing question must be whether a healthier Infocom could have created a market for such games among non-gaming, possibly non-computer-owning lovers of books and puzzles. Clearly their games did appeal to some people well outside the typical computer-game demographic. Infocom during their halcyon days had enjoyed glowing write-ups in such places as the Boston Globe, The New York Times Book Review, Discover magazine, and even Rolling Stone. Still, the fact remained that their games threw up tremendous barriers to entry, beginning with the sheer cost of the equipment needed to run them and ending with the learning curve for interacting with them. While it’s tempting to imagine a world of interactive fiction existing entirely outside the rest of the games industry with its bash-and-crash take on existence — a world where literary sophisticates pick up a copy of the latest Infocom release from a kiosk in a trendy bookstore — it’s hard to imagine even a healthy Infocom creating such a milieu from scratch. It’s also doubtful, for that matter, whether most of their precious remaining base of customers really wanted to see them moving in that direction. The Infocom games that are most notable for their literary ambition, like A Mind Forever Voyaging and Trinity, were never among their biggest sellers. A substantial percentage of their customer base, as various Imps have wryly noted over the years, would have been quite happy if Infocom had churned out nothing but endless iterations on the original Zork. It was at least as much the Imps’ own creative restlessness as it was the need to serve the market that led them to dabble in so many different literary genres.

But what of those customers who were perfectly content with new iterations of Zork? Where did they disappear to as the years went by? After all, Infocom continued to indulge them with plenty of traditional games right up until the end, and plenty of other companies were equally willing to serve them. I think that it may be when we come to the Zorkian traditionalists that we especially have to consider that other D-word.

If you ask gaming old-timers about text adventures today, most will recall them as creaky, virtually unplayable things riddled with guess-the-verb issues and incomprehensible puzzles. And here’s the thing: such conventional wisdom really isn’t wrong. When I first began to write the history that this blog has become, I hoped I would be able to unearth a lot of hidden text-adventure gems from publishers other than Infocom to share with you. I did find some games that fit that description, but I also found that even the good games from other publishers stand as deviations from the norm of terrible design, sometimes fostered by an unusually dedicated development team, sometimes by the stars just seeming to align in the right way. It seems impossible to imagine that the bad design that was so endemic to the medium throughout the 1980s didn’t play a major role in turning many players away permanently. Infocom’s games were vastly better than those of their competitors, a fact which played a huge role in fostering the company’s small but legendarily loyal group of hardcore fans. Yet even Infocom’s games were hardly guaranteed to be completely free of design issues. Indeed, as Infocom’s personnel pool shrunk and the pressure from Mediagenic to release more games more quickly increased, design issues that they once seemed to have put behind them began to creep back into their games to a rather disconcerting degree. With almost all of the trade-magazine reviewers uninterested in really delving into issues of design, playability, and solubility, players had no real way of knowing which games they could trust and which they couldn’t. The graphic adventures that came to supersede text featured lots of terrible design choices in their own right, but they at least had the virtue of novelty, and that of serving as showcases for the graphics and sound of the latest home computers. (In the longer run, there’s a strong argument to be made that the graphic adventure would wind up shooting itself in the head via poor design by the end of the 1990s exactly as the text adventure had ten years before.)

But rather than unspooling further counter-factual speculations on how it all could have turned out differently, maybe we should ask ourselves another important question that’s less frequently discussed: that of whether an Infocom that survived and continued making text adventures of one sort or another would really have been the best thing for the still burgeoning art of interactive fiction. It’s hard not to remark the sense of creative exhaustion that imbues Infocom’s last gasp, their final four attempts at “graphical interactive fiction.” Much of that is doubtless down to the strain of their ever-worsening relationship with Bruce Davis and Mediagenic, and the long run of commercial disappointments that had prompted that strain. But is that all that was going on? Both Dave Lebling and Marc Blank have spoken of a sense of not really knowing what to do next with interactive fiction after having innovated so relentlessly for so long. Lebling:

I think the space of what can be done in text adventures has been well-explored by a variety of very creative people (by no means all of whom worked at Infocom). It would take, I fear, a qualitative leap in the development language or environment to expand that space. We never got very good at doing conversation, for example. There’s a long way to go before realistic conversations exist in games. We were okay but not spectacular at giving people more than one way to solve a problem. You need a more advanced input method to solve that one. People are just not that interested in typing to the game to simulate physical actions. A virtual-reality suit would solve that but they’re a long way off.

No one has yet solved the primary problem of adventure games, which is, what happens when the player doesn’t do what you expected? Once progress is made on that one, it might be fun to write an adventure game again.

And Blank:

To me, the problem was where it could go, whether we had reached some kind of practical limit in terms of writing a story that way. People used to always ask whether you could have a more powerful parser. Could you have a parser that understood different kinds of sentences? Questions, statements to other characters like “I’m hungry.” Better interaction than the very stilted kind of thing we did in the mysteries, or in Suspended where you could only say things like “go to this room” — where you’re basically just adding the name of a character and a comma at the beginning of a sentence, but everything else is the same.

The problem is that the more things you want to handle the more cases you have to handle, and it becomes very open-ended. You end up much more with the guess-the-word problem. If all of a sudden you can ask any question, but there are really only three questions that are important to the story, you’re either going to spend all this time coming up with answers that don’t mean anything or you’re going to have a lot of “I don’t know that,” which is frustrating. I always suspected it was a dead end. The nice thing about the command-oriented game is that you can come up with a pretty complete vocabulary and a pretty complete set of responses. As soon as it becomes more open-ended — if I can say, “I’m hungry” or “I like blue rubber balls” — how do you respond to that? It’s like Eliza. You get an answer, but it has nothing to do with what you asked, and at some point you realize it’s a fraud, that there’s no information there. What happens is that the worlds get bigger as you open up the vocabulary, but they get sparser. There’s less real information; it’s mostly noise just there to convince you of the world. I think that’s when it gets boring.

I worried about this a lot because people would always ask about the next step, the next thing we could do. It really wasn’t clear to me. Okay, you can make the writing better, and you can make puzzles that are more interesting. But as far as pushing toward a real interactive story — in a real story, you don’t just give everyone commands, right? — that was an issue. We worked on some of those issues for quite a while before we realized that we just weren’t getting anywhere. It was hard to know where to go with it, what was going to be the interesting part of it. Or were you turning it into a simulation, a world you can wander around in but not much happens? I always kind of hit a wall trying to move forward there.

So we said, okay, there are new [literary] genres. So then we had Amy doing Plundered Hearts, Jeff doing Nord and Bert, etc. We don’t know what the next step is technically, so instead we’re going to just kind of mess with the format. So we’ll do a satire and a pulp romance and a horror story. But there was a real issue of creative burnout. You’ve done all these things. Do you just keep doing them? Where does it go? Where does it lead? By the time Infocom closed down, I think it’s fair to say that it wasn’t obvious. I got the sense that some of the games were just an excuse to try something else: “I don’t know what to do, let’s try this.”

To some extent, Lebling and especially Blank fall victim here to their need, being technologists at heart, to always measure the progress of the medium of the text adventure in technological terms. No one declares the novel to be a dead form because the technology of printed text hasn’t advanced in hundreds of years. As many of my earlier articles attest, I see immense value in many of the literary experiments of Infocom’s later years that Blank is a bit too eager to dismiss.

I see evidence in Lebling and Blank’s comments of two creatively exhausted people rather than a creatively exhausted medium. I suspect that the group of people who made up Infocom, brilliant as they were, had taken the art of interactive fiction just about as far as they were personally able to by 1989. The innovations that would follow — and, contrary to both men’s statements above, they most definitely do exist — would largely come out of a very different culture, one free of the commercial pressures that had begun more and more to hamstring Infocom by the end. A work that is to be sold for $30 or more as a boxed computer game has to meet certain requirements, certain player expectations, that often worked at cross-purposes to the medium’s artistic evolution. Must a game require many hours to play? Must a game have puzzles? Can a game feel like a personal testament? Is an interactive-fiction game necessarily a game at all? To paraphrase that famous old Electronic Arts advertisement, can a work of interactive fiction make you cry? These were questions that Infocom — especially but not exclusively an Infocom under Mediagenic, laser-focused as the latter was on delivering conventional hit games — wasn’t in any position to further explore. The medium’s creative future would have to be left to the amateurs.

If we begin to see Infocom as, rather than a beautiful thing that was strangled far too soon, a beautiful thing that simply ran its course, we might just begin to upend the narrative of tragedy that surrounds the legendary company to this day. Among many fans of text adventures today, there’s still a marked tendency to look back on the heyday of Infocom and the commercial text adventure in general as the pivotal era in the medium’s history, a lost golden age that ended far too soon. That’s understandable on one level. This brief era marks the only period in history when it was realistically conceivable to make a living authoring text adventures, a career that plenty of hardcore fans would rate as their absolute first choice out of all possible careers. We’ve thus seen the tragic version of the medium’s history repeated again and again for far longer than the alleged golden age actually lasted. Ironically, we tend to see it especially in those summations of interactive fiction and its history that try to reach beyond the insular community of present-day enthusiasts to serve as introductions for the uninitiated. Such articles almost always begin with Infocom, proceed to dwell at length on those glory days gone by, then mention the modern community — “but wait, interactive fiction isn’t dead!” — in a way that inevitably smacks of a lingering population of diehards. It seems rather a shabby way to frame the history of a living literary form, doesn’t it? Perhaps we can learn to do better.

In his 2007 PhD thesis on interactive fiction, Jeremy Douglass proposed recasting the commercial era as “an important anomaly, a brief big-business deviation from the otherwise constant association of the IF genre with individual authors each networked into a kind of literary salon culture.” This was what interactive fiction largely was before Infocom, and what it became again after them. Seeing the medium’s history in this way doesn’t mean minimizing the accomplishments of Infocom, whose 35-game canon deserves always to be regarded as the text adventure’s version of The Complete Works of William Shakespeare, the wellspring and constant source of inspiration for everything that followed. It does, however, mean recognizing that, in terms of great games that delight and amuse and tantalize and sometimes move their players, the text adventure was really just getting started even as Infocom died. Because this blog has long since begun to reach readers from well outside the interactive-fiction community from which it first sprang, I’m going to guess that some of you may have little experience with what came after Infocom. It’s for those readers among you especially that I plan to cover what came next with the same care I lavished on Infocom’s history. So, never fear. I plan to spend a lot more time praising the humble text adventure in the time to come, and I’m far from ready to bury it alongside Infocom.

(Sources: As usual with my Infocom articles, much of this one is drawn from the full Get Lamp interview archives which Jason Scott so kindly shared with me. Some of it is also drawn from Jason’s “Infocom Cabinet” of vintage documents. Periodical sources include Computer Gaming World of September 1989; The Boston Globe Magazine of May 22, 1989; Questbusters of July 1989; The Games Machine of October 1989, December 1989, and July 1990. See also Adventure Classic Gaming’s interview with Dave Lebling and Jeremy Douglass’s PhD thesis. And my huge thanks go out to Bob Bates, who granted me an extended interview about his work with Infocom.)


  1. Mediagenic was known as Activision until mid-1988. To avoid confusion, I just stick with the name “Mediagenic” in this article. 

 
