
T Plus 5: Bombs in Space

Trinity

Earth Orbit, on a satellite

The satellite you're riding is about twenty feet long, and shaped like a beer can.

>z
Time passes.

A red flash draws your eyes to the ground below, where the contrail of a missile is climbing into the stratosphere.

>z
Time passes.

The maneuvering thrusters on the satellite fire, turning the nose until it faces the ascending missile.

>z
Time passes.

The satellite erupts in a savage glare that lights up the ground below. A beam of violet radiation flashes downward, obliterating the distant missile. Unfortunately, you have little time to admire this triumph of engineering before the satellite's blast incinerates you.

Trinity aims, in 256 K of text adventure, to chronicle at least fifty years of humanity’s relationship to the atomic bomb, encapsulated into seven vignettes. Two of these, the one dealing with the long-dreaded full-on nuclear war that begins with you on vacation in London’s Kensington Gardens and the one you see above involving a functioning version of Ronald Reagan’s “Star Wars” Strategic Defense Initiative (a proposition that all by itself justifies Trinity’s “Fantasy” genre tag, as we’ll soon see), are actually speculative rather than historical, taking place at some point in the near future. The satirical comic that accompanies the game also reserves space for Reagan and his dream. It’s a bold choice to put Reagan himself in there, undisguised by pseudonymous machinations like A Mind Forever Voyaging’s “Richard Ryder” — even a brave one for a company that was hardly in a position to alienate potential players. Trinity, you see, was released at the absolute zenith of Reagan’s popularity. While the comic and the game it accompanies hardly add up to a scathing sustained indictment à la A Mind Forever Voyaging, they do cast him as yet one more Cold Warrior in a conservative blue suit leading the world further along the garden path to the unthinkable. Today I’d like to look at this “orbiting ‘umbrella’ of high technology” that Trinity postulates — correctly — isn’t really going to help us much at all when the missiles start to fly. Along the way we’ll get a chance to explore some of the underpinnings of the nuclear standoff, and also the way it came to an anticlimactically sudden end, so thankfully at odds with Trinity’s more dramatic predictions of the supposed inevitable.

In November of 1985, while Trinity was in development, Ronald Reagan and the new Soviet General Secretary Mikhail Gorbachev met for the first American/Soviet summit of Reagan’s Presidency. The fact that the summit took place at all was almost entirely down to the efforts of Gorbachev, who quite skillfully made it politically impossible for Reagan not to attend. It marked the first time Reagan had actually talked with his Soviet counterpart face to face in his almost five years as President. The two men, as contemporary press reports would have it, “took the measure of each other” and largely liked what they saw, but came to no agreements. The second summit, held in Reykjavik, Iceland, in October of the following year, came within a hair’s breadth of a major deal that would have started the superpowers down the road to the complete elimination of nuclear armaments and effectively marked the beginning of the end of the Cold War. The only stumbling block was the Strategic Defense Initiative. Gorbachev was adamant that Reagan give it up, or at least limit it to “laboratory testing”; Reagan just as adamantly refused. He repeatedly expressed to both Gorbachev and the press his bafflement at Gorbachev’s intransigence. SDI, he said, was to be a technology of defense, a technology for peace. His favorite metaphor was SDI as a nuclear “gas mask.” The major powers of the world had all banned poison gas by treaty after World War I, and, rather extraordinarily, even kept to that bargain through all the other horrors of World War II. Still, no one had thrown away their gas-mask stockpiles, and the knowledge that other countries still possessed them had just possibly helped to keep everyone honest. SDI, Reagan said, could serve the same purpose in the realm of nuclear weapons. He even made an extraordinary offer: the United States would be willing to give SDI to the Soviets “at cost” — whatever that meant — as soon as it was ready, as long as the Soviets would also be willing to share any fruits of their own (largely nonexistent) research. That way everyone could have nuclear gas masks! How could anyone who genuinely hoped and planned not to use nuclear weapons anyway possibly object to terms like that?

Gorbachev had a different view of the matter. He saw SDI as an inherently destabilizing force that would effectively jettison not one but two of the tacit agreements of the Cold War that had so far prevented a nuclear apocalypse. Would any responsible leader easily accept such an engine of chaos in return for a vague promise to “share” the technology? Would Reagan? It’s very difficult to know what was behind Reagan’s seeming naivete. Certainly his advisers knew that his folksy analogies hardly began to address Gorbachev’s very real and very reasonable concerns. If the shoe had been on the other foot, they would have had the same reaction. Secretary of Defense Caspar Weinberger had demonstrated that in December of 1983, when he had said, “I can’t imagine a more destabilizing factor for the world than if the Soviets should acquire a thoroughly reliable defense against these missiles before we did.” As for Reagan himself, who knows? Your opinion on the matter depends on how you take this famous but enigmatic man whom conservatives have always found as easy to deify as liberals to demonize. Was he a bold visionary who saved his country from itself, or a Machiavellian schemer who used a genial persona to institute an uglier, more heartless version of America? Or was he just a clueless if good-natured and very, very lucky bumbler? Or was he still the experienced actor, out there hitting his marks and selling the policies of his handlers like he had once shilled for General Electric? Regardless, let’s try to do more justice to Gorbachev’s concerns about SDI than Reagan did at their summits.

It’s kind of amazing that the Cold War never led to weapons in space. It certainly didn’t have to be that way. Histories today note what a shock it was to American pride and confidence when the Soviet Union became the first nation to successfully launch a satellite on October 4, 1957. That’s true enough, but a glance at the newspapers from the time also reveals less abstract fears. Now that the Soviets had satellites, people expected them to weaponize them, to use them to start dropping atomic bombs on their heads from space. One rumor, which amazingly turned out to have a basis in fact, claimed the Soviets planned to nuke the Moon, leading to speculation on what would happen if their missile were to miss the surface, boomerang around the Moon, and come back to Earth — talk about being hoisted by one’s own petard! The United States’s response to the Soviets’ satellite was par for the course during the Cold War: panicked, often ill-considered activity in the name of not falling behind. Initial responsibility for space was given to the military. The Navy and the Air Force, who often seemed to distrust one another more than either did the Soviets, promptly started squabbling over who owned this new seascape or skyscape, which depending on how you looked at it and how you picked your metaphors could reasonably be assumed to belong to either. While the Naval Research Laboratory struggled to get the United States’s first satellite into space, the more ambitious dreamers at the Air Force Special Weapons Center made their own secret plans to nuke the Moon as a show of force and mulled the construction of a manned secret spy base there.

But then, on July 29, 1958, President Eisenhower signed the bill that would transform the tiny National Advisory Committee for Aeronautics into the soon-to-be massive National Aeronautics and Space Administration — NASA. While NASA’s charter duly charged the new agency with making any “discoveries” available for “national defense” and with “the preservation of the role of the United States as a leader in aeronautical and space science and technology,” those goals came only after more high-toned abstractions like “the expansion of human knowledge” and the use of space for “peaceful and scientific purposes.” NASA was something of an early propaganda coup at a time when very little seemed to be going right with astronautics in the United States. The Soviet leadership had little choice but to accept the idea, publicly at least, of space exploration as a fundamentally peaceful endeavor. In 1967 the United States and the Soviet Union became signatories along with many other nations to the Outer Space Treaty that enshrined the peaceful status quo into international law. By way of compensation, the first operational ICBMs had started to come online by the end of the 1950s, giving both superpowers a way of dealing impersonal death from the stratosphere without having to rely on wonky satellites.

This is not to say that the Cold War never made it into space in any form. Far from it. Apollo, that grandest adventure of the twentieth century, would never have happened without the impetus of geopolitics. The Apollo 11 astronauts may have left a message on the Moon saying they had “come in peace for all mankind,” may have even believed it at some level, but that was hardly the whole story. President Kennedy, the architect of it all, had no illusions about the real purpose of his Moon pronouncement. “Everything that we do ought to be really tied into getting onto the Moon ahead of the Russians,” he told NASA Administrator James Webb in 1962. “Otherwise we shouldn’t be spending this kind of money because I’m not that interested in space.” The Moon Race, like war, was diplomacy through other means. As such, the division between military and civilian was not always all that clear. For instance, the first Americans to fly into orbit, like the first Soviets, did so mounted atop repurposed ICBMs.

Indeed, neither the American nor the Soviet military had any interest in leaving space entirely to the civilians. If one of the goals of NASA’s formation had been to eliminate duplications of effort, it didn’t entirely succeed. The Air Force in particular proved very reluctant to give up on their own manned space efforts, developing during the 1960s the X-15 rocket plane that Neil Armstrong among others flew to the edge of space, the cancelled Dyna-Soar space plane, and even a manned space station that also never got off the drawing board. Planners in both the United States and the Soviet Union seemed to treat the 1967 Outer Space Treaty as almost a temporary accord, waiting for the other shoe to drop and for the militarization of space to begin in earnest. I’ve already described in an earlier article how, once the Moon Race was over, NASA was forced to make an unholy alliance with the Air Force to build the space shuttle, whose very flight profile was designed to allow it to avoid space-based weaponry that didn’t yet actually exist.

Yet the most immediate and far-reaching military application of space proved to be reconnaissance satellites. Well before the 1960s were out these orbiting spies had become vital parts of the intelligence apparatus of both the United States and the Soviet Union, as well as vital tools for the detection of ICBM launches by the other side — yet another component of the ever-evolving balance of terror. Still, restrained by treaty, habit, and concern over what it might make the other guys do, neither of the superpowers ever progressed to the logical step of trying to shoot down those satellites that were spying on their countries. If you had told people in 1957 that there would still be effectively no weapons in space almost thirty years later, that there would never have been anything even remotely resembling a battle in space, I think they would have been quite surprised.

But now SDI had come along and, at least in the Soviets’ view, threatened to undermine that tradition. They needed only to take at face value early reports of SDI’s potential implementations, which were all over the American popular media by the time of Reagan’s 1984 reelection campaign, to have ample grounds for concern. One early plan, proposed in apparent earnest by a committee that may have seen The Battle of Britain (or Star Wars) a few too many times, would have the United States and its allies protected by squadrons of orbiting manned fighter planes that would rocket to the rescue to shoot down encroaching ICBMs, their daring pilots presumably wearing dashing scarves and using phrases like “Tally ho!” A more grounded plan, relatively speaking, was the one for hundreds of “orbiting battle stations” equipped with particle-beam weapons or missiles of their own — hey, whatever works — to pick off the ICBMs. Of course, as soon as these gadgets came into being the Soviets would have to develop gadgets of their own to try to take them out. Thus a precious accord would be shattered forever. To the Soviets, SDI felt like a betrayal, a breaking of a sacred trust that had so far kept people and satellites in space from having to shoot at each other and in doing so had just possibly prevented the development of a new generation of horrific weaponry.

And yet this was if anything the more modest of the two outrages they saw being inflicted on the world by SDI. The biggest problem was that it could be both a symptom and a cause of the ending of the old MAD doctrine — Mutually Assured Destruction — that had been the guiding principle of both superpowers for over twenty years and that had prevented them from blowing one another up along with the rest of the world. On its surface, the MAD formulation is simplicity itself. I have enough nuclear weapons to destroy your country — or at least to do unacceptable damage to it — and a window of time to launch them at you between the time I realize that you’ve launched yours at me and the time that yours actually hit me. Further, neither of us has the ability to stop the missiles of the other — at least, not enough of them. Therefore we’d best find some way to get along and not shoot missiles at each other. One comparison, so favored by Reagan that he drove Gorbachev crazy by using it over and over again at each of their summits, is that of two movie mobsters with cocked and loaded pistols pointed at each other’s heads.

That well-intentioned comparison is also a rather facile one, for the difference in scale is everything: the mobsters’ pistols threaten two lives, while the superpowers’ arsenals threatened hundreds of millions. Many of us had MAD, that most fundamental doctrine of the Cold War, ingrained in us as schoolchildren to such a degree that it might be hard for us to really think about its horribleness anymore. Nevertheless, I’d like for us to try to do so now. Let’s think in particular about its basic psychological prerequisites. In order for the threat of nuclear annihilation to be an effective deterrent, in order for it never to be carried out, it must paradoxically be a real threat, one which absolutely, unquestionably would be carried out if the order was given. If the other side was ever to suspect that we were not willing to destroy them, the deterrent would evaporate. So, we must create an entire military superstructure, a veritable subculture, of many thousands of people all willing to unquestioningly annihilate tens or hundreds of millions of people. Indeed, said annihilation is the entire purpose of their professional existence. They sit in their missile silos or in their ready rooms or cruise the world in their submarines waiting for the order to push that button or turn that key that will quite literally end existence as they know it, insulated from the incalculable suffering that action will cause inside the very same sorts of “clean, carpeted, warmed, and well-lighted offices” that Reagan once described as the domain of the Soviet Union’s totalitarian leadership alone. If the rise of this sort of antiseptic killing is the tragedy of the twentieth century, the doctrine of MAD represents it taken to its well-nigh incomprehensible extreme.

MAD, requiring as it did people to be always ready and able to carry out genocide so that they would not have to carry out genocide, struck a perilous psychological balance. Things had the potential to go sideways when one of these actors in what most people hoped would be Waiting for Godot started to get a little bit too ready and able — in short, when someone started to believe that he could win. See for example General Curtis LeMay, head of the Strategic Air Command from 1948 until 1957, later Air Force Chief of Staff from 1961 until 1965, and the inspiration for Dr. Strangelove’s unhinged General Jack D. Ripper. LeMay believed to his dying day that the United States had “lost” the Cuban Missile Crisis because President Kennedy had squandered his chance to finally just attack the Soviet Union and be done with it; talked of the killing of 100 million human beings as a worthwhile trade-off for the decapitation of the Soviet leadership; openly campaigned for and sought ways to covertly acquire the metaphorical keys to the nuclear arsenal, to be used solely at his own dubious discretion. “If I see that the Russians are amassing their planes for an attack, I’m going to knock the shit out of them before they take off the ground,” he once told a civil-defense committee. When told that such an action would represent insubordination to the point of treason, he replied, “I don’t care. It’s my policy. That’s what I’m going to do.” Tellingly, Dr. Strangelove itself was originally envisioned as a realistic thriller. The film descended into black comedy only when Stanley Kubrick started his research and discovered that so much of the reality was, well, blackly comic. Much in Dr. Strangelove that moviegoers of 1964 took as satire was in fact plain truth.

If the belief by a single individual that a nuclear war can be won is dangerous, an institutionalized version of that belief might just be the most dangerous thing in the world. And here we get to the heart of the Soviets’ almost visceral aversion to SDI, for it seemed to them and many others a product of just such a belief.

During the mid-1970s, when détente was still the watchword of the day, a group of Washington old-timers and newly arrived whiz kids formed something with the Orwellian name of The Committee on the Present Danger. Its leading light was one Paul Nitze. Though his is a name few Americans then or now are likely to recognize, Nitze had been a Washington insider since the 1940s and would remain a leading voice in Cold War policy for literally the entire duration of the Cold War. He and his colleagues, many of them part of a new generation of so-called “neoconservative” ideologues, claimed that détente was a sham, that “the Soviets do not agree with the Americans that nuclear war is unthinkable and unwinnable and that the only objective of strategic doctrine must be mutual deterrence.” On the contrary, they were preparing for “war-fighting, war-surviving, and war-winning.” Their means for accomplishing the latter two objectives would be an elaborate civil-defense program that was supposedly so effective as to reduce their casualties in an all-out nuclear exchange to about 10 percent of what the United States could expect. The Committee offered little or no proof for these assertions and many others like them. Many simply assumed that the well-connected Nitze must have access to secret intelligence sources which he couldn’t name. If so, they were secret indeed. When the CIA, alarmed by claims of Soviet preparedness in the Committee’s reports that were completely new to them, instituted a two-year investigation to get to the bottom of it all, they couldn’t find any evidence whatsoever of any unusual civil-defense programs, much less any secret plans to start and win a nuclear war. It appears that Nitze and his colleagues exaggerated wildly and, when even that wouldn’t serve their ends, just made stuff up. (This pattern of “fixing the intelligence” would remain with Committee veterans for decades, leading most notably to the Iraq invasion of 2003.)

Throughout the Carter administration the Committee lobbied anyone who would listen, using the same sort of paranoid circular logic that had led to the nuclear-arms race in the first place. The Soviets, they said, have secretly abandoned the MAD strategy and embarked on a nuclear-war-winning strategy in its place. Therefore we must do likewise. There could be no American counterpart to the magical Soviet civil-defense measures that could somehow protect 90 percent of their population from the blasts of nuclear weapons and the long years of radioactive fall-out that would follow. This was because civil defense was “unattractive” to an “open society” (“unattractiveness” being a strangely weak justification for not doing something in the face of what the Committee claimed was an immediate existential threat, but so be it). One thing the United States could and must do in response was to engage in a huge nuclear- and conventional-arms buildup. That way it could be sure to hammer the Soviets inside their impregnable tunnels — or wherever it was they would all be going — just as hard as possible. But in addition, the United States must come up with a defense of its own.

Although Carter engaged in a major military buildup in his own right, his was nowhere near big enough in the Committee’s eyes. But then came the 1980 election of Ronald Reagan. Reagan took all of the Committee’s positions to heart and, indeed, took most of its most prominent members into his administration. Their new approach to geopolitical strategy was immediately apparent, and immediately destabilizing. Their endless military feints and probes and aggressive rhetoric seemed almost to have the intention of starting a war with the Soviet Union, a war they seemed to welcome whilst being bizarrely dismissive of its potentially world-ending consequences. Their comments read like extracts from Trinity’s satirically gung-ho accompanying comic. “Nuclear war is a destructive thing, but it is still in large part a physics problem,” said one official. “If there are enough shovels to go around, everybody’s going to make it. It’s the dirt that does it,” said another. Asked if he thought that a nuclear war was “winnable,” Caspar Weinberger replied, “We certainly are not planning to be defeated.” And then, in March of that fraught year of 1983 when the administration almost got the nuclear war it seemed to be courting, came Reagan’s SDI speech.

The most important thing to understand about SDI is that it was always a fantasy, a chimera chased by politicians and strategists who dearly wished it was possible. The only actual scientist amongst those who lobbied for it was Edward Teller, well known to the public as the father of the hydrogen bomb. One of the few veterans of the Manhattan Project, which had built the first atomic bomb more than 35 years before, still active in public life when Reagan took office, Teller was a brilliant scientist when he wanted to be, but one whose findings and predictions were often tainted by his strident anti-communism and a passion for nuclear weapons that could leave him sounding as unhinged as General LeMay. Teller seldom saw a problem that couldn’t be solved just by throwing a hydrogen bomb or two at it. His response to Carter’s decision to return the Panama Canal to Panama, for instance, was to recommend quickly digging a new one across some more cooperative Central American country using hydrogen bombs. Now, alone amongst his scientific peers, Teller told the Reagan administration that SDI was possible. He claimed that he could create X-ray beams in space by, naturally, detonating hydrogen bombs just so. These could be aimed at enemy missiles, zapping them out of the sky. The whole system could be researched, built, and put into service within five years. As evidence, he offered some inconclusive preliminary results derived from experimental underground explosions. It was all completely ridiculous; we still don’t know how to create such X-ray beams today, decades on. But it was also exactly the sort of superficially credible scientific endorsement — and from the father of the hydrogen bomb, no less! — that the Reagan administration needed.

Reagan coasted to reelection in 1984 in a campaign that felt more like a victory lap, buoyed by “Morning Again in America,” an energetic economy, and a military buildup that had SDI as one of its key components. The administration lobbied Congress to give the SDI project twice the inflation-adjusted funding that the Manhattan Project had received at the height of World War II. With no obviously promising paths at all to follow, SDI opted for the spaghetti approach, throwing lots and lots of stuff at the wall in the hope that something would stick. Thus it devolved into a whole lot of individual fiefdoms with little accountability and less coordination with one another. Dr. Ashton Carter of Harvard, a former Defense Department analyst with full security clearance tasked with preparing a study of SDI for the Congressional Office of Technology Assessment, concluded that the prospect for any sort of success was “so remote that it should not serve as the basis of public expectations of national policy.” Most of the press, seduced by Reagan’s own euphoria, paid little heed to such voices, instead publishing articles talking about the relative merits of laser and kinetic-energy weapons, battle stations in space, and whether the whole system should be controlled by humans or turned over to a supercomputer mastermind. With every notion as silly and improbable as every other and no direction in the form of a coherent plan from the SDI project itself, everyone could be an expert, everyone could build their own little SDI castle above the stratosphere. When journalists did raise objections, Reagan replied with more folksy homilies about how everyone thought Edison was crazy until he invented the light bulb, appealing to the good old American ingenuity that had got us to the Moon and could make anything possible. The administration’s messaging was framed so as to make objecting to SDI unpatriotic, downright un-American.

And yet even if you thought that American ingenuity would indeed save the day in the end, SDI had a more fundamental problem that made it philosophically as well as scientifically unsound. This most basic objection, cogently outlined at the time by the great astronomer, science popularizer, space advocate, and SDI critic Carl Sagan, was a fairly simple one. Even the most fanciful predictions for SDI must have a capacity ceiling, a limit beyond which the system simply couldn’t shoot down any more missiles. And it would always be vastly cheaper to build a few dozen more missiles than it would be to build and launch and monitor another battle station (or whatever) to deal with them. Not only would SDI not bring an end to nuclear weapons, it was likely to actually accelerate the nuclear-arms race, as the Soviet Union would now feel the need to not only be able to destroy the United States ten times over but be able to destroy the United States ten times over while also comprehensively overwhelming any SDI system in place. Reagan’s public characterization of SDI as a “nuclear umbrella” under which the American public might live safe and secure had no basis in reality. Even if SDI could somehow be made 99 percent effective, a figure that would make it more successful than any other defense in the history of warfare, the 1 percent of the Soviet Union’s immense arsenal that got through would still be enough to devastate many of the country’s cities and kill tens or hundreds of millions. There may have been an argument to make for SDI research aimed at developing, likely decades in the future, a system that could intercept and destroy a few rogue missiles. As a means of protection against a full-on strategic strike, though… forget it. It wasn’t going to happen. Ever. As President Nixon once said, “With 10,000 of these damn things, there is no defense.”
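To make the arithmetic behind Sagan’s objection concrete, here is a minimal back-of-the-envelope sketch in Python. The figures are purely illustrative: the 99-percent effectiveness and the 10,000 warheads come from this paragraph, while the 99.9-percent variation is a hypothetical of my own, chosen only to show the shape of the curve.

# Sagan's objection, worked through with illustrative numbers only.

arsenal = 10_000         # Nixon's "10,000 of these damn things"
effectiveness = 0.99     # a fancifully optimistic interception rate

leakers = arsenal * (1 - effectiveness)
print(f"Warheads reaching their targets: {leakers:,.0f}")   # -> 100

# Even a 99-percent-effective shield lets 100 warheads through: enough
# to devastate every major city in the country.

# Now the cost asymmetry. Suppose the defense somehow improved to 99.9
# percent (a hypothetical figure). How many warheads must the attacker
# build to land the same 100 on target?
required_arsenal = leakers / (1 - 0.999)
print(f"Arsenal needed to overwhelm it: {required_arsenal:,.0f}")  # -> 100,000

# Because another missile is always vastly cheaper than another battle
# station, building those extra missiles is the attacker's rational
# response: the "defense" leaves more warheads in the world, not fewer.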

As with his seeming confusion about Gorbachev’s objections to SDI at their summits, it’s hard to say to what degree Reagan grasped this reality. Was he living a fantasy like so many others in the press and public when he talked of SDI rendering ICBMs “impotent and obsolete”? Whatever the answer to that question, it seems pretty clear that others inside the administration knew perfectly well that SDI couldn’t possibly protect the civilian population as a whole to any adequate degree. SDI was in reality a shell game, not an attempt to do an end-run around the doctrine of mutually assured destruction but an attempt to make sure that mutually assured destruction stayed mutually assured when it came to the United States’s side of the equation. Cold War planners had fretted for decades about a nightmare scenario in which the Soviet Union launched a first strike and the United States, due to sabotage, Soviet stealth technology, or some failure of command and control, failed to detect and respond to it in time by launching its own missiles before they were destroyed in their silos by those of the Soviets. SDI’s immediate strategic purpose was to close this supposed “window of vulnerability.” The system would be given, not the impossible task of protecting the vast nation as a whole, but the merely hugely improbable one of protecting those few areas where the missile silos were concentrated. Asked point-blank under oath whether SDI was meant to protect American populations or American missile silos, Pentagon chief of research and engineering Richard DeLauer gave a telling non-answer: “What we are trying to do is enhance deterrence. If you enhance deterrence and your deterrence is credible and holds, the people are protected.” This is of course just a reiteration of the MAD policy itself, not a separate justification for SDI. MAD just kept getting madder.

The essential absurdity of American plans for SDI seems to have struck Gorbachev by the beginning of 1987. Soviet intelligence had been scrambling for a few years by then, convinced that there had to be some important technological breakthrough behind all of the smoke the Reagan administration was throwing. It seems that at about this point they may have concluded that, no, the whole thing really was as ridiculous as it seemed. At any rate, Gorbachev decided it wasn’t worth perpetuating the Cold War over. He backed away from his demands, offering the United States the opportunity to continue working on SDI if it liked, demanding only a commitment to inform the Soviet Union and officially back out of some relevant treaties (which might very possibly have to include the 1967 Outer Space Treaty that forbade the stationing of nuclear weapons in space) if it decided to actually implement it. Coupled with Gorbachev’s soaring global popularity, it was enough to start getting deals done. Reagan and Gorbachev signed their first substantial agreement, to eliminate between them 2,692 missiles, in December of 1987. More would follow, accompanied by shocking liberalization and reform behind the erstwhile Iron Curtain, culminating in the night of November 9, 1989, when the Berlin Wall, long the tangible symbol of division between East and West, came down. Just like that, almost incomprehensible in its suddenness, the Cold War was over. Trinity stands today as a cogent commentary on that strange shadow conflict, but it proved blessedly less than prescient about the way it would end. Whatever else is still to come, there will be no nuclear war between the United States of America and the Union of Soviet Socialist Republics.

If the end of the Cold War was shockingly unexpected, SDI played out exactly as you might expect. The program was renamed to the more modest Ballistic Missile Defense Organization and scaled back dramatically in 1993, by which time it had cost half again as much as the Manhattan Project — a staggering $30 billion, enough to make it the most expensive research program in history — and accomplished little. The old idea still resurfaces from time to time, but the fervor it once generated is all but forgotten now. SDI, like most of history, is now essentially a footnote.

A more inspiring closing subject is Mikhail Gorbachev. His Nobel Peace Prize notwithstanding, he strikes me as someone who hasn’t quite gotten his due yet from history. There are many reasons that the Cold War came to an end when it did. Prominent among them was the increasingly untenable Soviet economy, battered during the decade by “the Soviet Union’s Vietnam” (Gorbachev’s phrase) in Afghanistan, a global downturn in oil prices, and the sheer creaking inertia of many years of, as the old Soviet saying went, workers pretending to work while the state pretended to pay them for it. Nevertheless, I don’t agree with Marx that history is a compendium of economic forces. Many individuals across Eastern Europe stepped forward to end their countries’ totalitarian regimes — usually peacefully, sometimes violently, occasionally at the cost of their lives. But Gorbachev’s shadow overlays all the rest. Undaunted by the most bellicose Presidential rhetoric in two decades, he used politics, psychology, and logic to convince Reagan to sit down with him and talk, then worked with him to shape a better, safer world. While Reagan talked about ending MAD through his chimerical Star Wars, Gorbachev actually did it, by abandoning his predecessors’ traditional intransigence, rolling up his sleeves, and finding a way to make it work. Later, this was the man who didn’t choose to send in the tanks when the Warsaw Pact started to slip away, making him, as Victor Sebestyen put it, one of very few leaders in the history of the world to elect not to use force to maintain an empire. Finally, and although it certainly was never his intention, he brought the Soviet Union in for a soft landing, keeping the chaos to a minimum and keeping the missiles from flying. Who would have imagined Gorbachev was capable of such vision, such — and I don’t use this word lightly — heroism? Who would have imagined he could weave his way around the hardliners at home and abroad to accomplish what he did? Prior to assuming office in 1985, he was just a smart, capable Party man who knew who buttered his bread, who, as he later admitted, “licked Brezhnev’s ass” alongside his colleagues. And then when he got to the top he looked around, accepted that the system just wasn’t working, and decided to change it. Gorbachev reminds us that the hero is often not the one who picks up a gun but the one who chooses not to.

(In addition to the sources listed in the previous article, Way Out There in the Blue by Frances FitzGerald is the best history I’ve found of SDI and its politics.)


Trinity


During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was considered, to the extent he was known at all, a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the National Association of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against which most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan had taken office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives. Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against humanity,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily lead to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became one of Infocom’s last substantial hits. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine, which allowed twice the content of the normal system (256 K of story file rather than 128 K) at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.

The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually if not inordinately long development cycle for an Infocom game, stretching from Moriarty’s first foray into Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim determination that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, I would guess that a majority of hardcore Infocom fans would likely name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored — when they’re explored at all — in all that thoughtful of a way. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me at an odd disconnect from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything the 18 games not named A Mind Forever Voyaging that preceded it had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all really that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity’s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a wandering passive observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As far as atomic history goes, I find I’ve amassed a library almost as extensive as Trinity‘s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)

 
 


Out of the Frying Pan…

Activision, as embodied by Jim Levy (left), weds Infocom, as embodied by Joel Berez (right).

Activision’s first couple of years as a home-computer publisher were, for all their spirit of innovation and occasional artistic highs, mildly disappointing in commercial terms. While their games of this period were by no means flops, the only outsize hit among them was David Crane’s Ghostbusters. Activision was dogged by their own history; even selling several hundred thousand copies of Ghostbusters could feel anticlimactic when compared with the glory days of 1980 to 1983, when million-sellers were practically routine. And the company dragged along behind it more than psychological vestiges of that history. On the plus side, Jim Levy still had a substantial war chest made up of the profits socked away during those years with which to work. But on the minus side, the organization he ran was still too big, too unwieldy in light of the vastly reduced number of units they were moving these days in this completely different market. Levy was forced to authorize a painful series of almost quarterly layoffs as the big sales explosions stubbornly refused to come and Activision’s balance sheets remained in the red. Then came the departure of Alan Miller and Bob Whitehead to form the lean, mean Accolade, and that company’s galling instant profitability. Activision found themselves cast in the role of the bloated Atari of old, Jim Levy himself in that of the hated Ray Kassar. Nobody liked it one bit.

Levy and his board therefore adopted a new strategy for the second half of 1985: they would use some of that slowly dwindling war chest to acquire a whole stable of smaller developers, who would nevertheless continue to release games on their own imprints to avoid market saturation. The result would be more and more diverse games, separated into lines that would immediately identify for consumers just what type of game each title really was. In short order, Activision scooped up Gamestar, a developer of sports games. They also bought Creative Software, another tiny but stalwart industry survivor. Creative would specialize in home-oriented productivity software; ever since Brøderbund had hit it big with Bank Street Writer and The Print Shop publishers like Activision had been dreaming of duplicating their success. And then along came Infocom.

Joel Berez happened to run into Levy, accidentally or on purpose, during a business trip to Chicago in December of 1985. By this time Infocom’s travails were an open secret in the industry. Levy, by all accounts a genuine fan of Infocom’s games and, as Activision games like Portal attest, a great believer in the concept of interactive literature, immediately made it clear that Activision would be very interested in acquiring Infocom. Levy’s was literally the only offer on the table. After it dawned on them that Infocom alone could not possibly make a success out of Cornerstone, Al Vezza and the other business-oriented members of Infocom’s board had for some time clung to the pipe dream of selling out to a big business publisher like Lotus, WordPerfect, or even Microsoft. But by now it was becoming clear even to them that absolutely no one cared a whit about Cornerstone, that the only value in Infocom was the games and the company’s still-sterling reputation as a game developer. However, those qualities, while by no means negligible, were outweighed in the eyes of most potential purchasers by the mountain of debt under which Infocom now labored, as well as by the worrisome shrinking sales of the pure text adventures released recently both by Infocom and their competitors. These were also very uncertain times for the industry in general, with many companies focused more on simple survival than expansion. Only Levy claimed to be able to sell his board on the idea of an Infocom acquisition. For Infocom, the choice was shaping up to be a stark one indeed: Activision subsidiary or bankruptcy. As Dave Lebling wryly said when asked his opinion on the acquisition, “What is a drowning man’s opinion of a life preserver?”

Levy was as good as his word. He convinced Activision’s board — some, especially in a year or two, might prefer to say “rammed the scheme through” — and on February 19, 1986, the two boards signed an agreement in principle for Activision to acquire Infocom by giving approximately $2.4 million in Activision stock to Infocom’s stockholders and assuming the company’s $6.8 million in debt. This was, for those keeping score, a pittance compared to what Simon & Schuster had been willing to pay barely a year before. But, what with their mountain of debt and flagging sales, Infocom’s new bargaining position wasn’t exactly strong; Simon & Schuster was now unwilling to do any deal at all, having already firmly rejected Vezza and Berez’s desperate, humiliating attempts to reopen the subject. As it was, Infocom considered themselves pretty lucky to get what they did; Levy could have driven a much harder bargain had he wanted to. And so Activision’s lawyers and accountants went to work to finalize things, and a few months later Infocom, Inc., officially ceased to exist. That fateful day was June 13, 1986, almost exactly seven years after a handful of MIT hackers had first gotten together with a vague intention to do “something with microcomputers.” It was also Friday the Thirteenth.

Still, even the most superstitious amongst Infocom’s staff could see little immediate ground for worry. If they had to give up their independence, it was hard to imagine a better guy to answer to than Jim Levy. He just “got” Infocom in a way that Al Vezza, for one, never had. He understood not only what the games were all about but also the company’s culture, and he seemed perfectly happy just to let both continue on as they were. During the due-diligence phase of the acquisition, Levy visited Infocom’s offices for a guided tour conducted, as one of his last official acts at Infocom, by an Al Vezza who visibly wanted nothing more by this time than to put this whole disappointing episode of his life behind him and return to the familiarity of MIT. In the process of duly demonstrating a series of games in progress, he came to Steve Meretzky’s next project, a risqué science-fiction farce (later succinctly described by Infocom’s newsletter as “Hitchhiker’s Guide with sex”) called Leather Goddesses of Phobos. “Of course, that’s not necessarily the final name,” muttered Vezza with embarrassment. “What? I wouldn’t call it anything else!” laughed a delighted Levy, to almost audible sighs of relief from the staffers around him.

Levy not only accepted but joined right in with the sort of cheerful insanity that had always made Vezza so uncomfortable. He cemented Infocom’s loyalty via his handling of the “InfoWedding” staffers threw for him and Joel Berez, who took over once again as Infocom’s top manager following Vezza’s unlamented departure. A description of the blessed nuptials appeared in Infocom’s newsletter.

In a dramatic affirmation of combinatorial spirit, Activision President James H. Levy and Infocom President Joel M. Berez were merged in a moving ceremony presided over by InfoRabbi Stuart W. Galleywitz. Infocommies cheered, participated in responsive readings from Hackers (written by Steven Levy — no relation to Jim), and threw rice at the beaming CEOs.

Berez read a tone poem drawn from the purple prose of several interactive-fiction stories, and Levy responded with a (clean) limerick.

The bride wore a veil made from five yards of nylon net, and carried artificial flowers. Both bride and groom wore looks of bemused surprise.

After a honeymoon at Aragain Falls, the newly merged couple will maintain their separate product-development and marketing facilities in Mountain View, California, and Cambridge, Massachusetts (i.e., we’ll still be the Infocom you know and love).

Queried about graphics in interactive-fiction stories, or better parsers in Little Computer People, the happy couple declined comment, but smiled enigmatically.

Soon after, Levy submitted to a gently mocking “Gruer’s Profile” (a play on a long-running advertising series by Dewar’s whiskey) prepared for the newsletter:

Hobby: Collecting software-development companies.

Latest Book Read: The Ballyhoo hint book.

Latest Accomplishment: Finding foster homes for all the Little Computer People.

Favorite Infocom Game: Cornerstone.

Why I Do What I Do: Alimony.

Quote: “People often mistake me for Bruce Willis.”

Profile: Charismatic. A real motivator. Looks great in a limousine.

His Drink: “Gruer’s Dark,” right out of a canteen. “Its taste blends perfectly with the sense of satisfaction I feel in knowing that I am now the kingpin of interactive fiction.”

Levy seemed to go out of his way to make the Infocom folks feel respected and comfortable within his growing Activision family. He was careful, for instance, never to refer to this acquisition as an acquisition or, God forbid, a buy-out. It was always a “merger” of apparent equals. The recently departed Marc Blank, who kept in close touch with his old colleagues and knew Levy from his time in the industry, calls him today a “great guy.” Brian Moriarty considered him “fairly benign” (mostly harmless?), with a gratifying taste for “quirky, interesting games”: “He seemed like a good match. It looked like we were going to be okay.”

Jim Levy's "Gruer's Profile"

This period immediately after the Activision acquisition would prove to be an Indian summer of sorts between a very difficult period just passed and another very difficult one to come. In some ways the Imps had it better than ever. With Vezza and his business-oriented allies now all gone, Infocom was clearly and exclusively a game-development shop; all of the cognitive dissonance brought on by Cornerstone was at long last in the past. Now everyone could just concentrate on making the best interactive fiction they possibly could, knowing as they did so that any money they made could go back into making still better, richer virtual worlds. Otherwise, things looked largely to be business as usual. The first game Infocom published as an Activision subsidiary was Moriarty’s Trinity, in my opinion simply the best single piece of work they would ever manage, and one which everyone inside Infocom recognized even at the time as one of their more “special” games. As omens go, that seemed as good as they come, certainly more than enough to offset any concerns about that unfortunate choice of Friday the Thirteenth.

Activision’s marketing people did almost immediately offer some suggestions — and largely very sensible ones — to Infocom. Some of these Mike Dornbrook’s marketing people greeted with open arms; they were things they had been lobbying the Imps for, usually without much success, for years now. Most notably, Activision strongly recommended that Infocom take a hard look at a back catalog that featured some of the most beloved classics in the young culture of computer gaming and think about how to utilize the goodwill and nostalgia they engendered. The Zork brand in particular, still by far the most recognizable in Infocom’s arsenal, had been, in defiance of all marketing wisdom, largely ignored since the original trilogy concluded back in 1982. Now Infocom prepared a pair of deluxe limited-edition slip-cased compilations of the Zork and Enchanter trilogies designed not only to give newcomers a convenient point of entry but also, and almost more importantly, to appeal to the collecting instinct that motivated (and still motivates) so many of their fans. Infocom had long since learned that many of their most loyal customers didn’t generally get all that far in the games at all. Some didn’t even play them all that much. Many just liked the idea of them, liked to collect them and see them standing there on the shelf. Put an old game in a snazzy new package and many of them would buy it all over again.

Infocom also got to work at long last — in fact, literally years after they should have in Dornbrook’s view — on a new game to bear the Zork name. While they were at it, they assigned Meretzky to bring back Infocom’s single most beloved character, the cuddly robot Floyd, in the sequel to Planetfall that that game’s finale had promised back in 1983 — just as soon as he was done with Leather Goddesses, that is, a game for which Infocom, in deference to the time-honored maxim that Sex Sells, also had very high hopes.

The Infocom/Activision honeymoon period and the spirit of creative and commercial renewal it engendered would last barely six months. The chummy dialogue between these two offices on opposite coasts would likewise devolve quickly into decrees and surly obedience — or, occasionally, covert or overt defiance. But that’s for a future article. For now we’ll leave Infocom to enjoy their Indian summer of relative contentment, and begin to look at the games that this period produced.

(Largely the usual Infocom sources this time out: Jason Scott’s Get Lamp interviews and Down From the Top of Its Game. The Dave Lebling quote comes from an interview with Adventure Classic Gaming. The anecdote about Vezza and Levy comes from Steve Meretzky’s interview for Game Design, Theory & Practice by Richard Rouse III.

Patreon supporters: this article is a bit shorter than the norm simply because that’s the length that it “wanted” to be. Because of that, I’m making it a freebie. In the future I’ll continue to make articles of less than 2500 words free.)

 

Ballyhoo

‘Tis true my form is something odd,
But blaming me is blaming God;
Could I create myself anew
I would not fail in pleasing you.

— poem by Joseph Merrick, “The Elephant Man”


This article does contain some spoilers for Ballyhoo!

Ballyhoo, a low-key mystery written by a new Implementor, was the last game ever released by an independent Infocom. When it appeared in February of 1986, Al Vezza and Joel Berez were desperately trying to reel in their lifeline of last resort, a competitor interested in acquiring this imploding company that had fallen from such a precipitous height in just a year’s time. Having come in like a lion with Zork, Infocom, Inc., would go out like a lamb with Ballyhoo; it would go on to become one of their least remembered and least remarked-upon games. We’ll eventually get to some very good reasons for Ballyhoo to be regarded as one of the lesser entries in the Infocom canon. Still, it’s also deserving of more critical consideration than it’s generally received for its unique tone and texture and, most importantly, for a very significant formal innovation. In fact, if we discount as relative trivialities some small-scale tinkering with abbreviations and the like, and as evolutionary dead ends the blizzard of largely unsuccessful experiments that would mark Infocom’s final years, said innovation would be the last to significantly impact the art of the text adventure as it evolved after the commercial glory years of the 1980s.

If Ballyhoo is one of Infocom’s more forgotten games, its creator, Jeff O’Neill, is certainly the Forgotten Implementor. His perspective is conspicuously absent from virtually every history written of the company in the last quarter century. Most notably, he was the one Imp who declined to be interviewed for Jason Scott’s Get Lamp project. For reasons that we won’t dwell on here, O’Neill remains deeply embittered by his time with Infocom. Incredible as this may sound to those of us today who persist in viewing the company’s brief life as a sort of Camelot, that time in his own life is one that O’Neill would rather forget, as I learned to my disappointment when I reached out to him before writing this article. He has a right to his silence and his privacy, so we’ll leave it at that and confine ourselves to the public details.

O’Neill, at the time a frustrated young journalist looking for a career change, was hired by Infocom in the spring of 1984, just one of what would prove to be a major second wave of talent — including among their ranks Jon Palace and Brian Moriarty — who arrived at about the same time. Like Moriarty, O’Neill began in a practical role: he became one of Infocom’s in-house testers. Having proved himself by dint of talent and hard work and the great ideas for new games he kept proposing, within about a year he became the first of a few who would eventually advance out of the testing department to become late-period Imps after Infocom’s hopes for hiring outside writers to craft their games proved largely fruitless.

Whether we attribute it to his degree in Journalism or innate talent, O’Neill had one of the most delicate writerly touches to be found amongst the Imps. Ballyhoo adds a color to Infocom’s emotional palette that we haven’t seen before: world-weary melancholy. The setting is a spectacularly original one for any adventurer tired of dragons and spaceships: an anachronistic, down-at-the-heels circus called “The Traveling Circus That Time Forgot, Inc.” The tears behind a clown’s greasepaint facade, as well as the tawdry desperation that is the flip side of “the show must go on” for performers and performances past their time, have been amply explored in other art forms. Yet such subtle shades of feeling have been only rarely evoked by games before or after Ballyhoo. Ballyhoo, in the words of one of its own more memorable descriptive passages, “exposes the underside of circus life — grungy costumes strung about, crooked and cracked mirrors, the musty odor of fresh makeup mingled with clown sweat infusing the air.” Given what was going on around O’Neill as he wrote the game, it feels hard not to draw parallels with Infocom’s own brief ascendency and abrupt fall from grace: “Your experience of the circus, with its ballyhooed promises of wonderment and its ultimate disappointment, has been to sink your teeth into a candy apple whose fruit is rotten.”

The nihilistic emptiness at the heart of the circus sideshow, the tragedy of these grotesques who parade themselves before the public because there’s no other alternative available to them, has likewise been expressed in art stretching at least as far back as Freaks, a 1932 film directed by Tod Browning that’s still as shocking and transgressive as it is moving today. Another obvious cultural touchstone, which would have been particularly fresh in the mid-1980s thanks to Bernard Pomerance’s 1979 play and David Lynch’s 1980 film, is the story of the so-called “Elephant Man”: Joseph Merrick, a gentle soul afflicted with horrendous deformities who was driven out into the street by his father at age 17 and forced to sell himself to various exploiters as a traveling “human curiosity.” Some say that Merrick died at age 27 in 1890 because he insisted on trying to lie down to sleep — something his enormous, misshapen head would not allow — as part of his fruitless lifelong quest just to “be like other people.”

Ballyhoo‘s own collection of freaks is less extreme but can be almost as heartbreaking. There’s Comrade Thumb, the Russian midget who’s been crammed into a Czarist general’s uniform and sent out to do tricks. Like Merrick, whose deformities made speech almost impossible, Thumb can’t even communicate with his fellow humans; he speaks only Russian (I’ve had just a taste of this sort of linguistic isolation at times since leaving the United States, and know how hard it can be). But saddest of all is the case of Tina, the “827 pounds of feminine charm” who’s become the circus’s token fat woman.

>n
West Half of Fat Lady
Dominating this once spacious room, geographic in her enormity, mountainous in her irreducibility, the fat lady sits (though no chair is visible) breathtakingly to the east. Paths around the attraction lead northeast and southeast. The exit is south.

>examine lady
The fat lady is wearing a big top, and the expression on her face is sad and distant, almost Rushmorean. She appears to be holding a small radio up to her ear.

>ne
It's a long haul, the scenery changing little. Eventually you arrive at ...

East Half of Fat Lady
The fat lady sits (though no chair is visible) breathtakingly to the west. Paths around the attraction lead northwest and southwest.

>give granola bar to tina
Tina is quick to confiscate the one-dollar-and-85-cent granola bar from your hand and grinds it up without hesitation. Turning her far-away gaze slowly in your direction, she seems to notice you for the first time this evening.

>tina, hello
She merely sighs wistfully, which creates a gale-force gust, and cranes her wrecking-ball-sized hand over to you.

>take hand
As you take hold, the fat lady's hand becomes relaxed, its full weight now residing in your arms like a sandbag and making your knees buckle.

>shake hand
Though unable to budge the fat lady's hand, your friendly intentions are nevertheless understood. The fat lady appears quite taken by your kindnesses. She clasps both her hands up to her chins, and stares ahead in teary silence.

If these passages, far from politically correct but by no means heartless, make you a bit uncomfortable, well, I like to think that they were meant to. After all, we’re among the sideshow gawkers. I’ve known people like Tina, cut off by their weight or other issues from a normal life, destined always to be noticed but never to be included. I recognize that wistful sigh, that far-off stare, that above-it-all stance that becomes their only defense. As for people like the circus’s manager Mr. Munrab — read the name backward — who we learn elsewhere “orders the roustabout to increase the frequency of her [Tina’s] feeding” every time she tries to go on a diet…. well, I’d like to think there’s a special circle of Hell for him along with Tom Norman, the man who stuck Joseph Merrick in a cage and set it up for the punters on Whitechapel Road.

I don’t want to give the impression that Ballyhoo is all doom and gloom, and certainly not that it’s entirely one-note in its mood. As Tina’s passages show, the game takes place in a vaguely surreal David Lynch-ian realm that’s tethered to but not quite the same as our own reality. This gives ample room for some flights of fancy that don’t always have to make us feel bad. O’Neill’s love of abstract wordplay, the theme around which his second and final work of interactive fiction would be built, also pops up in Ballyhoo from time to time. When you find yourself with an irresistible craving for something sweet, for instance, it takes the form of a literal monkey on your back who drives you to the concession stand. O’Neill also toys with the parser and the player sitting behind it to a degree not seen in an Infocom game since The Hitchhiker’s Guide to the Galaxy. Here’s what happens when you come upon a “piece of wood” that turns out to be a mousetrap:

>get wood
You have just encountered that brief instant of time between the realization that you've caused yourself excruciating Pain and the actual onslaught of such Pain, during which time most people speak with exclamation points and ... well, say things like ...

>fuck
Easy there! You're jeopardizing our "G" rating.

>darn
Bravisimo! Once more now, with feeling.

>darn
Cut! Cut! Okay, that's a wrap.

There’s even a fake death message, just the sort of faintly cruel player trickery that would have made Adams proud.

Indeed, there’s a little bit of bite, even a faint note of misanthropy, to O’Neill’s writing that’s largely missing from that of the other Imps. Your fellow circus-goers are uniformly boorish and boring. One or two situations, as well as the logical illogic needed to escape from them, smack of Infocom’s later social satire Bureaucracy, to which O’Neill (amongst many others) would make contributions.

>enter short line
You are now standing at the tail end of the short line.

>z
Time passes...

The face of the man ahead of you lights up as he spots something. "Hey, guys! It's ME, Jerry," he yells to a sizable group nearby, and they approach.

>z
Time passes...

"Haven't seen you turkeys in years. Howda hell are you guys?" They all reintroduce themselves. "Hey -- you clowns thirsty? Get in here, I'll buy y'all beer."

"You sure it's not a problem?" asks the catcher.

"Heck no, just scoot in right here."

With both your resolve and your heaving bosom firm against the crush of interlopers, you are nevertheless forced to backpedal.

>z
Time passes...

Jerry continues backslapping the second baseman.

>z
Time passes...

Jerry continues jiving with the center fielder.

>exit long line
You hear an inner voice whisper, "Do I really want to forfeit my position in the long line?" To which you answer:

>yes
You nonchalantly walk away from the long line.

>enter long line
A lot of other people must not have had the same idea as you, as they virtually hemorrhage over to the short line. Steaming to the front of the line, you get a two-dollar-and-25-cent frozen banana pushed at you and are whisked to the side before you can even count your change.

Ballyhoo was Infocom’s fourth game to be given the “Mystery” genre label. As such, it’s also an earnest attempt to solve a real or perceived problem that had long frustrated players of those previous three mysteries. The first of them, Deadline, had exploded the possibilities for adventure games by simulating a dynamic story with independent actors rather than just setting the player loose in a static world full of puzzles to solve; The Witness and Suspect had then continued along the same course. Instead of exploring a geographical space, the player’s primary task became to explore a story space, to learn how this dynamic system worked and to manipulate it to her own ends by judicious, precisely timed interference. While a huge advance that brought a new dimension to the adventure game, this seemingly much more story-oriented approach also remained paradoxically difficult to fully reconcile with the view of Infocom’s games as interactive fiction, as stories you, in the words of their box copy, “woke up inside” and proceeded to experience like the protagonist of a novel. The experience of playing one of these early mysteries was more like that of an editor, or a film director making an adaptation of the novel. You had to take the stories apart piece by piece through probing and experimentation, then put everything back together in a way that would guide the protagonist, from whom you stood at a decided remove, to the optimal ending. That process might offer pleasures all its own, but it kept the player firmly in the realm of puzzle-solver rather than fiction-enjoyer — or, if you like, guiding the fiction became the overarching puzzle. Even Infocom’s most unabashed attempt to create a “literary” work to date, Steve Meretzky’s A Mind Forever Voyaging, became abruptly, jarringly gamelike again when you got to the final section, where you had to head off a sequence of events that would otherwise be the end of you. In a film or novel based on A Mind Forever Voyaging, this sequence would chance to play out in just the right way to let Perry Simm escape by the skin of his teeth and save the world in the process. In the game, however, the player was forced to figure out what dramatically satisfying narrative the author wanted to convey, then manipulate events to bring it to fruition, a very artificial process all the way around. Yet the alternative of a static environment given motion only when the player deigned to push on something was even farther from the idea of “interactive fiction” as a layperson might take that phrase. What to do?
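To make the clockwork model concrete, here is a minimal sketch of how such a timetable-driven plot can work. This is my own illustration in Python, not Infocom’s code (they wrote in ZIL, and every event and turn number below is invented); the point is simply that the story advances on its own schedule, witnessed or not.

# A toy Deadline-style clockwork plot. Every event is keyed to a fixed
# turn number and fires on schedule, player or no player. All names and
# turn numbers here are invented for illustration.

SCHEDULE = {
    20: "The gardener wheels his barrow out to the rose bed.",
    55: "A visitor arrives and is shown into the study.",
    90: "The crucial meeting happens; miss it and the case is lost.",
}

def clockwork_turn(world):
    """Advance one turn and fire whatever the timetable demands."""
    world["turn"] += 1
    event = SCHEDULE.get(world["turn"])
    if event:
        world["log"].append(event)   # in a real game, the player only
                                     # sees this text if she's present;
                                     # the world changes either way

world = {"turn": 0, "log": []}
for _ in range(100):
    clockwork_turn(world)
print(world["log"])   # all three beats fired, witnessed or not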

Infocom’s answer, to which they first fully committed in Ballyhoo, was to flip the process on its head: to make the story respond to the player rather than always asking the player to respond to the story. Put another way, here the story chases the player rather than the player chasing the story. (Feel free to insert your “in Soviet Russia…” jokes here.) Ballyhoo is another dynamic mystery with its own collection of dramatic beats to work through. Now, though, the story moves forward only when and as the player’s actions make it most dramatically satisfying to do so, rather than ticking along according to its own remorseless timetable. So, for example, Comrade Thumb will struggle to get a drink of water from the public water fountain at the beginning of the game for hundreds of turns if necessary, until the player helps him by giving him a boost. He’ll then toddle off to another location to wait for the player to enter. When and only when she does, he’ll carry off his next dramatic beat. Later, a certain bumbling detective will wander onto the midway and pass out dead drunk just when the needs of the plot, as advanced by the player thus far, demand that he do so. Sometimes these developments are driven directly by the player, but at other times they happen only in the name of dramatic efficiency, of story logic. Rather than asking the player to construct a story from a bunch of component parts, now the author deconstructs the story she wants the player to experience, then figures out how to put it back together on the fly in a satisfying way in response to the player’s own actions — but without necessarily making it clear to the player that the story is responding to her rather than unspooling on its own. Ideally, this should let the player just enjoy the unfolding narrative from her perspective inside the story, which will always just happen to play out in suitably dramatic fashion, full of the close calls and crazy coincidences that are part and parcel of story logic. Virtually unremarked at the time, this formal shift would eventually go on to become simply the way that adventure games were done, to the extent that the old Deadline approach stands out as a strange, cruel anomaly when it crops up on rare occasions on the modern landscape.
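The reactive version inverts that loop. Again, what follows is only an illustrative sketch with invented names and flags, loosely paraphrasing the Comrade Thumb and drunken-detective beats described above; it is not Infocom’s actual implementation.

# A toy Ballyhoo-style reactive plot. Beats still fire in dramatic
# order, but each one waits, for as many turns as it takes, on a
# condition only the player can bring about. All names are invented.

BEATS = [
    (lambda w: w["gave_thumb_a_boost"],     "Thumb drinks, then toddles off."),
    (lambda w: w["player_room"] == "wagon", "Thumb plays out his next scene."),
    (lambda w: w["plot_demands_it"],        "The detective passes out drunk."),
]

def advance_story(world):
    """Fire every beat whose moment has come; otherwise, wait patiently."""
    while world["next_beat"] < len(BEATS):
        ready, beat = BEATS[world["next_beat"]]
        if not ready(world):
            return            # the story idles until the player catches up
        world["log"].append(beat)
        world["next_beat"] += 1

world = {"next_beat": 0, "log": [], "gave_thumb_a_boost": True,
         "player_room": "wagon", "plot_demands_it": False}
advance_story(world)          # fires the first two beats, then waits

From inside the fiction, the effect is that the next scene always seems to begin just as the player arrives to see it.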

Depending on how you see these things, you might view this new approach as a major advance or as a disappointment, even as a capitulation of sorts. Early adventure writers, including those at Infocom, were very invested in the idea of their games as simulations of believable (if simplified) worlds. See, for instance, the article which Dave Lebling published in Byte in December of 1980, which, years before Infocom would dub their games “interactive fiction,” repeatedly refers to Zork and the other games like it that Infocom hopes to make as “computerized fantasy simulations.” Or see the flyer found in Zork I itself, which refers to that game as “a self-contained and self-maintaining universe.” To tinker with such a universe, to introduce a hand of God manipulating the levers in the name of drama and affect, felt and still feels wrong to some people. Most, however, have come to accept that pure, uncompromising simulation does not generally lead to a satisfying adventure game. Adventure games may be better viewed as storytelling and puzzle-solving engines — the relative emphasis placed on the former and the latter varying from work to work — wherein simulation elements are useful as long as they add verisimilitude and possibility without adding boredom and frustration, and stop being useful just as soon as the latter qualities begin to outweigh the former.

Which is not to say that this new approach of the story chasing the player is a magic bullet. Virtually everyone who’s played adventure games since Ballyhoo is familiar with the dilemma of a story engine that’s become stuck in place, of going over and over a game’s world looking for that one trigger you missed that will satisfy the game that all is in proper dramatic order and the next act can commence. My own heavily plotted adventure game is certainly not immune to this syndrome, which at its extreme can feel every bit as artificial and mimesis-destroying, and almost as frustrating, as needing to replay a game over and over with knowledge from past lives. Like so much else in life and game design, this sort of reactive storytelling is an imperfect solution, whose biggest virtue is that most people prefer its brand of occasional frustration to that of the alternatives.

And now we’ve come to the point in this article where I need to tell you why, despite pioneering such a major philosophical shift and despite a wonderful setting brought to life by some fine writing, Ballyhoo does indeed deserve its spot amongst the lower tier of Infocom games. The game has some deep-rooted problems that spoil much of what’s so good about it.

The most fundamental issue, one which badly damages Ballyhoo as both a coherent piece of fiction and a playable game, is that of motivation — or rather lack thereof. When the game begins you’re just another vaguely dissatisfied customer exiting the big top along with the rest of the maddening crowd. Getting the plot proper rolling by learning about the mystery itself — proprietor Munrab’s young daughter Chelsea has been kidnapped, possibly by one of his own discontented performers — requires you to sneak into a storage tent for no reason whatsoever. You then eavesdrop on a fortuitous conversation which occurs, thanks to Ballyhoo‘s new dramatic engine, just at the right moment. And so you decide that you are better equipped to solve the case than the uninterested and besotted detective Munrab has hired. But really, what kind of creepy busybody goes to the circus and then starts crawling around in the dark through forbidden areas just for kicks? Ballyhoo makes only the most minimal of attempts to explain such behavior in its opening passage: “The circus is a reminder of your own secret irrational desire to steal the spotlight, to defy death, and to bask in the thunder of applause.” That’s one of the most interesting and potential-fraught passages in the game, but Ballyhoo unfortunately makes no more real effort to explore this psychological theme, leaving the protagonist otherwise a largely blank slate. Especially given that the mystery at the heart of the game is quite low-stakes — the kidnapping is so clearly amateurish that Chelsea is hardly likely to suffer any real harm, while other dastardly revelations like the presence of an underground poker game aren’t exactly Godfather material — you’re left wondering why you’re here at all, why you’re sticking your nose into all this business that has nothing to do with you. In short, why do you care about any of this? Don’t you have anything better to be doing?

A similar aimlessness afflicts the puzzle structure. Ballyhoo never does muster that satisfying feeling of really building toward the solution of its central mystery. Instead, it just offers a bunch of situations that are clearly puzzles to be solved, but never gives you a clue why you should be solving them. For instance, you come upon a couple of lions in a locked cage which otherwise contains nothing other than a lion stand used in the lion trainer’s act. You soon find a key to the cage and a bullwhip. You have no use for the lion stand right now, nor for the lions themselves, nor for their cage. There’s obviously a puzzle to be solved here, but why? Well, if you do so and figure out how to deal with the lions, you’ll discover an important clue under the lion stand. But, with no possible way to know it was there, why on earth would any person risk her neck to enter a lion cage for no reason whatsoever? (Presumably the same kind that would creep into a circus’s supply tent…) Elsewhere you come upon an elephant in a tent. Later you have the opportunity to collect a mouse. You can probably imagine what you need to do, but, again, why? Why are you terrorizing this poor animal in its tiny, empty tent? More specifically, how could you anticipate that the elephant will bolt away in the perfect direction to knock down a certain section of fence? This George Mallory approach to design is everywhere in Ballyhoo. While “because it’s there” has been used plenty of times in justifying adventure-game puzzles both before and after Ballyhoo, Infocom by this time was usually much, much better at embedding puzzles within their games’ fictions.

With such an opaque puzzle structure, Ballyhoo becomes a very tough nut to crack; it’s never clear what problems you should be working on at any given time, nor how solving any given puzzle is likely to help you with the rest. It all just feels… random. And many of the individual solutions are really, really obscure, occasionally with a “read Jeff O’Neill’s mind” quality that pushes them past the boundary of fairness. Making things still more difficult are occasional struggles with the parser of the sort we’re just not used to seeing from Infocom by this stage: you can MOVE that moose head on the wall, but don’t try to TURN it. There’s also at least one significant bug that forced me to restore on my recent playthrough (the turnstile inexplicably stopped recognizing my ticket) and a few scattered typos. Again, these sorts of minor fit-and-finish problems are hardly surprising in general, but are surprising to find in an Infocom game of this vintage.

Assuming we give some of Hitchhiker’s dodgier elements a pass in the name of letting Douglas Adams be Douglas Adams, we have to go all the way back to those early days of Zork and Deadline to find an Infocom game with as many basic problems as this one. Ballyhoo isn’t, mind you, a complete reversion to the bad old days of 1982. Even leaving aside its bold new approach to plotting, much in Ballyhoo shows a very progressive sensibility. On at least one occasion when you’re on the verge of locking yourself out of victory, the game steers you to safety, saying that “the image of a burning bridge suddenly pops into your mind.” Yet on others it seems to positively delight in screwing you over. My theory, which is only that, is that Ballyhoo was adversely affected by the chaos inside Infocom as it neared release, that it didn’t get the full benefit of a usually exhaustive testing regime that normally rooted out not only bugs and implementation problems but also exactly the sorts of design issues that I’ve just pointed out. Thankfully, Ballyhoo would prove to be an anomaly; the games that succeeded it would once again evince the level of polish we’ve come to expect. Given that Ballyhoo was also the product of a first-time author, its failings are perhaps the result of a perfect storm of inexperience combined with distraction.

Ballyhoo was not, as you’ve probably guessed, a big seller, failing to break 30,000 copies in lifetime sales. It’s a paradoxical little game that I kind of love on one level but can’t really recommend on another. Certainly there’s much about it to which I really respond. Whether because I’m a melancholy soul at heart or because I just like to play at being one from time to time, I’m a sucker for its sort of ramshackle splendid decay. I’m such a sucker for it, in fact, that I dearly want Ballyhoo to be better than it is, to actually be the sad and beautiful work of interactive fiction that I sense it wants to be. I’ve occasionally overpraised it in the past for just that reason. But we also have to consider how well Ballyhoo works as an adventure game, and in that sense it’s a fairly broken creation. I won’t suggest that you tax yourself too much trying to actually solve it by yourself, but it’s well worth a bit of wandering around just to soak up its delicious melancholy.

 

Simon & Schuster’s Treks to Nowhere


In 1983 the powers that were at Gulf and Western Industries, owners of both Paramount Pictures and Simon & Schuster, decided that it was time to bring Star Trek, a property of the former, to the computer under the stewardship of the latter. To appreciate this decision and everything that would follow it, we first should step back and briefly look at what Star Trek already meant to gamers at that time.

In late 1971, just as Star Trek was enjoying the first rush of a syndicated popularity that would soon far exceed that of its years as a first-run show, Mike Mayfield was a high-school senior with a passion for computers living near Irvine, California. He’d managed to finagle access to the University of California at Irvine’s Sigma 7 minicomputer, where he occasionally had a chance to play a port of MIT’s Spacewar!, generally acknowledged as the world’s first full-fledged videogame, on one of the university’s precious few graphical terminals. Mayfield wanted to write a space warfare game of his own, but he had no chance of securing the regular graphical-terminal access he’d need to do something along the lines of Spacewar! So he decided to try something more strategic and cerebral, something that could be displayed on a text-oriented terminal. If Spacewar! foreshadowed the frenetic dogfighting action of Star Wars many years before that movie existed, his own turn-based game would be modeled on the more stately space combat of his favorite television show. With the blissful unawareness of copyright and intellectual property that marks this early era of gaming, he simply called his game Star Trek.

One of the many variants of Mike Mayfield’s classic Star Trek.

A full-on Klingon invasion is underway, the Enterprise the only Federation ship capable of stopping it. You, in the role of Captain Kirk, must try to turn back the invasion by warping from sector to sector and blowing away Klingon ships. Resource management is key. Virtually everything you do — moving within or between sectors; shooting phasers or photon torpedoes; absorbing enemy fire with your shields — consumes energy, of which you have only a limited quantity. You can repair, refuel, and restock your torpedoes at any of a number of friendly starbases scattered about the sectors, but doing so consumes precious time, of which you also have a limited quantity. If you don’t destroy all of the Klingons within thirty days they’ll break out and overrun the galaxy.
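The elegance of Mayfield’s design is easiest to see as bookkeeping. The toy Python model below is mine, not his (the original was a BASIC program, and its actual numbers varied from variant to variant); it shows only the central trade-off of energy against time.

# A toy model of the Star Trek resource squeeze. All numbers are
# invented; Mayfield's BASIC originals differed and varied by version.

state = {"energy": 3000, "torpedoes": 10, "days_left": 30, "klingons": 17}

def warp(sectors):
    """Movement costs energy and, crucially, the scarcer resource: time."""
    state["energy"] -= 100 * sectors
    state["days_left"] -= sectors

def fire_phasers(power):
    """Phasers drain energy; weak shots are simply wasted."""
    state["energy"] -= power
    if power >= 300:              # pretend 300 units destroys a Klingon
        state["klingons"] -= 1

def dock():
    """Starbases refill everything, at the price of a precious day."""
    state.update(energy=3000, torpedoes=10)
    state["days_left"] -= 1

# Win by driving klingons to 0 before days_left gets there; every
# detour to refuel trades one countdown against the other.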

Within a year of starting on the game Mayfield moved on from the Sigma 7 to a much slicker HP-2100 series machine to which he had managed to convince the folks at his local Hewlett-Packard branch to give him access. He quickly ported Star Trek to HP Time-Shared BASIC, in which form, along with so many other historically important games, it spread across the country. It was discovered by David Ahl, who would soon go on to found the immensely important magazine Creative Computing. Ahl published an expanded version of the game, Super Star Trek, in 1974 as a type-in listing in one of Creative Computing‘s earliest issues. In 1977, Byte published another version, one of the few game listings ever to appear in the pages of that normally staunchly tech-oriented magazine. In 1978, Ahl republished his Super Star Trek in his book BASIC Computer Games. This collection of old standards largely drawn from the HP Time-Shared BASIC computing culture arrived at a propitious time, just as the first wave of pre-assembled PCs were appearing in stores and catalogs. Super Star Trek was the standout entry in BASIC Computer Games, by far the longest program listing as well as the most complex, replayable, and interesting game to be found within its pages.

On the strength of this, the first million-selling computer book in history, Star Trek spread even more widely and wildly across the little machines than it had the big ones. From here the history of Star Trek the computer game gets truly bewildering, with hundreds of variants on Mayfield’s basic template running on dozens of systems. Some added to the invading Klingon hordes Star Trek‘s other all-purpose villains the Romulans, complete with their trademark cloaking devices; some added graphics and/or sound; some added the personalities of Spock, McCoy, Scott, and the rest reporting developments in-character. And the variations continually one-upped one another with ever more elaborate weapon and damage modeling. In 1983 a small company who called themselves Cygnus (later renamed Interstel) reworked and expanded the concept into a commercial game called Star Fleet I: The War Begins! to considerable success. In this version the serial numbers were to some extent filed off for obvious reasons, but Cygnus didn’t really make the most concerted of efforts to hide their game’s origins. Klingons, for instance, simply became “Krellans,” while their more creatively named allies the “Zaldrons” have, you guessed it, cloaking devices.

This, then, was the situation when Simon & Schuster secured a mandate in 1983 to get into home computers, and to bring Star Trek along for the ride. Star Trek in general was now a hugely revitalized property in comparison to the bunch of orphaned old syndicated reruns that Mayfield had known back in 1971. There were now two successful films to the franchise’s credit and a third well into production, as well as a new line of paperback novels on Simon & Schuster’s own Pocket Books imprint regularly cracking bestseller lists. There was a popular new stand-up arcade game, Star Trek: Strategic Operations Simulator. And there was a successful tactical board game of spaceship combat, and an even more successful full-fledged tabletop RPG. There was even a space shuttle — albeit one which would never actually fly into space — sporting the name Enterprise. And of course Star Trek was all over computers in the form of ports of Strategic Operations Simulator as well as, and more importantly, Mike Mayfield’s unlicensed namesake game and its many variants, of which Paramount was actually quite remarkably tolerant. To my knowledge no one was ever sued over one of these games, and when David Ahl had asked for permission to include Super Star Trek in BASIC Computer Games Paramount had cheerfully agreed without asking for anything other than some legal fine print at the bottom of the page. Still, it’s not hard to understand why Paramount felt it was time for an official born-on-a-home-computer Star Trek game. Even leaving aside the obvious financial incentives, both Strategic Operations Simulator and Mayfield’s Star Trek and all of its successors were in a sense very un-Star Trek sorts of Star Trek games. They offered no exploring of strange new worlds, no seeking out of new life and new civilizations. No, these were straight-up war games, exactly the scenario that Star Trek‘s television writers had had to be careful not to let the series devolve into. Those writers had often discussed the fact that if any of the Enterprise‘s occasional run-ins with the Romulans or the Klingons ever resulted in open, generalized hostilities, Star Trek as a whole would have to become a very different sort of show, a tale of war in space rather than a five-year mission of peaceful (for the most part) exploration. Star Trek the television show would have had to become, in other words, like Star Trek the computer game.

But now, at last, Simon & Schuster had the mandate to move in the other direction, to create a game more consonant with what the show had been. In that spirit they secured the services of Diane Duane, an up-and-coming science-fiction and fantasy writer who had two Star Trek novels already in Pocket’s publication pipeline, to write a script for a Star Trek adventure game. Duane began making notes for an idea that riffed on the supposedly no-win Kobayashi Maru training scenario that had been memorably introduced at the beginning of the movie Star Trek II. The game’s fiction would have you participating in an alternative, hopefully more winnable test being considered as a replacement. Thus you would literally be playing The Kobayashi Alternative, the goal of which would be to find Mr. Sulu (Star Trek: The Search for Sulu?), now elevated to command of the USS Heinlein, who has disappeared along with his ship in a relatively unexplored sector of the galaxy.

Simon & Schuster’s first choice to implement this idea was the current darling of the industry, Infocom. As we’ve already learned in another article, Simon & Schuster spent a year earnestly trying to buy Infocom outright beginning in late 1983, dangling before their board the chance to work with a list of properties headed by Star Trek. An Infocom-helmed Star Trek adventure, written by Diane Duane, is today tempting ground indeed for dreams and speculation. However, that’s all it would become. Al Vezza and the rest of Infocom’s management stalled and dithered and ultimately rejected the Simon & Schuster bid for fear of losing creative control and, most significantly, because Simon & Schuster was utterly uninterested in Infocom’s aspirations to become a major developer of business software. As Infocom continued to drag their feet, Simon & Schuster made the fateful decision to take more direct control of Duane’s adventure game, publishing it under their own new “Computer Software Division” imprint.


Development of The Kobayashi Alternative was turned over to a new company called Micromosaics, founded by a veteran of the Children’s Television Workshop named Lary Rosenblatt to be a sort of full-service experience architect for the home-computer revolution, developing not only software but also the packaging, the manuals, and sometimes even the advertising that accompanied it; their staff included at least as many graphic designers as programmers. The packaging they came up with for The Kobayashi Alternative was indeed a stand-out even in this era of oft-grandiose packaging. Its centerpiece was a glossy full-color faux-Star Fleet briefing manual full of background information about the Enterprise and its crew and enough original art to set any Trekkie’s heart aflutter (one of these pictures, the first in this article, I cheerfully stole out of, er, a selfless conviction that it deserves to be seen). Sadly, the packaging also promised light years more than the actual contents of the disk delivered.

This alleged screenshot from the back of The Kobayashi Alternative‘s box is one of the most blatant instances of false advertising of 1980s gaming.

What the game actually looks like…

Whatever else you can say about it, you can’t say that The Kobayashi Alternative played it safe. Easily dismissed at a glance as just another text adventure, it’s actually a bizarrely original mutant creation, not quite like any other game I’ve ever seen. Everything that you as Captain Kirk can actually do yourself — “give,” “take,” “use,” “shoot,” etc. — you accomplish not through the parser but by tapping function-key combinations. You move about the Enterprise or planetside using the arrow keys. The parser, meanwhile, is literally your mouth; those things you type are things that you say aloud. This being Star Trek and you being Captain Kirk, that generally means orders that you issue to the rest of your familiar crew. And then, not satisfied with giving you just an adventure game with a very odd interface, Micromosaics also tried to build in a full simulation of the Enterprise for you to logistically manage and command in combat. Oh, and the whole thing is running in real time. If ever a game justified use of the “reach exceeded its grasp” reviewer’s cliché, it’s this one. The Kobayashi Alternative is unplayable. No one at Micromosaics had any real practical experience making computer games, and it shows.
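To give a sense of just how odd that control scheme was, here is a rough, entirely hypothetical reconstruction of its input loop in Python. The key bindings and function names are all invented for illustration; only the division of labor, with function keys for actions, arrow keys for movement, and typed text treated as speech, comes from the game itself.

# A hypothetical reconstruction of The Kobayashi Alternative's split
# interface: direct actions on function keys, movement on arrow keys,
# and the parser reserved for things Kirk says out loud. Every name
# below is invented for illustration.

ACTION_KEYS = {"F1": "take", "F2": "give", "F3": "use", "F4": "shoot"}
MOVE_KEYS = {"UP": (0, -1), "DOWN": (0, 1), "LEFT": (-1, 0), "RIGHT": (1, 0)}

def perform(verb, world):
    """Stub: would resolve a direct action against the simulated world."""
    print(f"Kirk tries to {verb} something.")

def speak(text, world):
    """Stub: would route the utterance to whichever crew member is listening."""
    print(f'Kirk says, "{text}"')

def handle_input(key_or_text, world):
    if key_or_text in ACTION_KEYS:
        perform(ACTION_KEYS[key_or_text], world)      # Kirk acts directly
    elif key_or_text in MOVE_KEYS:
        dx, dy = MOVE_KEYS[key_or_text]
        x, y = world["kirk_pos"]
        world["kirk_pos"] = (x + dx, y + dy)          # walk the coordinate grid
    else:
        speak(key_or_text, world)   # anything typed is dialogue, not action

world = {"kirk_pos": (0, 0)}
handle_input("F4", world)                   # tap a key to shoot
handle_input("Scotty, warp three.", world)  # type to give an order

Layer a real-time ship simulation on top of this, as the actual game did, and it is not hard to see why players found it so hard to operate.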

A strange new world with a notable lack of new life and new civilizations

The Kobayashi Alternative is yet another contender for the title of emptiest adventure game ever. In fact, it takes that crown handily from the likes of Level 9’s Snowball and Electronic Arts’s Amnesia. In lieu of discrete, unique locations, each of the ten planets you can beam down to consists of a vast X-Y grid of numerical coordinates to dully trudge across looking for the two or three places that actually contain something of interest. Sometimes you get clues in the form of coordinates to visit, but at other times the game seems to expect you to just lawnmower through hundreds of locations until you find something. The Enterprise, all 23 decks of it, is implemented with a similar lack of detail. It turns out that all those empty, anonymous corridors we were always seeing in the television show really were almost all there was to the ship. When you do find something or somebody, the parser is so limited that you never have any confidence in the conversations that result. Some versions of the game, for instance, don’t even understand the word “Sulu,” making the most natural question to ask anyone you meet — “Where is Sulu?” — a nonstarter. And then there are the bugs. Crewmen — but not you — can beam down to poisonous planets in their shirt sleeves and remain unharmed; when walking east on planets the program fails to warn you about dangerous terrain ahead, meaning you can tumble into a lake of liquid nitrogen without ever being told about it; crewmen inexplicably root themselves to the ground planetside, refusing to follow you no matter how you push or cajole or even start shooting at them with your phaser.

Following The Kobayashi Alternative‘s 1985 release, gamers, downright desperate as they were to play in this beloved universe, proved remarkably patient, while Simon & Schuster also seemed admirably determined to stay the course. Some six months after the initial release they published a revised version that, they claimed, fixed all of the bugs. The other, more deep-rooted design problems they tried to ret-con with a revised manual, which rather passive-aggressively announced that “The Kobayashi Alternative differs in several important ways from other interactive text simulations that you may have used,” including being “completely open-ended.” (Don’t cry to us if this doesn’t play like one of Infocom’s!) The parser problems were neatly sidestepped by printing every single phrase the parser could understand in the manual. And the most obvious major design flaw was similarly addressed by simply printing a list of all the important coordinates on all of the planets in the manual.

Interest in the game remained so high that Computer Gaming World’s Scorpia, one of the premier fan voices in adventure gaming, printed a second multi-page review of the revised version to join her original, a level of commitment I don’t believe she ever showed to any other game. Alas, even after giving it the benefit of every doubt she couldn’t say the second version was any better than the original. It was actually worse: in fixing some bugs, Micromosaics introduced many others, including one that silently stole critical items from your inventory and made the game as unsolvable as the no-win scenario that provided its name. Micromosaics and Simon & Schuster couldn’t seem to get anything right; even some of the planet coordinates printed in the revised manual were wrong, sending you beaming down into the middle of a helium sea. Thus Scorpia’s second review was, like the first, largely a list of deadly bugs and ways to work around them. The whole sad chronicle adds up to the most hideously botched major adventure-game release of the 1980s, a betrayal of consumer trust worthy of a lawsuit. This software thing wasn’t turning out to be quite as easy as Simon & Schuster had thought it would be.

While The Kobayashi Alternative stands today as perhaps the most interesting of Simon & Schuster’s Star Trek games thanks to its soaring ambitions and how comprehensively it fails to achieve any of them, it was far from the last of its line. Understandably disenchanted with Micromosaics but determined to keep plugging away at Star Trek gaming, Simon & Schuster turned to another new company to create their second Star Trek adventure: TRANS Fiction Systems.

The story of TRANS Fiction begins with Ron Martinez, who had previously written a couple of Choose Your Own Adventure-style children’s gamebooks for publisher Byron Preiss and had then written the script for Telarium’s computerized adaptation of Rendezvous with Rama. Uninspiring as the finished result of that project was, it awakened in him a passion to dive deeper and do more with interactive fiction than Telarium’s limited technology would allow. Martinez:

If this was really an art form — like film, for example — you’d really want to know how to create the entire work. In film, you’d want to understand how to work a camera, how to shoot, how to edit, how to really make the finished product. For me, as a writer, I understood that I had to know how to program.

Like just about everybody else, I worshiped the Infocom work, was just amazed by it. So, my goal was to do two things simultaneously:

1. Learn how to program, so that I could —

2. Build an interactive-fiction system that was as good or better than Infocom’s.

Working with a more experienced programmer named Bill Herdle, Martinez did indeed devise his own interactive-fiction development system using a programming language we seem to be meeting an awful lot lately: Forth. Martinez, Herdle, and Jim Gasperini, another writerly alum of Byron Preiss, founded TRANS Fiction to deploy their system. They sincerely believed in interactive fiction as an art form, and were arrogant enough to believe themselves unusually qualified to realize its potential.

We started as writers and then learned the programming. Of other companies, we used to say that the people who built the stage are writing the plays. We used to look down our nose at people who were technical but had no sense of what story was all about, attempting to use this medium which we thought would redefine fiction — we really believed that. Instead of having people who were technical trying to write stories, we thought it really had to come the other way, so the technology is in the service of the story and the characters and the richness of the world.

Thanks to their connections in the world of book publishing and their New York City location, TRANS Fiction was soon able to secure a contract to do the next Simon & Schuster Star Trek game. It wasn’t, perhaps, a dream project for a group of people with their artistic aspirations, but they needed to pay the bills. Thus, instead of things like the interactive version of William Burroughs’s novel Nova Express that Martinez fruitlessly pursued with Electronic Arts, TRANS Fiction did lots of work with less rarefied properties, like the Make Your Own Murder Party generator for EA and, yes, Star Trek.

I can tell you that it wasn’t with great joy that we were working with these properties. There’s something soulless about working with a big property owned by a conglomerate. Even though we might love Spock, it was still a property, and there were brand police, who had to review everything that Spock might say or do. We would extend the world and try to introduce new aspects of the history of these characters, but they’d have to sign off on it.

Given Martinez’s attitude as well as that set of restrictions, it’s not terribly shocking that TRANS Fiction’s first Star Trek game, The Promethean Prophecy, is not all that inspired or inspiring. Nor is its parser or game engine quite “as good as,” much less “better than,” Infocom’s. A much more conventional — perhaps too conventional — text adventure than its crazy predecessor, it owes its status as the most enjoyable of all the Simon & Schuster-era Treks more to the weaknesses of its peers than to any intrinsic strengths of its own.

Star Trek: The Promethean Prophecy. We're back on much more conventional text-adventure territory here...

The Promethean Prophecy doesn’t try to be a starship simulator to anywhere near the same degree as its predecessor. While it does open with a space battle, said battle is largely an exercise in puzzle solving, in figuring out the next command that will drive the hard-wired plot forward and not get you killed, rather than a real tactical simulation. After that sequence, you beam down to Prometheus, the only planet in the game, and start on a fairly standard “figure out this alien culture” puzzle-driven text adventure which, other than having Kirk, Spock, and company as its stars, doesn’t feel notably Star Trek-like at all. What with its linear and heavily plotted opening followed by a non-linear body to be explored at your own pace, it reminds me more than anything of Infocom’s Starcross. This impression even extends to the puzzles themselves, which like those of Starcross often involve exchanging items with and otherwise manipulating the strange aliens you meet. And, again like Starcross, it offers no possibility of having real conversations with them. Unfortunately, coming as it did four years after Starcross, The Promethean Prophecy is neither as notable in the context of history nor quite as clever and memorable on its own terms as a game. From its parser to its writing to its puzzles, it’s best described as “competent” — a description which admittedly puts it head and shoulders above many of Infocom’s competitors and its own predecessor. The best of this era of Star Trek games, it’s also the one that feels the least like Star Trek.

Still, The Promethean Prophecy did have the virtue of being relatively bug-free, a virtue that speaks more to the diligence of TRANS Fiction than to that of Simon & Schuster; as Martinez later put it, “Nobody at Simon & Schuster really understood how we were doing any of it.” It was greeted with cautiously positive reviews and presumably sold a reasonable number of copies on the strength of the Star Trek name alone, but it hardly set the industry on fire. An all-text game of any stripe was becoming quite a hard sell indeed by the time of its late 1986 release.

After The Promethean Prophecy, Simon & Schuster continued to doggedly release new Star Trek games, a motley assortment that ranged from problematic to downright bad. For 1987’s The Rebel Universe, they enlisted the services of our old friend Mike Singleton, who, departing even further from The Promethean Prophecy than that game had from its predecessor, tried to create a grand strategy game, a sort of Lords of Midnight in space. It was full of interesting ideas, but was rushed to release in an incomplete and fatally unbalanced state. For 1988’s First Contact (no relation to the 1996 movie), they — incredibly — went back to Micromosaics, who simplified the old Kobayashi Alternative engine and retrofitted onto it the ability to display the occasional interstitial graphic. Unfortunately, they also overcompensated for the overwhelming universe of their first game by making First Contact far too trivial. The following year’s adaptation of Star Trek V: The Final Frontier, oddly released through Mindscape rather than under Simon & Schuster’s own imprint, and 1990’s The Transinium Challenge, another product of TRANS Fiction and the first game to feature the cast of The Next Generation, were little more than interactive slide shows, most notable for their heavy use of digitized images from the actual shows at a time when seeing real photographs of reasonable fidelity on a computer screen was still a fairly amazing thing.

It was all disappointing enough that by the beginning of the 1990s fans had begun to mumble about a Star Trek gaming curse. And indeed, it’s hard to know what to make of the handling of the franchise during this period. Gifted with easily one of the five most beloved properties on the planet amongst the computer-gaming demographic, Simon & Schuster refused to either turn it over to an experienced software publisher who would know what to do with it — virtually any of them would have paid a hell of a lot of money to have a crack at it — or to get really serious and pay a top-flight developer to create a really top-flight game. Instead they took the pointless middle route, tossing off a stream of rushed efforts from second-tier developers that managed to be unappealing enough to be sales disappointments despite the huge popularity of the name on their boxes, while other games made without a license — notably Starflight — proved much more successful at evoking the sense of wonder that always characterized Star Trek at its best. It wouldn’t be until 1992 that Star Trek would finally come to computers in a satisfying form that actually felt like Star Trek — but that’s a story for another day.

For today, I encourage you to have a look at one or more of the variants of Mike Mayfield’s original Star Trek game. There are a number of very good versions that you can play right in your browser. One of the first really compelling strategy games to appear on computers and, when taking into account all of its versions and variations, very likely the single most popular game on PCs during that Paleolithic era of about 1978 to 1981, it’s still capable of stealing a few hours of your time today. It’s also, needless to say, far more compelling than any commercial Star Trek game released prior to 1992. Still, completionism demands that I also make available The Kobayashi Alternative and The Promethean Prophecy in their Commodore 64 incarnations for those of you who want to give them a go as well. They aren’t the worst adventures in the world… no, I take that back. The Kobayashi Alternative kind of is. Maybe it’s worth a look for that reason alone.

(The history of Mike Mayfield’s Star Trek has been covered by other modern digital historians much more thoroughly than I have done here. See, for instance, Games of Fame and Pete Turnbull’s page on the game, among many others. Most of my information on Simon & Schuster and TRANS Fiction was drawn from Jason Scott’s interview with Martinez for Get Lamp; thanks again for sharing, Jason! Scorpia’s review and re-review of The Kobayashi Alternative appeared in Computer Gaming World’s March 1986 and August 1986 issues respectively.

For an excellent perspective on how Star Trek‘s writers saw the show as well as the state of Star Trek around the time that Mayfield first wrote his game, see David Gerrold’s The World of Star Trek. Apart from its value as a research source, it’s also a very special book to me, the first real work of criticism that I ever read as a kid. It taught me that you could love something while still acknowledging and even dissecting its flaws. I’m not as enchanted with Star Trek now as I was at the Science Fiction Golden Age of twelve, but Gerrold’s book has stuck with me to become an influence on the work I do here today. I was really happy recently to see it come back into “print” as an e-book.)

 