
T Plus 4: Bombing Nevada



You're in a narrow underground chamber, illuminated by an open door in the east wall. The walls and ceiling are gouged with deep spiral ruts; they look as if they've been routed out with heavy machinery.

A large cylinder occupies most of the chamber. The maze of cables and pipes surrounding it trails west, into the depths of a tunnel.


The cables and pipes lining the tunnel's walls look like bloated veins and arteries in the splinter's flickering glow. Deep tunnels bend off to the east and west.

Some careless technician has left a walkie-talkie lying in the dirt.

>get walkie-talkie

>turn on walkie-talkie
You turn on the rocker switch.

Time passes.

A tinny voice, half-buried in static, says "Two."

Time passes.


Time passes.

The walkie-talkie clicks and hisses.

Time passes.

For a brief moment, the tunnel is bathed in a raw white glare.

The most subtly chilling vista in Trinity is found not inside one of its real-world atomic vignettes, but rather in the magical land that serves as the central hub for your explorations. This landscape is dotted with incongruous giant toadstools, each of which, you eventually realize, represents a single atomic explosion.

As your eyes sweep the landscape, you notice more of the giant toadstools. There must be hundreds of them. Some sprout in clusters, others grow in solitude among the trees. Their numbers increase dramatically as your gaze moves westward, until the forest is choked with pale domes.

The scene is a representation of time, following the path of the sun from east to west. The toadstools choking the forest to the west presumably represent the nuclear apocalypse you’ve just escaped. If we subtract those toadstools along with the two somewhere far off to the east that must represent the Hiroshima and Nagasaki blasts, we’re left with those that represent not instances of atomic bombs used in anger, but rather tests. A few of these we know well as historical landmarks in their own right: the first hydrogen bomb; the first Soviet bomb; that original Trinity blast, off far to the southeast with the rising sun, from which the game takes its name and where its climax will play out. Like the bombs used in anger, these don’t interest us today; we’ll give them their due in future articles. What I do want to talk about today is some of the blasts we don’t usually hear so much about. As the landscape would indicate, there have been lots of them. Since the nuclear era began one summer morning in the New Mexico desert in 1945, there has been a verified total of 2119 tests of nuclear bombs. Almost half of that number is attributed to the United States alone. Yes, there have been a lot of bombs.

At the close of World War II, the big question for planners and politicians in the United States was that of who should be given control of the nation’s burgeoning nuclear arsenal. The Manhattan Project had been conducted under the ostensible auspices of the Army Air Force (the Air Force wouldn’t become its own independent service branch until 1947), but in reality had been something of a law unto itself. Now both Army and Navy were eager to lay claim to the bomb. The latter had dismissed the bomb’s prospects during the war years and declined to play any role in the Manhattan Project, but was nevertheless able to wrangle enough control now to be given responsibility for the first post-war tests of the gadgets, to be called Operation Crossroads. The tests’ announced objective was to determine the impact of the atomic bomb on military ships. Accordingly, the Navy assembled for atomic target practice around Bikini Atoll in the Marshall Islands a fleet made up of surplus American ships and captured German and Japanese vessels that would have been the envy of most other nations. Its 93 vessels included in their ranks 2 aircraft carriers, 5 battleships, and 4 cruisers. The 167 native residents of Bikini were shipped off to another, much less hospitable island, first stop in what would prove to be a long odyssey of misery. (Their sad story is best told in Operation Crossroads by Jonathan M. Weisgall.)

From the start, Operation Crossroads had more to do with politics than with engineering or scientific considerations. It was widely hyped as a “test” to see if the very idea of a fighting navy still had any relevance in this new atomic age. More importantly in the minds of its political planners, it would also be a forceful demonstration to the Soviet Union of just what this awesome new American weapon could do. Operation Crossroads was the hottest ticket in town during the summer of 1946. Politicians, bureaucrats, and journalists — everyone who could finagle an invitation — flocked to Bikini to enjoy the spectacle along with good wine and food aboard one of the Navy’s well-appointed host vessels, swelling the number of on-site personnel to as high as 40,000.

Unprotected sailors aboard the German cruiser Prinz Eugen just hours after it was irradiated by an atomic bomb.

The spectators would get somewhat less than they bargained for, many of the sailors considerably more. The first bomb was dropped from a borrowed Army Air Force B-29 because the Navy had no aircraft capable of carrying the gadget. Dropped on a hazy, humid morning from the high altitude at which the B-29 was notoriously inaccurate even under the best of conditions, the bomb missed the center of the doomed fleet by some 700 yards. Only two uninteresting attack transports sank instantly in anything like the expected spectacular fashion, and only five ships sank in total, the largest of them a cruiser. As the journalists filed their reams of disappointed copy and the Navy’s leadership breathed a sigh of relief, some 5000 often shirtless sailors were dispatched to board the various vessels inside the hot zone to analyze their damage; as a safety precaution, they first scrubbed them down using water, soap, and lye to get rid of any lingering radiation. The operation then proceeded with the second bomb, an underwater blast that proved somewhat more satisfying, ripping apart the big battleship Arkansas and the aircraft carrier Saratoga amongst other vessels and tossing their pieces high into the air.

The second Operation Crossroads shot, July 25, 1946.

Operation Crossroads was emblematic of a Navy leadership that had yet to get their collective heads around just what a paradigm-annihilating device the atomic bomb actually was. Their insistence on dropping it on warships, as if the future was just going to bring more Battles of Midway with somewhat bigger explosions, shows that they still thought of the atomic bomb as essentially just a more powerful version of the bombs they were used to, a fundamentally tactical rather than strategic device. Their complete failure to take seriously the dangers of radioactive fallout, meanwhile, may be the reason that the sailors who took part in Operation Crossroads suffered an average life-span reduction of three months compared to others in their peer group. These were early days yet in atomic physics, but their state of denial is nevertheless difficult to understand. If the horrific photographs and films out of Hiroshima and Nagasaki — some of which apparently are shocking enough to still be classified — hadn’t been warning enough, there was always the case of Los Alamos physicist Louis Slotin. Less than six weeks before Operation Crossroads began, Slotin had accidentally started a chain reaction while experimenting with the atomic core of the same type of bomb used in the tests. He stopped the reaction through quick thinking and bravery, but not before absorbing a lethal dose of radiation. His slow, agonizing death — the second such to be experienced by a Los Alamos physicist — was meticulously filmed and documented, then made available to everyone working with atomic weapons. 
And yet the Navy chortled about the failure of the atomic bomb to do as much damage as expected whilst cheerfully sending in the boys to do some cleanup, ignoring both the slowly dying goats and other animals they had left aboard the various ships and the assessment of the Bulletin of Atomic Scientists of the likely fate of any individual ship in the target fleet: “The crew would be killed by the deadly burst of radiation from the bomb, and only a ghost ship would remain, floating unattended in the vast waters of the ocean.”

Just as President Eisenhower would take space exploration out from under the thumb of the military a decade later with the creation of NASA, President Truman did an end-run around the military’s conventional thinking about the atomic bomb on January 1, 1947, when the new, ostensibly civilian Atomic Energy Commission took over all responsibility for the development, testing, and deployment of the nation’s atomic stockpile. The Atomic Energy Commission would continue to conduct a steady trickle of tests in the remoter reaches of the Pacific for many years to come, albeit none with quite the bizarre spectator-sport qualities of Operation Crossroads. But the twin shocks of the first Soviet test of an atomic bomb on August 29, 1949, and the beginning of the Korean War in 1950, which came equipped with a raging debate about whether, how, and when the United States should again use its nuclear arsenal in anger, led weapons developers to agitate for a more local test site where they could regularly and easily set off smaller weapons than the blockbusters that tended to get earmarked for the Pacific. There were, they argued, plenty of open spaces in the American Southwest that would suit such a purpose perfectly well. On December 18, 1950, Truman therefore approved the allocation for this purpose of a 680-square-mile area inside the vast Nellis Air Force Gunnery and Bombing Range in the Nevada desert some 65 miles northwest of Las Vegas. The first test there, marking the first atomic bomb to be exploded on American soil since the original Trinity device, took place astonishingly soon thereafter, on January 27, 1951. By the end of the year sleeping quarters, mess halls, and laboratories had been built, creating a functioning, happy little community dedicated to making ever better bombs. The saga of the Nevada Test Site had begun. In the end no fewer than 928 of the 1032 nuclear tests ever conducted by the United States would be conducted right here.

One of the many test shots seen from the Las Vegas Strip during the 1950s.

The strangest years of this very strange enterprise were the earliest. With money plentiful and the need to keep ahead of the Soviets perceived as urgent, bombs were exploded at quite a clip — twelve during the first year alone. At first they were mostly dropped from airplanes, later more commonly hung from balloons or mounted atop tall temporary towers. The testing regime was, as test-site geophysicist Wendell Weart would later put it, very “free-form.” If someone at one of the nation’s dueling atomic-weapons laboratories of Lawrence Livermore and Los Alamos determined that he needed a “shot” to prove a point or answer a question, he generally got it in pretty short order. Whatever else the testing accomplished, it was also a hell of a lot of fun. “I guess little boys like fireworks and firecrackers,” Weart admits, “and this was the biggest set of fireworks you could ever hope to see.” Las Vegas residents grew accustomed to the surreal sight of mushroom clouds blooming over their cityscape, like scenes from one of the B-grade atomic-themed monster movies that filled the theaters of the era. When the bombs went off at night, they sometimes made enough light to read a newspaper by.

This era of free-form atmospheric testing at the Nevada Test Site coincided with the era of atomic mania in the United States at large, when nuclear energy of all stripes was considered the key to the future and the square-jawed scientists and engineers who worked on it veritable heroes. The most enduring marker of this era today is also one of the first. In 1946, not one but two French designers introduced risque new women’s bathing suits that were smaller and more revealing than anything that had come before. Jacques Heim called his the “atome,” or atom, “the world’s smallest bathing suit.” Louis Réard named his the bikini after the recently concluded Operation Crossroads tests at Bikini Atoll. “Like the bomb,” he declared, “the bikini is small and devastating.” It was Réard’s chosen name that stuck. In addition to explosive swimwear, by the mid-1950s you could get a “Lone Ranger atomic-bomb ring” by sending in 15 cents plus a Kix cereal proof of purchase; buy a pair of atomic-bomb salt and pepper shakers; buy an “Atomic Disintegrator” cap gun. Trinity‘s accompanying comic book with its breathless “Atomic Facts: Stranger than Fiction!” and its hyperactive patriotism is a dead ringer for those times.

Showgirl Lee Merlin, Miss Atomic Bomb 1958.

Said times being what they were, Las Vegas denizens, far from being disturbed by the bombs going off so close by, embraced them with all of their usual kitschy enthusiasm. The test site helpfully provided an annual calendar of scheduled tests for civilians so they could make plans to come out and enjoy the shows. For children, it was a special treat to drive up to one of the best viewpoints on Mount Charleston early in the morning on the day of a shot, like an even better version of the Fourth of July; the budding connoisseurs cataloged and ranked the shots and compared notes with their friends in the schoolyard. Many adults, being connoisseurs of another stripe, preferred the “Miss Atomic Bomb” beauty pageants and revues that were all the rage along the Strip.

Showgirl Sally McCloskey does an “atomic ballet” in front of a shot.

The official government stance, at the time and to a large extent even today, is that the radioactive fallout from these explosions traveled little distance if at all and was in any case minor enough to present few to no health or environmental concerns. Nevertheless, ranchers whose sheep grazed in the vicinity of the test site saw their flocks begin to sicken and die very soon after the test shots began. They mounted a lawsuit, which was denied under somewhat questionable circumstances in 1956; the sheep, claimed the court, had died of “malnutrition” or some other unidentified sickness. That judgment, almost all of the transcripts from which have since been lost, was later overturned on the rather astonishing basis of outright “fraud on the court” by the government’s defense team. That reversal was in its turn vacated on appeal in 1985, more than thirty years after the events in question. Virtually all questions about the so-called “Downwinders” who were affected — or believe they were affected — by fallout from the test site seem to end up in a similarly frustrating tangle.

What does seem fairly clear amid the bureaucratic babble, from circumstantial evidence if nothing else, is that the government even in the 1950s had more awareness of and concerns about fallout from the site than they owned up to publicly. Radioactive debris from those very first tests in early 1951 was detected, according to test-site meteorologist Philip Wymer Allen, going “up over Utah and across the Midwest and Illinois, not too far south of Chicago, and out across the Atlantic Coast and was still easily measured as the cloud passed north of Bermuda. We didn’t track it any further than that.” Already in 1952 physical chemist Willard Libby, inventor of radiocarbon dating and later a chairman of the Atomic Energy Commission, was expressing concerns about radioactive cesium escaping the site and being absorbed into the bones of people, especially children. A potential result could be leukemia. Another, arguably even graver, concern was radioiodine particles, which could be carried a surprising distance downwind before settling to earth, potentially on the forage preferred by sheep, goats, and cows. Many people in rural communities, especially in those days, drank unprocessed milk straight from the cow, as it were. If enough milk containing radioiodine is ingested, it can lead to thyroid cancer. Children were, once again, both particularly big drinkers of milk and particularly prone to the effects of the radioiodine that might be within it. When environmental chemist Delbert Barth was hired in the 1960s to conduct studies of radioiodine dispersion patterns at the site, he was asked to also make historical projections for the atmospheric shots of the 1950s — a request that, at least on its surface, seems rather odd if everyone truly believed there was absolutely nothing to fear. Similarly odd seems a policy which went into effect very early: not to conduct shots if the winds were blowing toward Las Vegas.

The radioactive exposure — or lack thereof — of the Downwinders remains a major political issue inside Nevada and also Utah, which many claim also received its fair share of fallout. Most people who were associated with the site say, predictably enough, that the Downwinders are at best misguided and at worst would-be freeloaders. Studies have not established a clear causal link between incidences of cancer and proximity to the Nevada Test Site, although many, including Barth, have expressed concerns about the methodologies they’ve employed. What we’re left with, then, are lots of heartbreaking stories which may have been caused by the activities at the site or may represent the simple hand of fate. (For a particularly sad story, which I won’t go into here because I don’t want to sound exploitative, see this interview with Zenna Mae and Eugene Bridges.)

The first era of the Nevada Test Site came to an abrupt end in November of 1958, when the United States and the Soviet Union entered into a non-binding mutual moratorium on all forms of nuclear testing. For almost three years, the bombs fell silent at the test site and at its Soviet equivalent near Semipalatinsk in Kazakhstan. But then, on September 1, 1961, the Soviets suddenly started testing again, prompting the Nevada Test Site to go back into action as well. Still, the public was growing increasingly concerned over what was starting to look like the reckless practice of atmospheric testing. While Las Vegas had continued to party hearty, even before the moratorium the doughty farmers and ranchers working still closer to the site had, as Lawrence Livermore physicist Clifford Olsen rather dismissively puts it, “started to grumble a bit” about the effect they believed the fallout was having on their animals and crops and possibly their own bodies and those of their children. And now an international environmentalist movement was beginning to arise in response to Rachel Carson’s Silent Spring. In one of his last major acts before his assassination, President Kennedy in October of 1963 signed along with Soviet Premier Khrushchev the Partial Nuclear Test Ban Treaty that required all future nuclear tests to take place underground.

But never fear, the good times were hardly over at the Nevada Test Site. The scientists and engineers there had been experimenting with underground explosions for some years already in anticipation of this day that the more politically aware among them had begun to see as inevitable. Thus they were more than prepared to continue full-speed-ahead with a new regime of underground testing. The number of shots actually increased considerably during the 1960s, often clipping along at a steady average of one per week or more. Las Vegas, meanwhile, was still not allowed to forget about the presence of the test site. Residents grew accustomed to tremors that cracked plaster and made high-rises sway disconcertingly, phenomena that came to be known as “seismic fallout.” As the political mood shifted over the course of the decade, the number of complaints grew steadily, especially after a couple of big shots of well over 1 megaton in 1968 that caused serious structural damage to a number of buildings in Las Vegas. One of the most persistent and vociferous of the complainers was the eccentric billionaire recluse Howard Hughes, who was living at the time on the top two floors of the Desert Inn hotel. Hughes marshaled lots of money, employees, and political connections to his cause during the late 1960s, but was never able to stop or even slow the testing.

As for the environmental impact of this new breed of underground tests, the news is mixed. While neither is exactly ideal, it’s obviously preferable from an environmental standpoint to be exploding atomic bombs underground rather than in the open air. A whole new applied sub-science of geophysics, the discipline of nuclear “containment,” evolved out of efforts to, well, contain the explosions — to keep any radioactive material at all from “venting” to the surface during an explosion or “seeping” to the surface during the hours, months, and years afterward. And yet the attitudes of the folks working on the shots can still sound shockingly cavalier today. About 30 percent of the underground tests conducted during the 1960s leaked radioactivity to the surface to one degree or another. Those working at the site considered this figure acceptable. Virtually everyone present there during the 1960s makes note of the positive, non-bureaucratic, “can-do” attitude that still persisted into this new era of underground testing. Linda Smith, an administrator at the site, characterizes the attitude thus: “There is such a strong bias to get it done that overrides everything. Is there any profound discussion of should we or shouldn’t we? Is this good for the country? Is it not? There’s no question. You are there to get it done.” Clifford Olsen says, “We were all pretty much sure we were doing the right thing.”

What to make of this lack of introspection? Whatever else we say about it, we shouldn’t condemn the people of the Nevada Test Site too harshly for it. There were heaps of brilliant minds among them, but their backgrounds were very different from those of the people who had worked on the Manhattan Project, many of whom had thought and agonized at length about the nature of the work they were doing and the unimaginable power they were unleashing on the world. The men and few women of the Nevada Test Site, by contrast, had mostly come of age during or immediately after World War II, and had been raised in the very bosom of the burgeoning military-industrial complex. Indeed, most had had their education funded by military or industrial backers for the singular purpose of designing and operating nuclear weapons. This set them apart from their predecessors, who before the Manhattan Project and to a large degree after it — many among that first generation of bomb-makers considered their work in this area essentially done once the first few bombs had been exploded — tended to focus more on “pure” science than on its practical application. A few Brits aside, the Nevada Test Site people were monolithically American; many on the Manhattan Project came from overseas, including lots of refugees from occupied Europe. The Nevada Test Site people were politically conservative, in favor of law and order and strong defense (how could they not be given the nature of their work?); the Manhattan Project people were a much more politically heterogeneous group, with a leader in Robert Oppenheimer who had worked extensively for communist causes. Someone with a background like his would never have been allowed past the front gate of the Nevada Test Site.

Whatever else it was, the Nevada Test Site was just a great place to work. Regarded as they were as the nation’s main bulwark against the Soviet Union, the atomic scientists and all of those who worked with and supported them generally got whatever they asked for. Even the chow was first-rate: at the cafeteria, a dollar would get you all the steaks — good steaks — that you could eat. When all the long hours spent planning and calculating got to be too much, you could always take in a movie or go bowling: a little self-contained all-American town called Mercury grew up with the test site there in the middle of the desert. Its population peaked at about 10,000 during the 1960s, by which time it included in addition to a movie theater and bowling alley a post office, schools, churches, a variety of restaurants, a library, a swimming pool, and hotels — including one named, inevitably, the Atomic Motel. Or you could always take a walk just outside of town amidst the splendid, haunting desolation of the Nevada desert. And for those not satisfied with these small-town pleasures, the neon of Las Vegas beckoned just an hour or so down the highway.

But just as importantly, the work itself was deeply satisfying. After the slide rules and the geological charts were put away, there still remained some of that old childlike pleasure in watching things go boom. Wendell Weart: “I would go back in a tunnel and see what happened to these massive structures that we had put in there, and to see how it manhandled them and just wadded them up into balls. That was impressive.” Nor was the Nevada Test Site entirely an exercise in nuclear nihilism. While weapons development remained always the primary focus, most working there believed deeply in the peaceful potential for nuclear energy — even for nuclear explosions. One of the most extended and extensive test series conducted at the site was known as Operation Plowshare, a reference to “beating swords into plowshares” from the Book of Isaiah. Operation Plowshare eventually encompassed 27 separate explosions, stretching from 1961 to 1973. Its major focus was on nuclear explosions as a means of carrying out grand earth-moving and digging operations, for the creation of trenches and canals among other things. (Such ideas formed the basis of the proposal Edward Teller bandied about during the Panama Canal controversy of the late 1970s to just dig another canal using hydrogen bombs.) Serious plans were mooted at one point to dig a whole new harbor at Cape Thompson in Alaska, more as a demonstration of the awesome potential of hydrogen bombs for such purposes than out of any practical necessity. Thankfully for the delicate oceanic ecosystem thereabouts, cooler heads prevailed in the end.

So, the people who worked at the site weren’t bad people. They were in fact almost uniformly good friends, good colleagues, good workers who were at the absolute tops of their various fields. Almost any one of them would have made a great, helpful neighbor. Nor, as Operation Plowshare and other projects attest, were they bereft of their own certain brand of idealism. If they sound heartlessly dismissive of the Downwinders’ claims and needlessly contemptuous of environmentalists who fret over the damage their work did and may still be doing, well, it would be hard for any of us to even consider the notion that the work to which we dedicated our lives — work which we thoroughly enjoyed, which made us feel good about ourselves, around which many of our happiest memories revolve — was misguided or downright foolish or may have even killed children, for God’s sake. I tend to see the people who worked at the site as embodying the best and the worst qualities of Americans in general, charging forward with optimism and industry and that great American can-do spirit — but perhaps not always thinking enough about just where they’re charging to.

A plume of radioactive debris vents from the Baneberry shot.

The golden age of free-and-easy atomic testing at the Nevada Test Site ended at last on December 18, 1970. That was the day of Baneberry, a routine underground shot of just 10 kilotons. However, due to what the geophysicists involved claim was a perfect storm of factors, its containment model failed comprehensively. A huge cloud of highly radioactive particles burst to the surface and was blown directly over a mining encampment that was preparing the hole for another test nearby. By now the nature of radioactivity and its dangers was much better appreciated than it had been during the time of Operation Crossroads. All of the people at the encampment were put through extended, extensive decontamination procedures. Nevertheless, two heretofore healthy young men, an electrician and a security guard, died of leukemia within four years of the event. Their widows sued the government, resulting in another seemingly endless series of trials, feints, and legal maneuvers, culminating in yet another frustrating non-resolution in 1996: the government was found negligent and the plaintiffs awarded damages, but the deaths of the two men were paradoxically ruled not to have been a result of their radiation exposure. As many in the Downwinder community darkly noted at the time, a full admission of guilt in this case would have left the government open to a whole series of new lawsuits. Thus, they claimed, this strange splitting of the difference.

The more immediate consequence of Baneberry was a six-month moratorium on atomic testing at the Nevada Test Site while the accident was investigated and procedures were overhauled. When testing resumed, it did so in a much more controlled way, with containment calculations in particular required to go through an extended process of peer reviews and committee approvals. The Atomic Energy Commission also began for the first time to put pressure on the scientists and engineers to minimize the number of tests conducted by pooling resources and finding ways to get all the data they could out of each individual shot. The result was a slowdown from that high during the 1960s of about one shot per week to perhaps one or two per month. Old-timers grumbled about red tape and how the can-do spirit of the 1950s and 1960s had been lost, but, perhaps tellingly, there were no more Baneberrys. Of the roughly 350 shots at the Nevada Test Site after Baneberry, only 4 showed any detectable radiation leakage at all.

The site continued to operate right through the balance of the Cold War. The last bomb to be exploded there was also the last exploded to date by the United States: an anticlimactic little 5-kiloton shot on September 23, 1992. By this time, anti-nuclear activists had made the Nevada Test Site one of their major targets, and were a constant headache for everyone who worked there. Included among the ranks of those arrested for trespassing and disruption during the test site’s twilight years are Kris Kristofferson, Martin Sheen, Robert Blake, and Carl Sagan. Needless to say, the mood of the country and the public’s attitude toward nuclear weapons had changed considerably since those rah-rah days of atomic cap guns.

A tunnel waits in readiness, just in case.

Since the mid-1990s the United States, along with Russia and the other established nuclear powers, has observed a long-lasting if non-binding tacit moratorium on all types of nuclear testing (a moratorium which unfortunately hasn’t been observed by the newer members of the nuclear club: India, Pakistan, and North Korea). Stories of the days when mushroom clouds loomed over the Las Vegas Strip and the ground shook with the force of nuclear detonations are now something for long-time Nevada residents to share with their children or grandchildren. With its reason for existence in abeyance, the Nevada Test Site is in a state of largely deserted suspended animation today, Mercury a ghost town inhabited by only a few caretakers and esoteric researchers. One hopes that if Mercury should ever start to buzz with family life and commerce again it’s because someone has found some other, safer purpose for the desert landscape that surrounds it. In the meantime, the tunnels are still kept in readiness, just in case someone decides it’s time to start setting off the bombs again.

(The definitive resource on the history of the Nevada Test Site must be, now and likely forevermore, the University of Nevada at Las Vegas’s amazing Nevada Test Site Oral History Project. I could barely scratch the surface of the hundreds of lengthy interviews there when researching this article. And thanks to Duncan Stevens for his recommendation of Operation Crossroads by Jonathan M. Weisgall. I highly recommend the documentary The Atomic Cafe as a portrait of the era of atomic kitsch.)



T Plus 5: Bombs in Space


Earth Orbit, on a satellite

The satellite you're riding is about twenty feet long, and shaped like a beer can.

Time passes.

A red flash draws your eyes to the ground below, where the contrail of a missile is climbing into the stratosphere.

Time passes.

The maneuvering thrusters on the satellite fire, turning the nose until it faces the ascending missile.

Time passes.

The satellite erupts in a savage glare that lights up the ground below. A beam of violet radiation flashes downward, obliterating the distant missile. Unfortunately, you have little time to admire this triumph of engineering before the satellite's blast incinerates you.

Trinity aims, in 256 K of text adventure, to chronicle at least fifty years of humanity’s relationship to the atomic bomb, as encapsulated into seven vignettes. Two of these, the one dealing with the long-dreaded full-on nuclear war that begins with you on vacation in London’s Kensington Gardens and the one you see above involving a functioning version of Ronald Reagan’s “Star Wars” Strategic Defense Initiative (a proposition that all by itself justifies Trinity‘s “Fantasy” genre tag, as we’ll soon see), are actually speculative rather than historical, taking place at some point in the near future. The satirical comic that accompanies the game also reserves space for Reagan and his dream. It’s a bold choice to put Reagan himself in there, undisguised by pseudonymous machinations like A Mind Forever Voyaging‘s “Richard Ryder” — even a brave one for a company that was hardly in a position to alienate potential players. Trinity, you see, was released at the absolute zenith of Reagan’s popularity. While the comic and the game it accompanies hardly add up to a scathing sustained indictment a la A Mind Forever Voyaging, they do cast him as yet one more Cold Warrior in a conservative blue suit leading the world further along the garden path to the unthinkable. Today I’d like to look at this “orbiting ‘umbrella’ of high technology” that Trinity postulates — correctly — isn’t really going to help us all that much when the missiles start to fly. Along the way we’ll get a chance to explore some of the underpinnings of the nuclear standoff and also the way it came to an anticlimactically sudden end, so thankfully at odds with Trinity‘s more dramatic predictions of the supposed inevitable.

In November of 1985, while Trinity was in development, Ronald Reagan and the new Soviet General Secretary Mikhail Gorbachev met for the first American/Soviet summit of Reagan’s Presidency. The fact that the summit took place at all was almost entirely down to the efforts of Gorbachev, who quite skillfully made it politically impossible for Reagan not to attend. It marked the first time Reagan had actually talked with his Soviet counterpart face to face in his almost five years as President. The two men, as contemporary press reports would have it, “took the measure of each other” and largely liked what they saw, but came to no agreements. The second summit, held in Reykjavik, Iceland, in October of the following year, came to within a hair’s breadth of a major deal that would have started the superpowers down the road to the complete elimination of nuclear armaments and effectively marked the beginning of the end of the Cold War. The only stumbling block was the Strategic Defense Initiative. Gorbachev was adamant that Reagan give it up, or at least limit it to “laboratory testing”; Reagan just as adamantly refused. He repeatedly expressed to both Gorbachev and the press his bafflement at this alleged intransigence. SDI, he said, was to be a technology of defense, a technology for peace. His favorite metaphor was SDI as a nuclear “gas mask.” The major powers of the world had all banned poison gas by treaty after World War I, and, rather extraordinarily, even kept to that bargain through all the other horrors of World War II. Still, no one had thrown away their gas-mask stockpiles, and the knowledge that other countries still possessed them had just possibly helped to keep everyone honest. SDI, Reagan said, could serve the same purpose in the realm of nuclear weapons.
He even made an extraordinary offer: the United States would be willing to give SDI to the Soviets “at cost” — whatever that meant — as soon as it was ready, as long as the Soviets would also be willing to share any fruits of their own (largely nonexistent) research. That way everyone could have nuclear gas masks! How could anyone who genuinely hoped and planned not to use nuclear weapons anyway possibly object to terms like that?

Gorbachev had a different view of the matter. He saw SDI as an inherently destabilizing force that would effectively jettison not one but two of the tacit agreements of the Cold War that had so far prevented a nuclear apocalypse. Would any responsible leader easily accept such an engine of chaos in return for a vague promise to “share” the technology? Would Reagan? It’s very difficult to know what was behind Reagan’s seeming naivete. Certainly his advisers knew that his folksy analogies hardly began to address Gorbachev’s very real and very reasonable concerns. If the shoe had been on the other foot, they would have had the same reaction. Secretary of Defense Caspar Weinberger had demonstrated that in December of 1983, when he had said, “I can’t imagine a more destabilizing factor for the world than if the Soviets should acquire a thoroughly reliable defense against these missiles before we did.” As for Reagan himself, who knows? Your opinion on the matter depends on how you take this famous but enigmatic man whom conservatives have always found as easy to deify as liberals to demonize. Was he a bold visionary who saved his country from itself, or a Machiavellian schemer who used a genial persona to institute an uglier, more heartless version of America? Or was he just a clueless if good-natured and very, very lucky bumbler? Or was he still the experienced actor, out there hitting his marks and selling the policies of his handlers like he had once shilled for General Electric? Regardless, let’s try to do more justice to Gorbachev’s concerns about SDI than Reagan did at their summits.

It’s kind of amazing that the Cold War never led to weapons in space. It certainly didn’t have to be that way. Histories today note what a shock it was to American pride and confidence when the Soviet Union became the first nation to successfully launch a satellite on October 4, 1957. That’s true enough, but a glance at the newspapers from the time also reveals less abstract fears. Now that the Soviets had satellites, people expected them to weaponize them, to use them to start dropping atomic bombs on their heads from space. One rumor, which amazingly turned out to have a basis in fact, claimed the Soviets planned to nuke the Moon, leading to speculation on what would happen if their missile were to miss the surface, boomerang around the Moon, and come back to Earth — talk about being hoist by one’s own petard! The United States’s response to the Soviets’ satellite was par for the course during the Cold War: panicked, often ill-considered activity in the name of not falling behind. Initial responsibility for space was given to the military. The Navy and the Air Force, who often seemed to distrust one another more than either did the Soviets, promptly started squabbling over who owned this new seascape or skyscape, which depending on how you looked at it and how you picked your metaphors could reasonably be assumed to belong to either. While the Naval Research Laboratory struggled to get the United States’s first satellite into space, the more ambitious dreamers at the Air Force Special Weapons Center made their own secret plans to nuke the Moon as a show of force and mulled the construction of a manned secret spy base there.

But then, on July 29, 1958, President Eisenhower signed the bill that would transform the tiny National Advisory Committee for Aeronautics into the soon-to-be massive National Aeronautics and Space Administration — NASA. While NASA’s charter duly charged the new agency with making any “discoveries” available for “national defense” and with “the preservation of the role of the United States as a leader in aeronautical and space science and technology,” those goals came only after more high-toned abstractions like “the expansion of human knowledge” and the use of space for “peaceful and scientific purposes.” NASA was something of an early propaganda coup at a time when very little seemed to be going right with astronautics in the United States. The Soviet leadership had little choice but to accept the idea, publicly at least, of space exploration as a fundamentally peaceful endeavor. In 1967 the United States and the Soviet Union became signatories along with many other nations to the Outer Space Treaty that enshrined the peaceful status quo into international law. By way of compensation, the first operational ICBMs had started to come online by the end of the 1950s, giving both superpowers a way of dealing impersonal death from the stratosphere without having to rely on wonky satellites.

This is not to say that the Cold War never made it into space in any form. Far from it. Apollo, that grandest adventure of the twentieth century, would never have happened without the impetus of geopolitics. The Apollo 11 astronauts may have left a message on the Moon saying they had “come in peace for all mankind,” may have even believed it at some level, but that was hardly the whole story. President Kennedy, the architect of it all, had no illusions about the real purpose of his Moon pronouncement. “Everything that we do ought to be really tied into getting onto the Moon ahead of the Russians,” he told NASA Administrator James Webb in 1962. “Otherwise we shouldn’t be spending this kind of money because I’m not that interested in space.” The Moon Race, like war, was diplomacy through other means. As such, the division between military and civilian was not always all that clear. For instance, the first Americans to fly into orbit, like the first Soviets, did so mounted atop repurposed ICBMs.

Indeed, neither the American nor the Soviet military had any interest in leaving space entirely to the civilians. If one of the goals of NASA’s formation had been to eliminate duplications of effort, it didn’t entirely succeed. The Air Force in particular proved very reluctant to give up on their own manned space efforts, developing during the 1960s the X-15 rocket plane that Neil Armstrong among others flew to the edge of orbit, the cancelled Dyna-Soar space plane, and even a manned space station that also never got off the drawing board. Planners in both the United States and the Soviet Union seemed to treat the 1967 Outer Space Treaty as almost a temporary accord, waiting for the other shoe to drop and for the militarization of space to begin in earnest. I’ve already described in an earlier article how, once the Moon Race was over, NASA was forced to make an unholy alliance with the Air Force to build the space shuttle, whose very flight profile was designed to allow it to avoid space-based weaponry that didn’t yet actually exist.

Yet the most immediate and far-reaching military application of space proved to be reconnaissance satellites. Well before the 1960s were out these orbiting spies had become vital parts of the intelligence apparatus of both the United States and the Soviet Union, as well as vital tools for the detection of ICBM launches by the other side — yet another component of the ever-evolving balance of terror. Still, restrained by treaty, habit, and concern over what it might make the other guys do, neither of the superpowers ever progressed to the logical step of trying to shoot down those satellites that were spying on their countries. If you had told people in 1957 that there would still be effectively no weapons in space almost thirty years later, that there would never have been anything even remotely resembling a battle in space, I think they would have been quite surprised.

But now SDI had come along and, at least in the Soviets’ view, threatened to undermine that tradition. They need only take at face value early reports of SDI’s potential implementations, which were all over the American popular media by the time of Reagan’s 1984 reelection campaign, to have ample grounds for concern. One early plan, proposed in apparent earnest by a committee who may have seen The Battle of Britain (or Star Wars) a few too many times, would have the United States and its allies protected by squadrons of orbiting manned fighter planes, who would rocket to the rescue to shoot down encroaching ICBMs, their daring pilots presumably wearing dashing scarves and using phrases like “Tally ho!” A more grounded plan, relatively speaking, was the one for hundreds of “orbiting battle stations” equipped with particle-beam weapons or missiles of their own — hey, whatever works — to pick off the ICBMs. Of course, as soon as these gadgets came into being the Soviets would have to develop gadgets of their own to try to take them out. Thus a precious accord would be shattered forever. To the Soviets, SDI felt like a betrayal, a breaking of a sacred trust that had so far kept people and satellites in space from having to shoot at each other and in doing so had just possibly prevented the development of a new generation of horrific weaponry.

And yet this was if anything the more modest of the two outrages they saw being inflicted on the world by SDI. The biggest problem was that it could be both a symptom and a cause of the ending of the old MAD doctrine — Mutually Assured Destruction — that had been the guiding principle of both superpowers for over twenty years and that had prevented them from blowing one another up along with the rest of the world. On its surface, the MAD formulation is simplicity itself. I have enough nuclear weapons to destroy your country — or at least to do unacceptable damage to it — and a window of time to launch them at you between the time I realize that you’ve launched yours at me and the time that yours actually hit me. Further, neither of us has the ability to stop the missiles of the other — at least, not enough of them. Therefore we’d best find some way to get along and not shoot missiles at each other. One comparison, so favored by Reagan that he drove Gorbachev crazy by using it over and over again at each of their summits, is that of two movie mobsters with cocked and loaded pistols pointed at each other’s heads.

That well-intentioned comparison is also a rather facile one; the difference is a matter of degree so vast it becomes a difference in kind. Many of us had MAD, that most fundamental doctrine of the Cold War, ingrained in us as schoolchildren to such a degree that it might be hard for us to really think about its horribleness anymore. Nevertheless, I’d like for us to try to do so now. Let’s think in particular about its basic psychological prerequisites. In order for the threat of nuclear annihilation to be an effective deterrent, in order for it never to be carried out, it must paradoxically be a real threat, one which absolutely, unquestionably would be carried out if the order was given. If the other side was ever to suspect that we were not willing to destroy them, the deterrent would evaporate. So, we must create an entire military superstructure, a veritable subculture, of many thousands of people all willing to unquestioningly annihilate tens or hundreds of millions of people. Indeed, said annihilation is the entire purpose of their professional existence. They sit in their missile silos or in their ready rooms or cruise the world in their submarines waiting for the order to push that button or turn that key that will quite literally end existence as they know it, insulated from the incalculable suffering that action will cause inside the very same sorts of “clean, carpeted, warmed, and well-lighted offices” that Reagan once described as the domain of the Soviet Union’s totalitarian leadership alone. If the rise of this sort of antiseptic killing is the tragedy of the twentieth century, the doctrine of MAD represents it taken to its well-nigh incomprehensible extreme.

MAD, requiring as it did people to be always ready and able to carry out genocide so that they would not have to carry out genocide, struck a perilous psychological balance. Things had the potential to go sideways when one of these actors in what most people hoped would be Waiting for Godot started to get a little bit too ready and able — in short, when someone started to believe that he could win. See for example General Curtis LeMay, head of the Strategic Air Command from its inception until 1965 and the inspiration for Dr. Strangelove‘s unhinged General Jack D. Ripper. LeMay believed to his dying day that the United States had “lost” the Cuban Missile Crisis because President Kennedy had squandered his chance to finally just attack the Soviet Union and be done with it; talked of the killing of 100 million human beings as a worthwhile trade-off for the decapitation of the Soviet leadership; openly campaigned for and sought ways to covertly acquire the metaphorical keys to the nuclear arsenal, to be used solely at his own dubious discretion. “If I see that the Russians are amassing their planes for an attack, I’m going to knock the shit out of them before they take off the ground,” he once told a civil-defense committee. When told that such an action would represent insubordination to the point of treason, he replied, “I don’t care. It’s my policy. That’s what I’m going to do.” Tellingly, Dr. Strangelove itself was originally envisioned as a realistic thriller. The film descended into black comedy only when Stanley Kubrick started his research and discovered that so much of the reality was, well, blackly comic. Much in Dr. Strangelove that moviegoers of 1964 took as satire was in fact plain truth.

If the belief by a single individual that a nuclear war can be won is dangerous, an institutionalized version of that belief might just be the most dangerous thing in the world. And here we get to the heart of the Soviets’ almost visceral aversion to SDI, for it seemed to them and many others a product of just such a belief.

During the mid-1970s, when détente was still the watchword of the day, a group of Washington old-timers and newly arrived whiz kids formed something with the Orwellian name of The Committee on the Present Danger. Its leading light was one Paul Nitze. A name few Americans then or now are likely to recognize, Nitze had been a Washington insider since the 1940s and would remain a leading voice in Cold War policy for literally the entire duration of the Cold War. He and his colleagues, many of them part of a new generation of so-called “neoconservative” ideologues, claimed that détente was a sham, that “the Soviets do not agree with the Americans that nuclear war is unthinkable and unwinnable and that the only objective of strategic doctrine must be mutual deterrence.” On the contrary, they were preparing for “war-fighting, war-surviving, and war-winning.” Their means for accomplishing the latter two objectives would be an elaborate civil-defense program that was supposedly so effective as to reduce their casualties in an all-out nuclear exchange to about 10 percent of what the United States could expect. The Committee offered little or no proof for these assertions and many others like them. Many simply assumed that the well-connected Nitze must have access to secret intelligence sources which he couldn’t name. If so, they were secret indeed. When the CIA, alarmed by claims of Soviet preparedness in the Committee’s reports that were completely new to them, instituted a two-year investigation to get to the bottom of it all, they couldn’t find any evidence whatsoever of any unusual civil-defense programs, much less any secret plans to start and win a nuclear war. It appears that Nitze and his colleagues exaggerated wildly and, when even that wouldn’t serve their ends, just made stuff up. (This pattern of “fixing the intelligence” would remain with Committee veterans for decades, leading most notably to the Iraq invasion of 2003.)

Throughout the Carter administration the Committee lobbied anyone who would listen, using the same sort of paranoid circular logic that had led to the nuclear-arms race in the first place. The Soviets, they said, have secretly abandoned the MAD strategy and embarked on a nuclear-war-winning strategy in its place. Therefore we must do likewise. There could be no American counterpart to the magical Soviet civil-defense measures that could somehow protect 90 percent of their population from the blasts of nuclear weapons and the long years of radioactive fall-out that would follow. This was because civil defense was “unattractive” to an “open society” (“unattractiveness” being a strangely weak justification for not doing something in the face of what the Committee claimed was an immediate existential threat, but so be it). One thing the United States could and must do in response was to engage in a huge nuclear- and conventional-arms buildup. That way it could be sure to hammer the Soviets inside their impregnable tunnels — or wherever it was they would all be going — just as hard as possible. But in addition, the United States must come up with a defense of its own.

Although Carter engaged in a major military buildup in his own right, his was nowhere near big enough in the Committee’s eyes. But then came the 1980 election of Ronald Reagan. Reagan took all of the Committee’s positions to heart and, indeed, took most of its most prominent members into his administration. Their new approach to geopolitical strategy was immediately apparent, and immediately destabilizing. Their endless military feints and probes and aggressive rhetoric seemed almost to have the intention of starting a war with the Soviet Union, a war they seemed to welcome whilst being bizarrely dismissive of its potentially world-ending consequences. Their comments read like extracts from Trinity‘s satirically gung-ho accompanying comic. “Nuclear war is a destructive thing, but it is still in large part a physics problem,” said one official. “If there are enough shovels to go around, everybody’s going to make it. It’s the dirt that does it,” said another. Asked if he thought that a nuclear war was “winnable,” Caspar Weinberger replied, “We certainly are not planning to be defeated.” And then, in March of that fraught year of 1983 when the administration almost got the nuclear war it seemed to be courting, came Reagan’s SDI speech.

The most important thing to understand about SDI is that it was always a fantasy, a chimera chased by politicians and strategists who dearly wished it was possible. The only actual scientist amongst those who lobbied for it was Edward Teller, well known to the public as the father of the hydrogen bomb. One of the few participants in the Manhattan Project, which had built the first atomic bomb more than 35 years before, who was still active in public life at the time that Reagan took office, Teller was a brilliant scientist when he wanted to be, but one whose findings and predictions were often tainted by his strident anti-communism and a passion for nuclear weapons that could leave him sounding as unhinged as General LeMay. Teller seldom saw a problem that couldn’t be solved just by throwing a hydrogen bomb or two at it. His response to Carter’s decision to return the Panama Canal to Panama, for instance, was to recommend quickly digging a new one across some more cooperative Central American country using hydrogen bombs. Now, alone amongst his scientific peers, Teller told the Reagan administration that SDI was possible. He claimed that he could create X-ray beams in space by, naturally, detonating hydrogen bombs just so. These could be aimed at enemy missiles, zapping them out of the sky. The whole system could be researched, built, and put into service within five years. As evidence, he offered some inconclusive preliminary results derived from experimental underground explosions. It was all completely ridiculous; we still don’t know how to create such X-ray beams today, decades on. But it was also exactly the sort of superficially credible scientific endorsement — and from the father of the hydrogen bomb, no less! — that the Reagan administration needed.

Reagan coasted to reelection in 1984 in a campaign that felt more like a victory lap, buoyed by “Morning Again in America,” an energetic economy, and a military buildup that had SDI as one of its key components. The administration lobbied Congress to give the SDI project twice the inflation-adjusted funding that the Manhattan Project had received at the height of World War II. With no obviously promising paths at all to follow, SDI opted for the spaghetti approach, throwing lots and lots of stuff at the wall in the hope that something would stick. Thus it devolved into a whole lot of individual fiefdoms with little accountability and less coordination with one another. Dr. Ashton Carter of Harvard, a former Defense Department analyst with full security clearance tasked with preparing a study of SDI for the Congressional Budget Office, concluded that the prospect for any sort of success was “so remote that it should not serve as the basis of public expectations of national policy.” Most of the press, seduced by Reagan’s own euphoria, paid little heed to such voices, instead publishing articles talking about the relative merits of laser and kinetic-energy weapons, battle stations in space, and whether the whole system should be controlled by humans or turned over to a supercomputer mastermind. With every notion as silly and improbable as every other and no direction in the form of a coherent plan from the SDI project itself, everyone could be an expert, everyone could build their own little SDI castle above the stratosphere. When journalists did raise objections, Reagan replied with more folksy homilies about how everyone thought Edison was crazy until he invented the light bulb, appealing to the good old American ingenuity that had got us to the Moon and could make anything possible. The administration’s messaging was framed so as to make objecting to SDI unpatriotic, downright un-American.

And yet even if you thought that American ingenuity would indeed save the day in the end, SDI had a more fundamental problem that made it philosophically as well as scientifically unsound. This most basic objection, cogently outlined at the time by the great astronomer, science popularizer, space advocate, and SDI critic Carl Sagan, was a fairly simple one. Even the most fanciful predictions for SDI must have a capacity ceiling, a limit beyond which the system simply couldn’t shoot down any more missiles. And it would always be vastly cheaper to build a few dozen more missiles than it would be to build and launch and monitor another battle station (or whatever) to deal with them. Not only would SDI not bring an end to nuclear weapons, it was likely to actually accelerate the nuclear-arms race, as the Soviet Union would now feel the need to not only be able to destroy the United States ten times over but be able to destroy the United States ten times over while also comprehensively overwhelming any SDI system in place. Reagan’s public characterization of SDI as a “nuclear umbrella” under which the American public might live safe and secure had no basis in reality. Even if SDI could somehow be made 99 percent effective, a figure that would make it more successful than any other defense in the history of warfare, the 1 percent of the Soviet Union’s immense arsenal that got through would still be enough to devastate many of the country’s cities and kill tens or hundreds of millions. There may have been an argument to make for SDI research aimed at developing, likely decades in the future, a system that could intercept and destroy a few rogue missiles. As a means of protection against a full-on strategic strike, though… forget it. It wasn’t going to happen. Ever. As President Nixon once said, “With 10,000 of these damn things, there is no defense.”

As with his seeming confusion about Gorbachev’s objections to SDI at their summits, it’s hard to say to what degree Reagan grasped this reality. Was he living a fantasy like so many others in the press and public when he talked of SDI rendering ICBMs “impotent and obsolete”? Whatever the answer to that question, it seems pretty clear that others inside the administration knew perfectly well that SDI couldn’t possibly protect the civilian population as a whole to any adequate degree. SDI was in reality a shell game, not an attempt to do an end-run around the doctrine of mutually assured destruction but an attempt to make sure that mutually assured destruction stayed mutually assured when it came to the United States’s side of the equation. Cold War planners had fretted for decades about a nightmare scenario in which the Soviet Union launched a first strike and the United States, due to sabotage, Soviet stealth technology, or some failure of command and control, failed to detect and respond to it in time by launching its own missiles before they were destroyed in their silos by those of the Soviets. SDI’s immediate strategic purpose was to close this supposed “window of vulnerability.” The system would be given, not the impossible task of protecting the vast nation as a whole, but the merely hugely improbable one of protecting those few areas where the missile silos were concentrated. Asked point-blank under oath whether SDI was meant to protect American populations or American missile silos, Pentagon chief of research and engineering Richard DeLauer gave a telling non-answer: “What we are trying to do is enhance deterrence. If you enhance deterrence and your deterrence is credible and holds, the people are protected.” This is of course just a reiteration of the MAD policy itself, not a separate justification for SDI. MAD just kept getting madder.

The essential absurdity of American plans for SDI seems to have struck Gorbachev by the beginning of 1987. Soviet intelligence had been scrambling for a few years by then, convinced that there had to be some important technological breakthrough behind all of the smoke the Reagan administration was throwing. It seems that at about this point they may have concluded that, no, the whole thing really was as ridiculous as it seemed. At any rate, Gorbachev decided it wasn’t worth perpetuating the Cold War over. He backed away from his demands, offering the United States the opportunity to continue working on SDI if it liked, demanding only a commitment to inform the Soviet Union and officially back out of some relevant treaties (which might very possibly have to include the 1967 Outer Space Treaty that forbade nuclear weapons in space) if it decided to actually implement it. Coupled with Gorbachev’s soaring global popularity, it was enough to start getting deals done. Reagan and Gorbachev signed their first substantial agreement, to eliminate between them 2692 missiles, in December of 1987. More would follow, accompanied by shocking liberalization and reform behind the erstwhile Iron Curtain, culminating in the night of November 9, 1989, when the Berlin Wall, long the tangible symbol of division between East and West, came down. Just like that, almost incomprehensible in its suddenness, the Cold War was over. Trinity stands today as a cogent commentary on that strange shadow conflict, but it proved blessedly less than prescient about the way it would end. Whatever else is still to come, there will be no nuclear war between the United States of America and the Union of Soviet Socialist Republics.

If the end of the Cold War was shockingly unexpected, SDI played out exactly as you might expect. The program was renamed to the more modest Ballistic Missile Defense Organization and scaled back dramatically in 1993, by which time it had cost half again as much as the Manhattan Project — a staggering $30 billion, enough to make it the most expensive research program in history — and accomplished little. The old idea still resurfaces from time to time, but the fervor it once generated is all but forgotten now. SDI, like most history, is now essentially a footnote.

A more inspiring closing subject is Mikhail Gorbachev. His Nobel Peace Prize notwithstanding, he strikes me as someone who hasn’t quite gotten his due yet from history. There are many reasons that the Cold War came to an end when it did. Prominent among them was the increasingly untenable Soviet economy, battered during the decade by “the Soviet Union’s Vietnam” (Gorbachev’s phrase) in Afghanistan, a global downturn in oil prices, and the sheer creaking inertia of many years of, as the old Soviet saying went, workers pretending to work while the state pretended to pay them for it. Nevertheless, I don’t agree with Marx that history is a compendium of economic forces. Many individuals across Eastern Europe stepped forward to end their countries’ totalitarian regimes — usually peacefully, sometimes violently, occasionally at the cost of their lives. But Gorbachev’s shadow overlays all the rest. Undaunted by the most bellicose Presidential rhetoric in two decades, he used politics, psychology, and logic to convince Reagan to sit down with him and talk, then worked with him to shape a better, safer world. While Reagan talked about ending MAD through his chimerical Star Wars, Gorbachev actually did it, by abandoning his predecessors’ traditional intransigence, rolling up his sleeves, and finding a way to make it work. Later, this was the man who chose not to send in the tanks when the Warsaw Pact started to slip away, making him, as Victor Sebestyen put it, one of very few leaders in the history of the world to elect not to use force to maintain an empire. Finally, and although it certainly was never his intention, he brought the Soviet Union in for a soft landing, keeping the chaos to a minimum and keeping the missiles from flying. Who would have imagined Gorbachev was capable of such vision, such — and I don’t use this word lightly — heroism? Who would have imagined he could weave his way around the hardliners at home and abroad to accomplish what he did?
Prior to assuming office in 1985, he was just a smart, capable Party man who knew who buttered his bread, who, as he later admitted, “licked Brezhnev’s ass” alongside his colleagues. And then when he got to the top he looked around, accepted that the system just wasn’t working, and decided to change it. Gorbachev reminds us that the hero is often not the one who picks up a gun but the one who chooses not to.

(In addition to the sources listed in the previous article, Way Out There in the Blue by Frances FitzGerald is the best history I’ve found of SDI and its politics.)





During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was, to the extent he was known at all, considered a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the National Association of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against which most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan took office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives. Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against humanity,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily lead to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became a substantial hit, Infocom’s last game to crack 100,000 in sales. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine that allowed twice the content of the normal system at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.

The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico in Albuquerque, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually if not inordinately long development cycle for an Infocom game, stretching from Moriarty’s first foray into the Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim determination that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, I would guess that a majority of hardcore Infocom fans would likely name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored — when they’re explored at all — in all that thoughtful of a way. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me at an odd disconnect from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything the 18 games not named A Mind Forever Voyaging that preceded it had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all really that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity‘s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a wandering passive observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As far as atomic history, I find I’ve amassed a library almost as extensive as Trinity‘s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)



Out of the Frying Pan…

Activision, as embodied by Jim Levy (left), weds Infocom, as embodied by Joel Berez (right).

Activision’s first couple of years as a home-computer publisher were, for all their spirit of innovation and occasional artistic highs, mildly disappointing in commercial terms. While by no means all of their games of this period were flops, the only outsize hit among them was David Crane’s Ghostbusters. Activision was dogged by their own history; even selling several hundred thousand copies of Ghostbusters could feel anticlimactic when compared with the glory days of 1980 to 1983, when million-sellers were practically routine. And the company dragged along behind it more than psychological vestiges of that history. On the plus side, Jim Levy still had a substantial war chest made up of the profits socked away during those years with which to work. But on the minus side, the organization he ran was still too big, too unwieldy in light of the vastly reduced number of units they were moving these days in this completely different market. Levy was forced to authorize a painful series of almost quarterly layoffs as the big sales explosions stubbornly refused to come and Activision’s balance sheets remained in the red. Then came the departure of Alan Miller and Bob Whitehead to form the lean, mean Accolade, and that company’s galling instant profitability. Activision found themselves cast in the role of the bloated Atari of old, Jim Levy himself in that of the hated Ray Kassar. Nobody liked it one bit.

Levy and his board therefore adopted a new strategy for the second half of 1985: they would use some of that slowly dwindling war chest to acquire a whole stable of smaller developers, who would nevertheless continue to release games on their own imprints to avoid market saturation. The result would be more and more diverse games, separated into lines that would immediately identify for consumers just what type of game each title really was. In short order, Activision scooped up Gamestar, a developer of sports games. They also bought Creative Software, another tiny but stalwart industry survivor. Creative would specialize in home-oriented productivity software; ever since Brøderbund had hit it big with Bank Street Writer and The Print Shop publishers like Activision had been dreaming of duplicating their success. And then along came Infocom.

Joel Berez happened to run into Levy, accidentally or on purpose, during a business trip to Chicago in December of 1985. By this time Infocom’s travails were an open secret in the industry. Levy, by all accounts a genuine fan of Infocom’s games and, as Activision games like Portal attest, a great believer in the concept of interactive literature, immediately made it clear that Activision would be very interested in acquiring Infocom. Levy’s was literally the only offer on the table. After it dawned on them that Infocom alone could not possibly make a success out of Cornerstone, Al Vezza and his fellow business-oriented peers on Infocom’s board had for some time clung to the pipe dream of selling out to a big business publisher like Lotus, WordPerfect, or even Microsoft. But by now it was becoming clear even to them that absolutely no one cared a whit about Cornerstone, that the only value in Infocom was the games and the company’s still-sterling reputation as a game developer. However, those qualities, while by no means negligible, were outweighed in the eyes of most potential purchasers by the mountain of debt under which Infocom now labored, as well as by the worrisome shrinking sales of the pure text adventures released recently both by Infocom and their competitors. These were also very uncertain times for the industry in general, with many companies focused more on simple survival than expansion. Only Levy claimed to be able to sell his board on the idea of an Infocom acquisition. For Infocom, the choice was shaping up to be a stark one indeed: Activision subsidiary or bankruptcy. As Dave Lebling wryly said when asked his opinion on the acquisition, “What is a drowning man’s opinion of a life preserver?”

Levy was as good as his word. He convinced Activision’s board — some, especially in a year or two, might prefer to say “rammed the scheme through” — and on February 19, 1986, the two boards signed an agreement in principle for Activision to acquire Infocom by giving approximately $7.5 million in Activision stock to Infocom’s stockholders in return for all of their shares. This was, for those keeping score, about one-third what Simon & Schuster had been willing to pay barely a year before. But, what with their mountain of debt and flagging sales, Infocom’s new bargaining position wasn’t exactly strong; Simon & Schuster was now unwilling to do any deal at all, having already firmly rejected Vezza and Berez’s desperate, humiliating attempts to reopen the subject. As it was, Infocom considered themselves pretty lucky to get what they did; Levy could have driven a much harder bargain had he wanted to. And so Activision’s lawyers and accountants went to work to finalize things, and a few months later Infocom, Inc., officially ceased to exist. That fateful day was June 13, 1986, almost exactly seven years after a handful of MIT hackers had first gotten together with a vague intention to do “something with microcomputers.” It was also Friday the Thirteenth.

Still, even the most superstitious amongst Infocom’s staff could see little immediate ground for worry. If they had to give up their independence, it was hard to imagine a better guy to answer to than Jim Levy. He just “got” Infocom in a way that Al Vezza, for one, never had. He understood not only what the games were all about but also the company’s culture, and he seemed perfectly happy just to let both continue on as they were. During the due-diligence phase of the acquisition, Levy visited Infocom’s offices for a guided tour conducted, as one of his last official acts at Infocom, by an Al Vezza who visibly wanted nothing more by this time than to put this whole disappointing episode of his life behind him and return to the familiarity of MIT. In the process of duly demonstrating a series of games in progress, he came to Steve Meretzky’s next project, a risqué science-fiction farce (later succinctly described by Infocom’s newsletter as “Hitchhiker’s Guide with sex”) called Leather Goddesses of Phobos. “Of course, that’s not necessarily the final name,” muttered Vezza with embarrassment. “What? I wouldn’t call it anything else!” laughed a delighted Levy, to almost audible sighs of relief from the staffers around him.

Levy not only accepted but joined right in with the sort of cheerful insanity that had always made Vezza so uncomfortable. He cemented Infocom’s loyalty via his handling of the “InfoWedding” staffers threw for him and Joel Berez, who took over once again as Infocom’s top manager following Vezza’s unlamented departure. A description of the blessed nuptials appeared in Infocom’s newsletter.

In a dramatic affirmation of combinatorial spirit, Activision President James H. Levy and Infocom President Joel M. Berez were merged in a moving ceremony presided over by InfoRabbi Stuart W. Galleywitz. Infocommies cheered, participated in responsive readings from Hackers (written by Steven Levy — no relation to Jim), and threw rice at the beaming CEOs.

Berez read a tone poem drawn from the purple prose of several interactive-fiction stories, and Levy responded with a (clean) limerick.

The bride wore a veil made from five yards of nylon net, and carried artificial flowers. Both bride and groom wore looks of bemused surprise.

After a honeymoon at Aragain Falls, the newly merged couple will maintain their separate product-development and marketing facilities in Mountain View, California, and Cambridge, Massachusetts (i.e., we’ll still be the Infocom you know and love).

Queried about graphics in interactive-fiction stories, or better parsers in Little Computer People, the happy couple declined comment, but smiled enigmatically.

Soon after, Levy submitted to a gently mocking “Gruer’s Profile” (a play on a long-running advertising series by Dewar’s whiskey) prepared for the newsletter:

Hobby: Collecting software-development companies.

Latest Book Read: The Ballyhoo hint book.

Latest Accomplishment: Finding foster homes for all the Little Computer People.

Favorite Infocom Game: Cornerstone.

Why I Do What I Do: Alimony.

Quote: “People often mistake me for Bruce Willis.”

Profile: Charismatic. A real motivator. Looks great in a limousine.

His Drink: “Gruer’s Dark,” right out of a canteen. “Its taste blends perfectly with the sense of satisfaction I feel in knowing that I am now the kingpin of interactive fiction.”

Levy seemed to go out of his way to make the Infocom folks feel respected and comfortable within his growing Activision family. He was careful, for instance, never to refer to this acquisition as an acquisition or, God forbid, a buy-out. It was always a “merger” of apparent equals. The recently departed Marc Blank, who kept in close touch with his old colleagues and knew Levy from his time in the industry, calls him today a “great guy.” Brian Moriarty considered him “fairly benign” (mostly harmless?), with a gratifying taste for “quirky, interesting games”: “He seemed like a good match. It looked like we were going to be okay.”

Jim Levy's "Gruer's Profile"

This period immediately after the Activision acquisition would prove to be an Indian summer of sorts between a very difficult period just passed and another very difficult one to come. In some ways the Imps had it better than ever. With Vezza and his business-oriented allies now all gone, Infocom was clearly and exclusively a game-development shop; all of the cognitive dissonance brought on by Cornerstone was at long last in the past. Now everyone could just concentrate on making the best interactive fiction they possibly could, knowing as they did so that any money they made could go back into making still better, richer virtual worlds. Otherwise, things looked largely to be business as usual. The first game Infocom published as an Activision subsidiary was Moriarty’s Trinity, in my opinion simply the best single piece of work they would ever manage, and one which everyone inside Infocom recognized even at the time as one of their more “special” games. As omens go, that seemed as good as they come, certainly more than enough to offset any concerns about that unfortunate choice of Friday the Thirteenth.

Activision’s marketing people did almost immediately offer some suggestions — and largely very sensible ones — to Infocom. Some of these Mike Dornbrook’s marketing people greeted with open arms; they were things for which they had been lobbying the Imps, usually without much success, for years now. Most notably, Activision strongly recommended that Infocom take a hard look at a back catalog that featured some of the most beloved classics in the young culture of computer gaming and think about how to utilize the goodwill and nostalgia they engendered. The Zork brand in particular, still by far the most recognizable in Infocom’s arsenal, had been, in defiance of all marketing wisdom, largely ignored since the original trilogy concluded back in 1982. Now Infocom prepared a pair of deluxe limited-edition slip-cased compilations of the Zork and Enchanter trilogies, designed not only to give newcomers a convenient point of entry but also, and almost more importantly, to appeal to the collecting instinct that motivated (and still motivates) so many of their fans. Infocom had long since learned that many of their most loyal customers didn’t generally get all that far in the games at all. Some didn’t even play them all that much. Many just liked the idea of them, liked to collect them and see them standing there on the shelf. Put an old game in a snazzy new package and many of them would buy it all over again.

Infocom also got to work at long last — in fact, literally years after they should have in Dornbrook’s view — on a new game to bear the Zork name. While they were at it, they assigned Meretzky to bring back Infocom’s single most beloved character, the cuddly robot Floyd, in the sequel to Planetfall that that game’s finale had promised back in 1983 — just as soon as he was done with Leather Goddesses, that is, a game for which Infocom, in deference to the time-honored maxim that Sex Sells, also had very high hopes.

The Infocom/Activision honeymoon period and the spirit of creative and commercial renewal it engendered would last barely six months. The chummy dialogue between these two offices on opposite coasts would likewise devolve quickly into decrees and surly obedience — or, occasionally, covert or overt defiance. But that’s for a future article. For now we’ll leave Infocom to enjoy their Indian summer of relative content, and begin to look at the games that this period produced.

(Largely the usual Infocom sources this time out: Jason Scott’s Get Lamp interviews and Down From the Top of Its Game. The Dave Lebling quote comes from an interview with Adventure Classic Gaming. The anecdote about Vezza and Levy comes from Steve Meretzky’s interview for Game Design, Theory & Practice by Richard Rouse III.

Patreon supporters: this article is a bit shorter than the norm simply because that’s the length that it “wanted” to be. Because of that, I’m making it a freebie. In the future I’ll continue to make articles of less than 2500 words free.)


Posted on December 29, 2014 in Digital Antiquaria, Interactive Fiction




‘Tis true my form is something odd,
But blaming me is blaming God;
Could I create myself anew
I would not fail in pleasing you.

– poem by Joseph Merrick, “The Elephant Man”


This article does contain some spoilers for Ballyhoo!

Ballyhoo, a low-key mystery written by a new Implementor, was the last game ever released by an independent Infocom. When it appeared in February of 1986, Al Vezza and Joel Berez were desperately trying to reel in their lifeline of last resort, a competitor interested in acquiring this imploding company that had fallen from such heights so precipitously in just a year’s time. Having come in like a lion with Zork, Infocom, Inc., would go out like a lamb with Ballyhoo; it would go on to become one of their least remembered and least remarked-upon games. We’ll eventually get to some very good reasons for Ballyhoo to be regarded as one of the lesser entries in the Infocom canon. Still, it’s also deserving of more critical consideration than it’s generally received for its unique tone and texture and, most importantly, for a very significant formal innovation. In fact, discounting as relative trivialities some small-scale tinkering with abbreviations and the like, and as evolutionary dead ends a blizzard of largely unsuccessful experiments that would mark Infocom’s final years, said innovation would be the last one to significantly impact the art of the text adventure as it would evolve after the commercial glory years of the 1980s.

If Ballyhoo is one of Infocom’s more forgotten games, its creator, Jeff O’Neill, is certainly the Forgotten Implementor. His perspective is conspicuously absent from virtually every history written of the company in the last quarter century. Most notably, he was the one Imp who declined to be interviewed for Jason Scott’s Get Lamp project. For reasons that we won’t dwell on here, O’Neill remains deeply embittered by his time with Infocom. Incredible as this may sound to those of us today who persist in viewing the company’s brief life as a sort of Camelot, that time in his own life is one that O’Neill would rather forget, as I learned to my disappointment when I reached out to him before writing this article. He has a right to his silence and his privacy, so we’ll leave it at that and confine ourselves to the public details.

O’Neill, at the time a frustrated young journalist looking for a career change, was hired by Infocom in the spring of 1984, just one of what would prove to be a major second wave of talent — including among their ranks Jon Palace and Brian Moriarty — who arrived at about the same time. Like Moriarty, O’Neill’s original role was a practical one: he became one of Infocom’s in-house testers. Having proved himself by dint of talent and hard work and the great ideas for new games he kept proposing, within about a year he became the first of a few who would eventually advance out of the testing department to become late-period Imps after Infocom’s hopes for hiring outside writers to craft their games proved largely fruitless.

Whether we attribute it to his degree in Journalism or innate talent, O’Neill had one of the most delicate writerly touches to be found amongst the Imps. Ballyhoo adds a color to Infocom’s emotional palette that we haven’t seen before: world-weary melancholy. The setting is a spectacularly original one for any adventurer tired of dragons and spaceships: an anachronistic, down-at-the-heels circus called “The Traveling Circus That Time Forgot, Inc.” The tears behind a clown’s greasepaint facade, as well as the tawdry desperation that is the flip side of “the show must go on” for performers and performances past their time, have been amply explored in other art forms. Yet such subtle shades of feeling have been only rarely evoked by games before or after Ballyhoo. Ballyhoo, in the words of one of its own more memorable descriptive passages, “exposes the underside of circus life — grungy costumes strung about, crooked and cracked mirrors, the musty odor of fresh makeup mingled with clown sweat infusing the air.” Given what was going on around O’Neill as he wrote the game, it feels hard not to draw parallels with Infocom’s own brief ascendency and abrupt fall from grace: “Your experience of the circus, with its ballyhooed promises of wonderment and its ultimate disappointment, has been to sink your teeth into a candy apple whose fruit is rotten.”

The nihilistic emptiness at the heart of the circus sideshow, the tragedy of these grotesques who parade themselves before the public because there’s no other alternative available to them, has likewise been expressed in art stretching at least as far back as Freaks, a 1932 film directed by Tod Browning that’s still as shocking and transgressive as it is moving today. Another obvious cultural touchstone, which would have been particularly fresh in the mid-1980s thanks to Bernard Pomerance’s 1979 play and David Lynch’s 1980 film, is the story of the so-called “Elephant Man”: Joseph Merrick, a gentle soul afflicted with horrendous deformities who was driven out into the street by his father at age 17 and forced to sell himself to various exploiters as a traveling “human curiosity.” Some say that Merrick died at age 27 in 1890 because he insisted on trying to lie down to sleep — something his enormous, misshapen head would not allow — as part of his fruitless lifelong quest just to “be like other people.”

Ballyhoo’s own collection of freaks is less extreme but can be almost as heartbreaking. There’s Comrade Thumb, the Russian midget who’s been crammed into a Czarist general’s uniform and sent out to do tricks. Like Merrick, whose deformities made speech almost impossible, Thumb can’t even communicate with his fellow humans; he speaks only Russian (I’ve had just a taste of this sort of linguistic isolation at times since leaving the United States, and know how hard it can be). But saddest of all is the case of Tina, the “827 pounds of feminine charm” who’s become the circus’s token fat woman.

West Half of Fat Lady
Dominating this once spacious room, geographic in her enormity, mountainous in her irreducibility, the fat lady sits (though no chair is visible) breathtakingly to the east. Paths around the attraction lead northeast and southeast. The exit is south.

>examine lady
The fat lady is wearing a big top, and the expression on her face is sad and distant, almost Rushmorean. She appears to be holding a small radio up to her ear.

It's a long haul, the scenery changing little. Eventually you arrive at ...

East Half of Fat Lady
The fat lady sits (though no chair is visible) breathtakingly to the west. Paths around the attraction lead northwest and southwest.

>give granola bar to tina
Tina is quick to confiscate the one-dollar-and-85-cent granola bar from your hand and grinds it up without hesitation. Turning her far-away gaze slowly in your direction, she seems to notice you for the first time this evening.

>tina, hello
She merely sighs wistfully, which creates a gale-force gust, and cranes her wrecking-ball-sized hand over to you.

>take hand
As you take hold, the fat lady's hand becomes relaxed, its full weight now residing in your arms like a sandbag and making your knees buckle.

>shake hand
Though unable to budge the fat lady's hand, your friendly intentions are nevertheless understood. The fat lady appears quite taken by your kindnesses. She clasps both her hands up to her chins, and stares ahead in teary silence.

If these passages, far from politically correct but by no means heartless, make you a bit uncomfortable, well, I like to think that they were meant to. After all, we’re among the sideshow gawkers. I’ve known people like Tina, cut off by their weight or other issues from a normal life, destined always to be noticed but never to be included. I recognize that wistful sigh, that far-off stare, that above-it-all stance that becomes their only defense. As for people like the circus’s manager Mr. Munrab — read the name backward — who we learn elsewhere “orders the roustabout to increase the frequency of her [Tina’s] feeding” every time she tries to go on a diet…. well, I’d like to think there’s a special circle of Hell for him along with Tom Norman, the man who stuck Joseph Merrick in a cage and set it up for the punters on Whitechapel Road.

I don’t want to give the impression that Ballyhoo is all doom and gloom, and certainly not that it’s entirely one-note in its mood. As Tina’s passages show, the game takes place in a vaguely surreal David Lynch-ian realm that’s tethered to but not quite the same as our own reality. This gives ample room for some flights of fancy that don’t always have to make us feel bad. O’Neill’s love of abstract wordplay, the theme around which his second and final work of interactive fiction would be built, also pops up in Ballyhoo from time to time. When you find yourself with an irresistible craving for something sweet, for instance, it takes the form of a literal monkey on your back who drives you to the concession stand. O’Neill also toys with the parser and the player sitting behind it to a degree not seen in an Infocom game since The Hitchhiker’s Guide to the Galaxy. Here’s what happens when you come upon a “piece of wood” that turns out to be a mousetrap:

>get wood
You have just encountered that brief instant of time between the realization that you've caused yourself excruciating Pain and the actual onslaught of such Pain, during which time most people speak with exclamation points and ... well, say things like ...

Easy there! You're jeopardizing our "G" rating.

Bravisimo! Once more now, with feeling.

Cut! Cut! Okay, that's a wrap.

There’s even a fake death message, just the sort of faintly cruel player trickery that would have made Adams proud.

Indeed, there’s a little bit of bite, even a faint note of misanthropy, to O’Neill’s writing that’s largely missing from that of the other Imps. Your fellow circus-goers are uniformly boorish and boring. One or two situations, as well as the logical illogic needed to escape from them, smack of Infocom’s later social satire Bureaucracy, to which O’Neill (amongst many others) would make contributions.

>enter short line
You are now standing at the tail end of the short line.

Time passes...

The face of the man ahead of you lights up as he spots something. "Hey, guys! It's ME, Jerry," he yells to a sizable group nearby, and they approach.

Time passes...

"Haven't seen you turkeys in years. Howda hell are you guys?" They all reintroduce themselves. "Hey -- you clowns thirsty? Get in here, I'll buy y'all beer."

"You sure it's not a problem?" asks the catcher.

"Heck no, just scoot in right here."

With both your resolve and your heaving bosom firm against the crush of interlopers, you are nevertheless forced to backpedal.

Time passes...

Jerry continues backslapping the second baseman.

Time passes...

Jerry continues jiving with the center fielder.

>exit long line
You hear an inner voice whisper, "Do I really want to forfeit my position in the long line?" To which you answer:

You nonchalantly walk away from the long line.

>enter long line
A lot of other people must not have had the same idea as you, as they virtually hemorrhage over to the short line. Steaming to the front of the line, you get a two-dollar-and-25-cent frozen banana pushed at you and are whisked to the side before you can even count your change.

Ballyhoo was Infocom’s fourth game to be given the “Mystery” genre label. As such, it’s also an earnest attempt to solve a real or perceived problem that had long frustrated players of those previous three mysteries. The first of them, Deadline, had exploded the possibilities for adventure games by simulating a dynamic story with independent actors rather than just setting the player loose in a static world full of puzzles to solve; The Witness and Suspect had then continued along the same course. Instead of exploring a geographical space, the player’s primary task became to explore a story space, to learn how this dynamic system worked and to manipulate it to her own ends by judicious, precisely timed interference. While a huge advance that brought a new dimension to the adventure game, this seemingly much more story-oriented approach proved paradoxically difficult to reconcile with the view of Infocom’s games as interactive fiction, as, as their box copy would have it, stories you “woke up inside” and proceeded to experience like the protagonist of a novel. The experience of playing one of these early mysteries was more like that of an editor, or a film director making an adaptation of the novel. You had to take the stories apart piece by piece through probing and experimentation, then put everything back together in a way that would guide the protagonist, from whom you stood at a decided remove, to the optimal ending. That process might offer pleasures all its own, but it kept the player firmly in the realm of puzzle-solver rather than fiction-enjoyer — or, if you like, guiding the fiction became the overarching puzzle. Even Infocom’s most unabashed attempt to create a “literary” work to date, Steve Meretzky’s A Mind Forever Voyaging, became abruptly, jarringly gamelike again when you got to the final section, where you had to head off a sequence of events that would otherwise be the end of you.
In a film or novel based on A Mind Forever Voyaging, this sequence would just chance to play out in just the right way to let Perry Simm escape by the skin of his teeth and save the world in the process. In the game, however, the player was forced to figure out what dramatically satisfying narrative the author wanted to convey, then manipulate events to bring it to fruition, a very artificial process all the way around. Yet the alternative of a static environment given motion only when the player deigned to push on something was even farther from the idea of “interactive fiction” as a layperson might take that phrase. What to do?

Infocom’s answer, to which they first fully committed in Ballyhoo, was to flip the process on its head: to make the story respond to the player rather than always asking the player to respond to the story. Put another way, here the story chases the player rather than the player chasing the story. (Feel free to insert your “in Soviet Russia…” jokes here.) Ballyhoo is another dynamic mystery with its own collection of dramatic beats to work through. Now, though, the story moves forward only when and as the player’s actions make it most dramatically satisfying to do so, rather than ticking along according to its own remorseless timetable. So, for example, Comrade Thumb will struggle to get a drink of water from the public water fountain at the beginning of the game for hundreds of turns if necessary, until the player helps him by giving him a boost. He’ll then toddle off to another location to wait for the player to enter. When and only when she does, he’ll carry off his next dramatic beat. Later, a certain bumbling detective will wander onto the midway and pass out dead drunk just when the needs of the plot, as advanced by the player thus far, demand that he do so. Sometimes these developments are driven directly by the player, but at other times they happen only in the name of dramatic efficiency, of story logic. Rather than asking the player to construct a story from a bunch of component parts, now the author deconstructs the story she wants the player to experience, then figures out how to put it back together on the fly in a satisfying way in response to the player’s own actions — without always making it clear to the player that the story is responding to her rather than unspooling on its own.
Ideally, this should let the player just enjoy the unfolding narrative from her perspective inside the story, which will always just happen to play out in suitably dramatic fashion, full of the close calls and crazy coincidences that are part and parcel of story logic. Virtually unremarked upon at the time, this formal shift would eventually become simply the way that adventure games were done, to the extent that the old Deadline approach stands out as a strange, cruel anomaly when it crops up on rare occasions on the modern landscape.
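For the programming-minded, the difference between the two approaches can be sketched in a few lines of Python: instead of a Deadline-style schedule that fires events at fixed turn numbers, a Ballyhoo-style engine gates each dramatic beat on a predicate over the world state, waiting indefinitely until the player makes it appropriate. This is purely a hypothetical illustration of the general technique, not Infocom’s actual ZIL implementation; every name in it (Beat, run_turn, the Comrade Thumb flags) is invented for the example.

```python
# Hypothetical sketch of a reactive story engine: each beat waits
# on a condition over the world state rather than a fixed turn count.

class Beat:
    def __init__(self, name, condition):
        self.name = name            # label for what happens when the beat fires
        self.condition = condition  # predicate over the world state
        self.fired = False

def run_turn(world, beats):
    """Fire any pending beat whose precondition the player has finally
    satisfied; if none are satisfied, the story simply waits."""
    fired = []
    for beat in beats:
        if not beat.fired and beat.condition(world):
            beat.fired = True
            fired.append(beat.name)
    return fired

# Invented world state loosely modeled on the Comrade Thumb sequence.
world = {"turn": 1, "gave_thumb_boost": False, "player_location": "midway"}

beats = [
    Beat("thumb_drinks", lambda w: w["gave_thumb_boost"]),
    Beat("thumb_next_scene", lambda w: w["gave_thumb_boost"]
         and w["player_location"] == "connection"),
]

# Hundreds of turns can pass with nothing happening at all...
for _ in range(300):
    world["turn"] += 1
    run_turn(world, beats)          # fires nothing: conditions still unmet

world["gave_thumb_boost"] = True
print(run_turn(world, beats))       # ['thumb_drinks']

world["player_location"] = "connection"
print(run_turn(world, beats))       # ['thumb_next_scene']
```

A Deadline-style engine would instead key those beats to `world["turn"]` alone, which is exactly why a player who dawdled could miss the story entirely.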

Depending on how you see these things, you might view this new approach as a major advance or as a disappointment, even as a capitulation of sorts. Early adventure writers, including those at Infocom, were very invested in the idea of their games as simulations of believable (if simplified) worlds. See, for instance, the article which Dave Lebling published in Byte in December of 1980, which, years before Infocom would dub their games “interactive fiction,” repeatedly refers to Zork and the other games like it that Infocom hopes to make as “computerized fantasy simulations.” Or see the flyer found in Zork I itself, which refers to that game as “a self-contained and self-maintaining universe.” To tinker with such a universe, to introduce a hand of God manipulating the levers in the name of drama and affect, felt and still feels wrong to some people. Most, however, have come to accept that pure, uncompromising simulation does not generally lead to a satisfying adventure game. Adventure games may be better viewed as storytelling and puzzle-solving engines — the relative emphasis placed on the former and the latter varying from work to work — wherein simulation elements are useful as long as they add verisimilitude and possibility without adding boredom and frustration, and stop being useful just as soon as the latter qualities begin to outweigh the former.

Which is not to say that this new approach of the story chasing the player is a magic bullet. Virtually everyone who’s played adventure games since Ballyhoo is familiar with the dilemma of a story engine that’s become stuck in place, of going over and over a game’s world looking for that one trigger you missed that will satisfy the game that all is in proper dramatic order and the next act can commence. My own heavily plotted adventure game is certainly not immune to this syndrome, which at its extreme can feel every bit as artificial and mimesis-destroying, and almost as frustrating, as needing to replay a game over and over with knowledge from past lives. Like so much else in life and game design, this sort of reactive storytelling is an imperfect solution, whose biggest virtue is that most people prefer its brand of occasional frustration to others.

And now we’ve come to the point in this article where I need to tell you why, despite pioneering such a major philosophical shift and despite a wonderful setting brought to life by some fine writing, Ballyhoo does indeed deserve its spot amongst the lower tier of Infocom games. The game has some deep-rooted problems that spoil much of what’s so good about it.

The most fundamental issue, one which badly damages Ballyhoo as both a coherent piece of fiction and a playable game, is that of motivation — or rather lack thereof. When the game begins you’re just another vaguely dissatisfied customer exiting the big top along with the rest of the madding crowd. Getting the plot proper rolling by learning about the mystery itself — proprietor Munrab’s young daughter Chelsea has been kidnapped, possibly by one of his own discontented performers — requires you to sneak into a storage tent for no reason whatsoever. You then eavesdrop on a fortuitous conversation which occurs, thanks to Ballyhoo’s new dramatic engine, just at the right moment. And so you decide that you are better equipped to solve the case than the uninterested and besotted detective Munrab has hired. But really, what kind of creepy busybody goes to the circus and then starts crawling around in the dark through forbidden areas just for kicks? Ballyhoo makes only the most minimal of attempts to explain such behavior in its opening passage: “The circus is a reminder of your own secret irrational desire to steal the spotlight, to defy death, and to bask in the thunder of applause.” That’s one of the most interesting and potential-fraught passages in the game, but Ballyhoo unfortunately makes no more real effort to explore this psychological theme, leaving the protagonist otherwise a largely blank slate. Especially given that the mystery at the heart of the game is quite low-stakes — the kidnapping is so clearly amateurish that Chelsea is hardly likely to suffer any real harm, while other dastardly revelations like the presence of an underground poker game aren’t exactly Godfather material — you’re left wondering why you’re here at all, why you’re sticking your nose into all this business that has nothing to do with you. In short, why do you care about any of this? Don’t you have anything better to be doing?

A similar aimlessness afflicts the puzzle structure. Ballyhoo never does muster that satisfying feeling of really building toward the solution of its central mystery. Instead, it just offers a bunch of situations that are clearly puzzles to be solved, but never gives you a clue why you should be solving them. For instance, you come upon a couple of lions in a locked cage which otherwise contains nothing other than a lion stand used in the lion trainer’s act. You soon find a key to the cage and a bullwhip. You have no use for the lion stand right now, nor for the lions themselves, nor for their cage. There’s obviously a puzzle to be solved here, but why? Well, if you do so and figure out how to deal with the lions, you’ll discover an important clue under the lion stand. But, with no possible way to know it was there, why on earth would any person risk her neck to enter a lion cage for no reason whatsoever? (Presumably the same kind that would creep into a circus’s supply tent…) Elsewhere you come upon an elephant in a tent. Later you have the opportunity to collect a mouse. You can probably imagine what you need to do, but, again, why? Why are you terrorizing this poor animal in its tiny, empty tent? More specifically, how could you anticipate that the elephant will bolt away in the perfect direction to knock down a certain section of fence? This George Mallory approach to design is everywhere in Ballyhoo. While “because it’s there” has been used plenty of times in justifying adventure-game puzzles both before and after Ballyhoo, Infocom by this time was usually much, much better at embedding puzzles within their games’ fictions.

With such an opaque puzzle structure, Ballyhoo becomes a very tough nut to crack; it’s never clear what problems you should be working on at any given time, nor how solving any given puzzle is likely to help you with the rest. It all just feels… random. And many of the individual solutions are really, really obscure, occasionally with a “read Jeff O’Neill’s mind” quality that pushes them past the boundary of fairness. Making things still more difficult are occasional struggles with the parser of the sort we’re just not used to seeing from Infocom by this stage: you can MOVE that moose head on the wall, but don’t try to TURN it. There’s also at least one significant bug that forced me to restore on my recent playthrough (the turnstile inexplicably stopped recognizing my ticket) and a few scattered typos. These sorts of minor fit-and-finish problems are hardly surprising in general, but are surprising to find in an Infocom game of this vintage.

Assuming we give some of Hitchhiker’s dodgier elements a pass in the name of letting Douglas Adams be Douglas Adams, we have to go all the way back to those early days of Zork and Deadline to find an Infocom game with as many basic problems as this one. Ballyhoo isn’t, mind you, a complete reversion to the bad old days of 1982. Even leaving aside its bold new approach to plotting, much in Ballyhoo shows a very progressive sensibility. On at least one occasion when you’re on the verge of locking yourself out of victory, the game steers you to safety, saying that “the image of a burning bridge suddenly pops into your mind.” Yet on others it seems to positively delight in screwing you over. My theory, which is only that, is that Ballyhoo was adversely affected by the chaos inside Infocom as it neared release, that it didn’t get the full benefit of a usually exhaustive testing regime that normally rooted out not only bugs and implementation problems but also exactly the sorts of design issues that I’ve just pointed out. Thankfully, Ballyhoo would prove to be an anomaly; the games that succeeded it would once again evince the level of polish we’ve come to expect. Given that Ballyhoo was also the product of a first-time author, its failings are perhaps the result of a perfect storm of inexperience combined with distraction.

Ballyhoo was not, as you’ve probably guessed, a big seller, failing to break 30,000 copies in lifetime sales. It’s a paradoxical little game that I kind of love on one level but can’t really recommend on another. Certainly there’s much about it to which I really respond. Whether because I’m a melancholy soul at heart or because I just like to play at being one from time to time, I’m a sucker for its sort of ramshackle splendid decay. I’m such a sucker for it, in fact, that I dearly want Ballyhoo to be better than it is, to actually be the sad and beautiful work of interactive fiction that I sense it wants to be. I’ve occasionally overpraised it in the past for just that reason. But we also have to consider how well Ballyhoo works as an adventure game, and in that sense it’s a fairly broken creation. I won’t suggest that you tax yourself too much trying to actually solve it by yourself, but it’s well worth a bit of wandering around just to soak up its delicious melancholy.


Posted on December 22, 2014 in Digital Antiquaria, Interactive Fiction

