
T Plus 4: Bombing Nevada

Trinity

Underground

You're in a narrow underground chamber, illuminated by an open door in the east wall. The walls and ceiling are gouged with deep spiral ruts; they look as if they've been routed out with heavy machinery.

A large cylinder occupies most of the chamber. The maze of cables and pipes surrounding it trails west, into the depths of a tunnel.

>w
Underground

The cables and pipes lining the tunnel's walls look like bloated veins and arteries in the splinter's flickering glow. Deep tunnels bend off to the east and west.

Some careless technician has left a walkie-talkie lying in the dirt.

>get walkie-talkie
Taken.

>turn on walkie-talkie
You turn on the rocker switch.

>z
Time passes.

A tinny voice, half-buried in static, says "Two."

>z
Time passes.

"One."

>z
Time passes.

The walkie-talkie clicks and hisses.

>z
Time passes.

For a brief moment, the tunnel is bathed in a raw white glare.

The most subtly chilling vista in Trinity is found not inside one of its real-world atomic vignettes, but rather in the magical land that serves as the central hub for your explorations. This landscape is dotted with incongruous giant toadstools, each of which, you eventually realize, represents a single atomic explosion.

As your eyes sweep the landscape, you notice more of the giant toadstools. There must be hundreds of them. Some sprout in clusters, others grow in solitude among the trees. Their numbers increase dramatically as your gaze moves westward, until the forest is choked with pale domes.

The scene is a representation of time, following the path of the sun from east to west. The toadstools choking the forest to the west presumably represent the nuclear apocalypse you’ve just escaped. If we subtract those toadstools along with the two somewhere far off to the east that must represent the Hiroshima and Nagasaki blasts, we’re left with those that represent not instances of atomic bombs used in anger, but rather tests. A few of these we know well as historical landmarks in their own right: the first hydrogen bomb; the first Soviet bomb; that original Trinity blast, off far to the southeast with the rising sun, from which the game takes its name and where its climax will play out. Like the bombs used in anger, these don’t interest us today; we’ll give them their due in future articles. What I do want to talk about today is some of the blasts we don’t usually hear so much about. As the landscape would indicate, there have been lots of them. Since the nuclear era began one summer morning in the New Mexico desert in 1945, there has been a verified total of 2119 tests of nuclear bombs. Almost half of that number is attributed to the United States alone. Yes, there have been a lot of bombs.

At the close of World War II, the big question for planners and politicians in the United States was that of who should be given control of the nation’s burgeoning nuclear arsenal. The Manhattan Project had been conducted under the ostensible auspices of the Army Air Force (the Air Force wouldn’t become its own independent service branch until 1947), but in reality had been something of a law unto itself. Now both Army and Navy were eager to lay claim to the bomb. The latter had dismissed the bomb’s prospects during the war years and declined to play any role in the Manhattan Project, but was nevertheless able to wrangle enough control now to be given responsibility for the first post-war tests of the gadgets, to be called Operation Crossroads. The tests’ announced objective was to determine the impact of the atomic bomb on military ships. Accordingly, the Navy assembled for atomic target practice around Bikini Atoll in the Marshall Islands a fleet of surplus American ships and captured German and Japanese vessels that would have been the envy of most other nations. Its 93 vessels included 2 aircraft carriers, 5 battleships, and 4 cruisers. The 167 native residents of Bikini were shipped off to another, much less hospitable island, the first stop in what would prove to be a long odyssey of misery. (Their sad story is best told in Operation Crossroads by Jonathan M. Weisgall.)

From the start, Operation Crossroads had more to do with politics than with engineering or scientific considerations. It was widely hyped as a “test” to see if the very idea of a fighting navy still had any relevance in this new atomic age. More importantly in the minds of its political planners, it would also be a forceful demonstration to the Soviet Union of just what this awesome new American weapon could do. Operation Crossroads was the hottest ticket in town during the summer of 1946. Politicians, bureaucrats, and journalists — everyone who could finagle an invitation — flocked to Bikini to enjoy the spectacle along with good wine and food aboard one of the Navy’s well-appointed host vessels, swelling the number of on-site personnel to as high as 40,000.

Unprotected sailors aboard the German cruiser Prinz Eugen just hours after it was irradiated by an atomic bomb.

The spectators would get somewhat less than they bargained for, many of the sailors considerably more. The first bomb was dropped from a borrowed Army Air Force B-29 because the Navy had no aircraft capable of carrying the gadget. Dropped on a hazy, humid morning from a high altitude at which the B-29 was notoriously inaccurate even under the best of conditions, the bomb missed the center of the doomed fleet by some 700 yards. Only two uninteresting attack transports sank instantly in anything like the expected spectacular fashion, and only five ships sank in total, the largest of them a cruiser. As the journalists filed their reams of disappointed copy and the Navy’s leadership breathed a sigh of relief, some 5000 often shirtless sailors were dispatched to board the various vessels inside the hot zone to analyze their damage; as a safety precaution, they first scrubbed the ships down with water, soap, and lye to get rid of any lingering radiation. The operation then proceeded with the second bomb, an underwater blast that proved somewhat more satisfying, ripping apart the big battleship Arkansas and the aircraft carrier Saratoga amongst other vessels and tossing their pieces high into the air.

The second Operation Crossroads shot, July 25, 1946.

Operation Crossroads was emblematic of a Navy leadership that had yet to get their collective heads around just what a paradigm-annihilating device the atomic bomb actually was. Their insistence on dropping it on warships, as if the future was just going to bring more Battles of Midway with somewhat bigger explosions, shows that they still thought of the atomic bomb as essentially just a more powerful version of the bombs they were used to, a fundamentally tactical rather than strategic device. Their complete failure to take seriously the dangers of radioactive fallout, meanwhile, may be the reason that the sailors who took part in Operation Crossroads suffered an average life-span reduction of three months compared to others in their peer group. These were early days yet in atomic physics, but their state of denial is nevertheless difficult to understand. If the horrific photographs and films out of Hiroshima and Nagasaki — some of which apparently are shocking enough to still be classified — hadn’t been warning enough, there was always the case of Los Alamos physicist Louis Slotin. Less than six weeks before Operation Crossroads began, Slotin had accidentally started a chain reaction while experimenting with the atomic core of the same type of bomb used in the tests. He stopped the reaction through quick thinking and bravery, but not before absorbing a lethal dose of radiation. His slow, agonizing death — the second such to be experienced by a Los Alamos physicist — was meticulously filmed and documented, then made available to everyone working with atomic weapons. 
And yet the Navy chortled about the failure of the atomic bomb to do as much damage as expected whilst cheerfully sending in the boys to do some cleanup, ignoring both the slowly dying goats and other animals they had left aboard the various ships and the assessment of the Bulletin of Atomic Scientists of the likely fate of any individual ship in the target fleet: “The crew would be killed by the deadly burst of radiation from the bomb, and only a ghost ship would remain, floating unattended in the vast waters of the ocean.”

Just as President Eisenhower would take space exploration out from under the thumb of the military a decade later with the creation of NASA, President Truman did an end-run around the military’s conventional thinking about the atomic bomb on January 1, 1947, when the new, ostensibly civilian Atomic Energy Commission took over all responsibility for the development, testing, and deployment of the nation’s atomic stockpile. The Atomic Energy Commission would continue to conduct a steady trickle of tests in the remoter reaches of the Pacific for many years to come, albeit none with quite the bizarre spectator-sport qualities of Operation Crossroads. But the twin shocks of the first Soviet test of an atomic bomb on August 29, 1949, and the beginning of the Korean War in 1950, which came equipped with a raging debate about whether, how, and when the United States should again use its nuclear arsenal in anger, led weapons developers to agitate for a more local test site where they could regularly and easily set off smaller weapons than the blockbusters that tended to be earmarked for the Pacific. There were, they argued, plenty of open spaces in the American Southwest that would suit such a purpose perfectly well. On December 18, 1950, Truman therefore approved the allocation for this purpose of a 680-square-mile area inside the vast Nellis Air Force Gunnery and Bombing Range in the Nevada desert some 65 miles northwest of Las Vegas. The first test there, marking the first atomic bomb to be exploded on American soil since the original Trinity device, took place astonishingly soon thereafter, on January 27, 1951. By the end of the year sleeping quarters, mess halls, and laboratories had been built, creating a functioning, happy little community dedicated to making ever better bombs. The saga of the Nevada Test Site had begun. In the end no fewer than 928 of the 1032 nuclear tests ever conducted by the United States would take place right here.

One of the many test shots seen from the Las Vegas Strip during the 1950s.

The strangest years of this very strange enterprise were the earliest. With money plentiful and the need to keep ahead of the Soviets perceived as urgent, bombs were exploded at quite a clip — twelve during the first year alone. At first they were mostly dropped from airplanes, later more commonly hung from balloons or mounted atop tall temporary towers. The testing regime was, as test-site geophysicist Wendell Weart would later put it, very “free-form.” If someone at one of the nation’s dueling atomic-weapons laboratories of Lawrence Livermore and Los Alamos determined that he needed a “shot” to prove a point or answer a question, he generally got it in pretty short order. Whatever else the testing accomplished, it was also a hell of a lot of fun. “I guess little boys like fireworks and firecrackers,” Weart admits, “and this was the biggest set of fireworks you could ever hope to see.” Las Vegas residents grew accustomed to the surreal sight of mushroom clouds blooming over their cityscape, like scenes from one of the B-grade atomic-themed monster movies that filled the theaters of the era. When the bombs went off at night, they sometimes made enough light to read a newspaper by.

This era of free-form atmospheric testing at the Nevada Test Site coincided with the era of atomic mania in the United States at large, when nuclear energy of all stripes was considered the key to the future and the square-jawed scientists and engineers who worked on it veritable heroes. The most enduring marker of this era today is also one of the first. In 1946, not one but two French designers introduced risqué new women’s bathing suits that were smaller and more revealing than anything that had come before. Jacques Heim called his the “atome,” or atom, “the world’s smallest bathing suit.” Louis Réard named his the bikini after the recently concluded Operation Crossroads tests at Bikini Atoll. “Like the bomb,” he declared, “the bikini is small and devastating.” It was Réard’s chosen name that stuck. In addition to explosive swimwear, by the mid-1950s you could get a “Lone Ranger atomic-bomb ring” by sending in 15 cents plus a Kix cereal proof of purchase; buy a pair of atomic-bomb salt and pepper shakers; buy an “Atomic Disintegrator” cap gun. Trinity‘s accompanying comic book with its breathless “Atomic Facts: Stranger than Fiction!” and its hyperactive patriotism is a dead ringer for those times.

Showgirl Lee Merlin, Miss Atomic Bomb 1958.

Said times being what they were, Las Vegas denizens, far from being disturbed by the bombs going off so close by, embraced them with all of their usual kitschy enthusiasm. The test site helpfully provided an annual calendar of scheduled tests for civilians so they could make plans to come out and enjoy the shows. For children, it was a special treat to drive up to one of the best viewpoints on Mount Charleston early in the morning on the day of a shot, like an even better version of the Fourth of July; the budding connoisseurs cataloged and ranked the shots and compared notes with their friends in the schoolyard. Many adults, being connoisseurs of another stripe, preferred the “Miss Atomic Bomb” beauty pageants and revues that were all the rage along the Strip.

Showgirl Sally McCloskey does an “atomic ballet” in front of a shot.

The official government stance, at the time and to a large extent even today, is that the radioactive fallout from these explosions traveled little distance if at all and was in any case minor enough to present few to no health or environmental concerns. Nevertheless, ranchers whose sheep grazed in the vicinity of the test site saw their flocks begin to sicken and die very soon after the test shots began. They mounted a lawsuit, which was dismissed under somewhat questionable circumstances in 1956; the sheep, claimed the court, had died of “malnutrition” or some other unidentified sickness. That judgment, almost all of the transcripts from which have since been lost, was later overturned on the rather astonishing basis of outright “fraud on the court” by the government’s defense team. That judgment was in its turn vacated on appeal in 1985, more than thirty years after the events in question. Virtually all questions about the so-called “Downwinders” who were affected — or believe they were affected — by fallout from the test site seem to end up in a similarly frustrating tangle.

What does seem fairly clear amid the bureaucratic babble, from circumstantial evidence if nothing else, is that the government even in the 1950s had more awareness of and concerns about fallout from the site than they owned up to publicly. Radioactive debris from those very first tests in early 1951 was detected, according to test-site meteorologist Philip Wymer Allen, going “up over Utah and across the Midwest and Illinois, not too far south of Chicago, and out across the Atlantic Coast and was still easily measured as the cloud passed north of Bermuda. We didn’t track it any further than that.” Already in 1952 physical chemist Willard Libby, inventor of radiocarbon dating and later a member of the Atomic Energy Commission, was expressing concerns about radioactive cesium escaping the site and being absorbed into the bones of people, especially children. A potential result could be leukemia. Another, arguably even graver, concern was radioiodine particles, which could be carried a surprising distance downwind before settling to earth, potentially on the forage preferred by sheep, goats, and cows. Many people in rural communities, especially in those days, drank unprocessed milk straight from the cow, as it were. If enough milk containing radioiodine is ingested, it can lead to thyroid cancer. Children were, once again, both particularly big drinkers of milk and particularly prone to the effects of the radioiodine that might be within it. When environmental chemist Delbert Barth was hired in the 1960s to conduct studies of radioiodine dispersion patterns at the site, he was asked to also make historical projections for the atmospheric shots of the 1950s — a request that, at least on its surface, seems rather odd if everyone truly believed there was absolutely nothing to fear. Similarly odd seems a policy which went into effect very early: not to conduct shots if the winds were blowing toward Las Vegas.

The radioactive exposure — or lack thereof — of the Downwinders remains a major political issue inside Nevada and also Utah, which many claim also received its fair share of fallout. Most people who were associated with the site say, predictably enough, that the Downwinders are at best misguided and at worst would-be freeloaders. Studies have not established a clear causal link between incidences of cancer and proximity to the Nevada Test Site, although many, including Barth, have expressed concerns about the methodologies they’ve employed. What we’re left with, then, are lots of heartbreaking stories which may have been caused by the activities at the site or may represent the simple hand of fate. (For a particularly sad story, which I won’t go into here because I don’t want to sound exploitative, see this interview with Zenna Mae and Eugene Bridges.)

The first era of the Nevada Test Site came to an abrupt end in November of 1958, when the United States and the Soviet Union entered into a non-binding mutual moratorium on all sorts of nuclear testing. For almost three years, the bombs fell silent at the test site and at its Soviet equivalent near Semipalatinsk in Kazakhstan. But then, on September 1, 1961, the Soviets suddenly started testing again, prompting the Nevada Test Site to go back into action as well. Still, the public was growing increasingly concerned over what was starting to look like the reckless practice of atmospheric testing. While Las Vegas had continued to party hearty, even before the moratorium the doughty farmers and ranchers working still closer to the site had, as Lawrence Livermore physicist Clifford Olsen rather dismissively puts it, “started to grumble a bit” about the effect they believed the fallout was having on their animals and crops and possibly their own bodies and those of their children. And now an international environmentalist movement was beginning to arise in response to Rachel Carson’s Silent Spring. In one of his last major acts before his assassination, President Kennedy in October of 1963 signed along with Soviet First Secretary Khrushchev the Partial Nuclear Test Ban Treaty that required all future nuclear tests to take place underground.

But never fear, the good times were hardly over at the Nevada Test Site. The scientists and engineers there had been experimenting with underground explosions for some years already in anticipation of this day that the more politically aware among them had begun to see as inevitable. Thus they were more than prepared to continue full-speed-ahead with a new regime of underground testing. The number of shots actually increased considerably during the 1960s, often clipping along at a steady average of one per week or more. Las Vegas, meanwhile, was still not allowed to forget about the presence of the test site. Residents grew accustomed to tremors that cracked plaster and made high-rises sway disconcertingly, phenomena that came to be known as “seismic fallout.” As the political mood shifted over the course of the decade, the number of complaints grew steadily, especially after a couple of big shots of well over 1 megaton in 1968 that caused serious structural damage to a number of buildings in Las Vegas. One of the most persistent and vociferous of the complainers was the eccentric billionaire recluse Howard Hughes, who was living at the time on the top two floors of the Desert Inn hotel. Hughes marshaled lots of money, employees, and political connections to his cause during the late 1960s, but was never able to stop or even slow the testing.

As for the environmental impact of this new breed of underground tests, the news is mixed. While neither is exactly ideal, it’s obviously preferable from an environmental standpoint to be exploding atomic bombs underground rather than in the open air. A whole new applied sub-science of geophysics, the discipline of nuclear “containment,” evolved out of efforts to, well, contain the explosions — to keep any radioactive material at all from “venting” to the surface during an explosion or “seeping” to the surface during the hours, months, and years afterward. And yet the attitudes of the folks working on the shots can still sound shockingly cavalier today. About 30 percent of the underground tests conducted during the 1960s leaked radioactivity to the surface to one degree or another. Those working at the site considered this figure acceptable. Virtually everyone present there during the 1960s makes note of the positive, non-bureaucratic, “can-do” attitude that still persisted into this new era of underground testing. Linda Smith, an administrator at the site, characterizes the attitude thus: “There is such a strong bias to get it done that overrides everything. Is there any profound discussion of should we or shouldn’t we? Is this good for the country? Is it not? There’s no question. You are there to get it done.” Clifford Olsen says, “We were all pretty much sure we were doing the right thing.”

What to make of this lack of introspection? Whatever else we say about it, we shouldn’t condemn the people of the Nevada Test Site too harshly for it. There were heaps of brilliant minds among them, but their backgrounds were very different from those of the people who had worked on the Manhattan Project, many of whom had thought and agonized at length about the nature of the work they were doing and the unimaginable power they were unleashing on the world. The men and few women of the Nevada Test Site, by contrast, had mostly come of age during or immediately after World War II, and had been raised in the very bosom of the burgeoning military-industrial complex. Indeed, most had had their education funded by military or industrial backers for the singular purpose of designing and operating nuclear weapons. This set them apart from their predecessors, who before the Manhattan Project and to a large degree after it — many among that first generation of bomb-makers considered their work in this area essentially done once the first few bombs had been exploded — tended to focus more on “pure” science than on its practical application. A few Brits aside, the Nevada Test Site people were monolithically American; many on the Manhattan Project came from overseas, including lots of refugees from occupied Europe. The Nevada Test Site people were politically conservative, in favor of law and order and strong defense (how could they not be given the nature of their work?); the Manhattan Project people were a much more politically heterogeneous group, with a leader in Robert Oppenheimer who had worked extensively for communist causes. Someone with a background like his would never have been allowed past the front gate of the Nevada Test Site.

Whatever else it was, the Nevada Test Site was just a great place to work. Regarded as they were as the nation’s main bulwark against the Soviet Union, the atomic scientists and all of those who worked with and supported them generally got whatever they asked for. Even the chow was first-rate: at the cafeteria, a dollar would get you all the steaks — good steaks — that you could eat. When all the long hours spent planning and calculating got to be too much, you could always take in a movie or go bowling: a little self-contained all-American town called Mercury grew up with the test site there in the middle of the desert. Its population peaked at about 10,000 during the 1960s, by which time it included in addition to a movie theater and bowling alley a post office, schools, churches, a variety of restaurants, a library, a swimming hall, and hotels — including one named, inevitably, the Atomic Motel. Or you could always take a walk just outside of town amidst the splendid, haunting desolation of the Nevada desert. And for those not satisfied with these small-town pleasures, the neon of Las Vegas beckoned just an hour or so down the highway.

But just as importantly, the work itself was deeply satisfying. After the slide rules and the geological charts were put away, there still remained some of that old childlike pleasure in watching things go boom. Wendell Weart: “I would go back in a tunnel and see what happened to these massive structures that we had put in there, and to see how it manhandled them and just wadded them up into balls. That was impressive.” Nor was the Nevada Test Site entirely an exercise in nuclear nihilism. While weapons development remained always the primary focus, most working there believed deeply in the peaceful potential for nuclear energy — even for nuclear explosions. One of the most extended and extensive test series conducted at the site was known as Operation Plowshare, a reference to “beating swords into plowshares” from the Book of Isaiah. Operation Plowshare eventually encompassed 27 separate explosions, stretching from 1961 to 1973. Its major focus was on nuclear explosions as means for carrying out grand earth-moving and digging operations, for the creation of trenches and canals among other things. (Such ideas formed the basis of the proposal Edward Teller bandied about during the Panama Canal controversy of the late 1970s to just dig another canal using hydrogen bombs.) Serious plans were mooted at one point to dig a whole new harbor at Cape Thompson in Alaska, more as a demonstration of the awesome potential of hydrogen bombs for such purposes than out of any practical necessity. Thankfully for the delicate oceanic ecosystem thereabouts, cooler heads prevailed in the end.

So, the people who worked at the site weren’t bad people. They were in fact almost uniformly good friends, good colleagues, good workers who were at the absolute tops of their various fields. Almost any one of them would have made a great, helpful neighbor. Nor, as Operation Plowshare and other projects attest, were they bereft of their own certain brand of idealism. If they sound heartlessly dismissive of the Downwinders’ claims and needlessly contemptuous of environmentalists who fret over the damage their work did and may still be doing, well, it would be hard for any of us to even consider the notion that the work to which we dedicated our lives — work which we thoroughly enjoyed, which made us feel good about ourselves, around which many of our happiest memories revolve — was misguided or downright foolish or may have even killed children, for God’s sake. I tend to see the people who worked at the site as embodying the best and the worst qualities of Americans in general, charging forward with optimism and industry and that great American can-do spirit — but perhaps not always thinking enough about just where they’re charging to.

A plume of radioactive debris vents from the Baneberry shot.

The golden age of free-and-easy atomic testing at the Nevada Test Site ended at last on December 18, 1970. That was the day of Baneberry, a routine underground shot of just 10 kilotons. However, due to what the geophysicists involved claim was a perfect storm of factors, its containment model failed comprehensively. A huge cloud of highly radioactive particles burst to the surface and was blown directly over a mining encampment that was preparing the hole for another test nearby. By now the nature of radioactivity and its dangers was much better appreciated than it had been during the time of Operation Crossroads. All of the people at the encampment were put through extended, extensive decontamination procedures. Nevertheless, two heretofore healthy young men, an electrician and a security guard, died of leukemia within four years of the event. Their widows sued the government, resulting in another seemingly endless series of trials, feints, and legal maneuvers, culminating in yet another frustrating non-resolution in 1996: the government was found negligent and the plaintiffs awarded damages, but the deaths of the two men were paradoxically ruled not to have been a result of their radiation exposure. As many in the Downwinder community darkly noted at the time, a full admission of guilt in this case would have left the government open to a whole series of new lawsuits. Thus, they claimed, this strange splitting of the difference.

The more immediate consequence of Baneberry was a six-month moratorium on atomic testing at the Nevada Test Site while the accident was investigated and procedures were overhauled. When testing resumed, it did so in a much more controlled way, with containment calculations in particular required to go through an extended process of peer reviews and committee approvals. The Atomic Energy Commission also began for the first time to put pressure on the scientists and engineers to minimize the number of tests conducted by pooling resources and finding ways to get all the data they could out of each individual shot. The result was a slowdown from that high during the 1960s of about one shot per week to perhaps one or two per month. Old-timers grumbled about red tape and how the can-do spirit of the 1950s and 1960s had been lost, but, perhaps tellingly, there were no more Baneberrys. Of the roughly 350 shots at the Nevada Test Site after Baneberry, only 4 showed any detectable radiation leakage at all.

The site continued to operate right through the balance of the Cold War. The last bomb to be exploded there was also the last exploded to date by the United States: an anticlimactic little 5-kiloton shot on September 23, 1992. By this time, anti-nuclear activists had made the Nevada Test Site one of their major targets, and were a constant headache for everyone who worked there. Included among the ranks of those arrested for trespassing and disruption during the test site’s twilight years are Kris Kristofferson, Martin Sheen, Robert Blake, and Carl Sagan. Needless to say, the mood of the country and the public’s attitude toward nuclear weapons had changed considerably since those rah-rah days of atomic cap guns.

A tunnel waits in readiness, just in case.

Since the mid-1990s the United States, along with Russia and the other established nuclear powers, has observed a long-lasting if non-binding tacit moratorium on all types of nuclear testing (a moratorium which unfortunately hasn’t been observed by new members of the nuclear club India, Pakistan, and North Korea). Stories of the days when mushroom clouds loomed over the Las Vegas Strip and the ground shook with the force of nuclear detonations are now something for long-time Nevada residents to share with their children or grandchildren. With its reason for existence in abeyance, the Nevada Test Site is in a state of largely deserted suspended animation today, Mercury a ghost town inhabited by only a few caretakers and esoteric researchers. One hopes that if Mercury should ever start to buzz with family life and commerce again it’s because someone has found some other, safer purpose for the desert landscape that surrounds it. In the meantime, the tunnels are still kept in readiness, just in case someone decides it’s time to start setting off the bombs again.

(The definitive resource on the history of the Nevada Test Site must be, now and likely forevermore, the University of Nevada at Las Vegas’s amazing Nevada Test Site Oral History Project. I could barely scratch the surface of the hundreds of lengthy interviews there when researching this article. And thanks to Duncan Stevens for his recommendation of Operation Crossroads by Jonathan M. Weisgall. I highly recommend the documentary The Atomic Cafe as a portrait of the era of atomic kitsch.)


T Plus 5: Bombs in Space

Trinity

Earth Orbit, on a satellite

The satellite you're riding is about twenty feet long, and shaped like a beer can.

>z
Time passes.

A red flash draws your eyes to the ground below, where the contrail of a missile is climbing into the stratosphere.

>z
Time passes.

The maneuvering thrusters on the satellite fire, turning the nose until it faces the ascending missile.

>z
Time passes.

The satellite erupts in a savage glare that lights up the ground below. A beam of violet radiation flashes downward, obliterating the distant missile. Unfortunately, you have little time to admire this triumph of engineering before the satellite's blast incinerates you.

In a mere 256 K of text adventure, Trinity aims to chronicle at least fifty years of humanity’s relationship with the atomic bomb, encapsulated into seven vignettes. Two of these, the one dealing with the long-dreaded full-on nuclear war that begins with you on vacation in London’s Kensington Gardens and the one you see above involving a functioning version of Ronald Reagan’s “Star Wars” Strategic Defense Initiative (a proposition that all by itself justifies Trinity’s “Fantasy” genre tag, as we’ll soon see), are actually speculative rather than historical, taking place at some point in the near future. The satirical comic that accompanies the game also reserves space for Reagan and his dream. It’s a bold choice to put Reagan himself in there, undisguised by pseudonymous machinations like A Mind Forever Voyaging’s “Richard Ryder” — even a brave one for a company that was hardly in a position to alienate potential players. Trinity, you see, was released at the absolute zenith of Reagan’s popularity. While the comic and the game it accompanies hardly add up to a scathing sustained indictment a la A Mind Forever Voyaging, they do cast him as yet one more Cold Warrior in a conservative blue suit leading the world further along the garden path to the unthinkable. Today I’d like to look at this “orbiting ‘umbrella’ of high technology” that Trinity postulates — correctly — isn’t really going to help us much at all when the missiles start to fly. Along the way we’ll get a chance to explore some of the underpinnings of the nuclear standoff and also the way it came to an anticlimactically sudden end, so thankfully at odds with Trinity’s more dramatic predictions of the supposed inevitable.

In November of 1985, while Trinity was in development, Ronald Reagan and the new Soviet General Secretary Mikhail Gorbachev met for the first American/Soviet summit of Reagan’s Presidency. The fact that the summit took place at all was almost entirely down to the efforts of Gorbachev, who quite skillfully made it politically impossible for Reagan not to attend. It marked the first time Reagan had actually talked with his Soviet counterpart face to face in his almost five years as President. The two men, as contemporary press reports would have it, “took the measure of each other” and largely liked what they saw, but came to no agreements. The second summit, held in Reykjavik, Iceland, in October of the following year, came to within a hair’s breadth of a major deal that would have started the superpowers down the road to the complete elimination of nuclear armaments and effectively marked the beginning of the end of the Cold War. The only stumbling block was the Strategic Defense Initiative. Gorbachev was adamant that Reagan give it up, or at least limit it to “laboratory testing”; Reagan just as adamantly refused. He repeatedly expressed to both Gorbachev and the press his bafflement at this alleged intransigence. SDI, he said, was to be a technology of defense, a technology for peace. His favorite metaphor was SDI as a nuclear “gas mask.” The major powers of the world had all banned poison gas by treaty after World War I, and, rather extraordinarily, even kept to that bargain through all the other horrors of World War II. Still, no one had thrown away their gas-mask stockpiles, and the knowledge that other countries still possessed them had just possibly helped to keep everyone honest. SDI, Reagan said, could serve the same purpose in the realm of nuclear weapons. 
He even made an extraordinary offer: the United States would be willing to give SDI to the Soviets “at cost” — whatever that meant — as soon as it was ready, as long as the Soviets would also be willing to share any fruits of their own (largely nonexistent) research. That way everyone could have nuclear gas masks! How could anyone who genuinely hoped and planned not to use nuclear weapons anyway possibly object to terms like that?

Gorbachev had a different view of the matter. He saw SDI as an inherently destabilizing force that would effectively jettison not one but two of the tacit agreements of the Cold War that had so far prevented a nuclear apocalypse. Would any responsible leader easily accept such an engine of chaos in return for a vague promise to “share” the technology? Would Reagan? It’s very difficult to know what was behind Reagan’s seeming naivete. Certainly his advisers knew that his folksy analogies hardly began to address Gorbachev’s very real and very reasonable concerns. If the shoe had been on the other foot, they would have had the same reaction. Secretary of Defense Caspar Weinberger had demonstrated that in December of 1983, when he had said, “I can’t imagine a more destabilizing factor for the world than if the Soviets should acquire a thoroughly reliable defense against these missiles before we did.” As for Reagan himself, who knows? Your opinion on the matter depends on how you take this famous but enigmatic man whom conservatives have always found as easy to deify as liberals to demonize. Was he a bold visionary who saved his country from itself, or a Machiavellian schemer who used a genial persona to institute an uglier, more heartless version of America? Or was he just a clueless if good-natured and very, very lucky bumbler? Or was he still the experienced actor, out there hitting his marks and selling the policies of his handlers like he had once shilled for General Electric? Regardless, let’s try to do more justice to Gorbachev’s concerns about SDI than Reagan did at their summits.

It’s kind of amazing that the Cold War never led to weapons in space. It certainly didn’t have to be that way. Histories today note what a shock it was to American pride and confidence when the Soviet Union became the first nation to successfully launch a satellite on October 4, 1957. That’s true enough, but a glance at the newspapers from the time also reveals less abstract fears. Now that the Soviets had satellites, people expected them to weaponize them, to use them to start dropping atomic bombs on their heads from space. One rumor, which amazingly turned out to have a basis in fact, claimed the Soviets planned to nuke the Moon, leading to speculation on what would happen if their missile was to miss the surface, boomerang around the Moon, and come back to Earth — talk about being hoisted by one’s own petard! The United States’s response to the Soviets’ satellite was par for the course during the Cold War: panicked, often ill-considered activity in the name of not falling behind. Initial responsibility for space was given to the military. The Navy and the Air Force, who often seemed to distrust one another more than either did the Soviets, promptly started squabbling over who owned this new seascape or skyscape, which depending on how you looked at it and how you picked your metaphors could reasonably be assumed to belong to either. While the Naval Research Laboratory struggled to get the United States’s first satellite into space, the more ambitious dreamers at the Air Force Special Weapons Center made their own secret plans to nuke the Moon as a show of force and mulled the construction of a manned secret spy base there.

But then, on July 29, 1958, President Eisenhower signed the bill that would transform the tiny National Advisory Committee for Aeronautics into the soon-to-be massive National Aeronautics and Space Administration — NASA. While NASA’s charter duly charged the new agency with making any “discoveries” available for “national defense” and with “the preservation of the role of the United States as a leader in aeronautical and space science and technology,” those goals came only after more high-toned abstractions like “the expansion of human knowledge” and the use of space for “peaceful and scientific purposes.” NASA was something of an early propaganda coup at a time when very little seemed to be going right with astronautics in the United States. The Soviet leadership had little choice but to accept the idea, publicly at least, of space exploration as a fundamentally peaceful endeavor. In 1967 the United States and the Soviet Union became signatories along with many other nations to the Outer Space Treaty that enshrined the peaceful status quo into international law. By way of compensation, the first operational ICBMs had started to come online by the end of the 1950s, giving both superpowers a way of dealing impersonal death from the stratosphere without having to rely on wonky satellites.

This is not to say that the Cold War never made it into space in any form. Far from it. Apollo, that grandest adventure of the twentieth century, would never have happened without the impetus of geopolitics. The Apollo 11 astronauts may have left a message on the Moon saying they had “come in peace for all mankind,” may have even believed it at some level, but that was hardly the whole story. President Kennedy, the architect of it all, had no illusions about the real purpose of his Moon pronouncement. “Everything that we do ought to be really tied into getting onto the Moon ahead of the Russians,” he told NASA Administrator James Webb in 1962. “Otherwise we shouldn’t be spending this kind of money because I’m not that interested in space.” The Moon Race, like war, was diplomacy through other means. As such, the division between military and civilian was not always all that clear. For instance, the first Americans to fly into orbit, like the first Soviets, did so mounted atop repurposed ICBMs.

Indeed, neither the American nor the Soviet military had any interest in leaving space entirely to the civilians. If one of the goals of NASA’s formation had been to eliminate duplications of effort, it didn’t entirely succeed. The Air Force in particular proved very reluctant to give up on their own manned space efforts, developing during the 1960s the X-15 rocket plane that Neil Armstrong among others flew to the edge of space, the cancelled Dyna-Soar space plane, and even a manned space station that also never got off the drawing board. Planners in both the United States and the Soviet Union seemed to treat the 1967 Outer Space Treaty as almost a temporary accord, waiting for the other shoe to drop and for the militarization of space to begin in earnest. I’ve already described in an earlier article how, once the Moon Race was over, NASA was forced to make an unholy alliance with the Air Force to build the space shuttle, whose very flight profile was designed to allow it to avoid space-based weaponry that didn’t yet actually exist.

Yet the most immediate and far-reaching military application of space proved to be reconnaissance satellites. Well before the 1960s were out these orbiting spies had become vital parts of the intelligence apparatus of both the United States and the Soviet Union, as well as vital tools for the detection of ICBM launches by the other side — yet another component of the ever-evolving balance of terror. Still, restrained by treaty, habit, and concern over what it might make the other guys do, neither of the superpowers ever progressed to the logical step of trying to shoot down those satellites that were spying on their countries. If you had told people in 1957 that there would still be effectively no weapons in space almost thirty years later, that there would never have been anything even remotely resembling a battle in space, I think they would have been quite surprised.

But now SDI had come along and, at least in the Soviets’ view, threatened to undermine that tradition. They need only take at face value early reports of SDI’s potential implementations, which were all over the American popular media by the time of Reagan’s 1984 reelection campaign, to have ample grounds for concern. One early plan, proposed in apparent earnest by a committee who may have seen The Battle of Britain (or Star Wars) a few too many times, would have the United States and its allies protected by squadrons of orbiting manned fighter planes, who would rocket to the rescue to shoot down encroaching ICBMs, their daring pilots presumably wearing dashing scarves and using phrases like “Tally ho!” A more grounded plan, relatively speaking, was the one for hundreds of “orbiting battle stations” equipped with particle-beam weapons or missiles of their own — hey, whatever works — to pick off the ICBMs. Of course, as soon as these gadgets came into being the Soviets would have to develop gadgets of their own to try to take them out. Thus a precious accord would be shattered forever. To the Soviets, SDI felt like a betrayal, a breaking of a sacred trust that had so far kept people and satellites in space from having to shoot at each other and in doing so had just possibly prevented the development of a new generation of horrific weaponry.

And yet this was if anything the more modest of the two outrages they saw being inflicted on the world by SDI. The biggest problem was that it could be both a symptom and a cause of the ending of the old MAD doctrine — Mutually Assured Destruction — that had been the guiding principle of both superpowers for over twenty years and that had prevented them from blowing one another up along with the rest of the world. On its surface, the MAD formulation is simplicity itself. I have enough nuclear weapons to destroy your country — or at least to do unacceptable damage to it — and a window of time to launch them at you between the time I realize that you’ve launched yours at me and the time that yours actually hit me. Further, neither of us has the ability to stop the missiles of the other — at least, not enough of them. Therefore we’d best find some way to get along and not shoot missiles at each other. One comparison, so favored by Reagan that he drove Gorbachev crazy by using it over and over again at each of their summits, is that of two movie mobsters with cocked and loaded pistols pointed at each other’s heads.

That well-intentioned comparison is also a rather facile one; the difference between two pistols and two nuclear arsenals is one of almost unfathomable degree. Many of us had MAD, that most fundamental doctrine of the Cold War, ingrained in us as schoolchildren to such a degree that it might be hard for us to really think about its horribleness anymore. Nevertheless, I’d like for us to try to do so now. Let’s think in particular about its basic psychological prerequisites. In order for the threat of nuclear annihilation to be an effective deterrent, in order for it never to be carried out, it must paradoxically be a real threat, one which absolutely, unquestionably would be carried out if the order was given. If the other side was ever to suspect that we were not willing to destroy them, the deterrent would evaporate. So, we must create an entire military superstructure, a veritable subculture, of many thousands of people all willing to unquestioningly annihilate tens or hundreds of millions of people. Indeed, said annihilation is the entire purpose of their professional existence. They sit in their missile silos or in their ready rooms or cruise the world in their submarines waiting for the order to push that button or turn that key that will quite literally end existence as they know it, insulated from the incalculable suffering that action will cause inside the very same sorts of “clean, carpeted, warmed, and well-lighted offices” that Reagan once described as the domain of the Soviet Union’s totalitarian leadership alone. If the rise of this sort of antiseptic killing is the tragedy of the twentieth century, the doctrine of MAD represents it taken to its well-nigh incomprehensible extreme.

MAD, requiring as it did people to be always ready and able to carry out genocide so that they would not have to carry out genocide, struck a perilous psychological balance. Things had the potential to go sideways when one of these actors in what most people hoped would be Waiting for Godot started to get a little bit too ready and able — in short, when someone started to believe that he could win. See for example General Curtis LeMay, head of the Strategic Air Command from its inception until 1965 and the inspiration for Dr. Strangelove’s unhinged General Jack D. Ripper. LeMay believed to his dying day that the United States had “lost” the Cuban Missile Crisis because President Kennedy had squandered his chance to finally just attack the Soviet Union and be done with it; talked of the killing of 100 million human beings as a worthwhile trade-off for the decapitation of the Soviet leadership; openly campaigned for and sought ways to covertly acquire the metaphorical keys to the nuclear arsenal, to be used solely at his own dubious discretion. “If I see that the Russians are amassing their planes for an attack, I’m going to knock the shit out of them before they take off the ground,” he once told a civil-defense committee. When told that such an action would represent insubordination to the point of treason, he replied, “I don’t care. It’s my policy. That’s what I’m going to do.” Tellingly, Dr. Strangelove itself was originally envisioned as a realistic thriller. The film descended into black comedy only when Stanley Kubrick started his research and discovered that so much of the reality was, well, blackly comic. Much in Dr. Strangelove that moviegoers of 1964 took as satire was in fact plain truth.

If the belief by a single individual that a nuclear war can be won is dangerous, an institutionalized version of that belief might just be the most dangerous thing in the world. And here we get to the heart of the Soviets’ almost visceral aversion to SDI, for it seemed to them and many others a product of just such a belief.

During the mid-1970s, when détente was still the watchword of the day, a group of Washington old-timers and newly arrived whiz kids formed something with the Orwellian name of The Committee on the Present Danger. Its leading light was one Paul Nitze. Though his is a name few Americans then or now are likely to recognize, Nitze had been a Washington insider since the 1940s and would remain a leading voice in Cold War policy for literally the entire duration of the Cold War. He and his colleagues, many of them part of a new generation of so-called “neoconservative” ideologues, claimed that détente was a sham, that “the Soviets do not agree with the Americans that nuclear war is unthinkable and unwinnable and that the only objective of strategic doctrine must be mutual deterrence.” On the contrary, they were preparing for “war-fighting, war-surviving, and war-winning.” Their means for accomplishing the latter two objectives would be an elaborate civil-defense program that was supposedly so effective as to reduce their casualties in an all-out nuclear exchange to about 10 percent of what the United States could expect. The Committee offered little or no proof for these assertions and many others like them. Many simply assumed that the well-connected Nitze must have access to secret intelligence sources which he couldn’t name. If so, they were secret indeed. When the CIA, alarmed by claims of Soviet preparedness in the Committee’s reports that were completely new to them, instituted a two-year investigation to get to the bottom of it all, they couldn’t find any evidence whatsoever of any unusual civil-defense programs, much less any secret plans to start and win a nuclear war. It appears that Nitze and his colleagues exaggerated wildly and, when even that wouldn’t serve their ends, just made stuff up. (This pattern of “fixing the intelligence” would remain with Committee veterans for decades, leading most notably to the Iraq invasion of 2003.)

Throughout the Carter administration the Committee lobbied anyone who would listen, using the same sort of paranoid circular logic that had led to the nuclear-arms race in the first place. The Soviets, they said, have secretly abandoned the MAD strategy and embarked on a nuclear-war-winning strategy in its place. Therefore we must do likewise. There could be no American counterpart to the magical Soviet civil-defense measures that could somehow protect 90 percent of their population from the blasts of nuclear weapons and the long years of radioactive fall-out that would follow. This was because civil defense was “unattractive” to an “open society” (“unattractiveness” being a strangely weak justification for not doing something in the face of what the Committee claimed was an immediate existential threat, but so be it). One thing the United States could and must do in response was to engage in a huge nuclear- and conventional-arms buildup. That way it could be sure to hammer the Soviets inside their impregnable tunnels — or wherever it was they would all be going — just as hard as possible. But in addition, the United States must come up with a defense of its own.

Although Carter engaged in a major military buildup in his own right, his was nowhere near big enough in the Committee’s eyes. But then came the 1980 election of Ronald Reagan. Reagan took all of the Committee’s positions to heart and, indeed, took most of its most prominent members into his administration. Their new approach to geopolitical strategy was immediately apparent, and immediately destabilizing. Their endless military feints and probes and aggressive rhetoric seemed almost to have the intention of starting a war with the Soviet Union, a war they seemed to welcome whilst being bizarrely dismissive of its potentially world-ending consequences. Their comments read like extracts from Trinity‘s satirically gung-ho accompanying comic. “Nuclear war is a destructive thing, but it is still in large part a physics problem,” said one official. “If there are enough shovels to go around, everybody’s going to make it. It’s the dirt that does it,” said another. Asked if he thought that a nuclear war was “winnable,” Caspar Weinberger replied, “We certainly are not planning to be defeated.” And then, in March of that fraught year of 1983 when the administration almost got the nuclear war it seemed to be courting, came Reagan’s SDI speech.

The most important thing to understand about SDI is that it was always a fantasy, a chimera chased by politicians and strategists who dearly wished it was possible. The only actual scientist amongst those who lobbied for it was Edward Teller, well known to the public as the father of the hydrogen bomb. One of the few veterans of the Manhattan Project still active in public life when Reagan took office, more than 35 years after the project had built the first atomic bomb, Teller was a brilliant scientist when he wanted to be, but one whose findings and predictions were often tainted by his strident anti-communism and a passion for nuclear weapons that could leave him sounding as unhinged as General LeMay. Teller seldom saw a problem that couldn’t be solved just by throwing a hydrogen bomb or two at it. His response to Carter’s decision to return the Panama Canal to Panama, for instance, was to recommend quickly digging a new one across some more cooperative Central American country using hydrogen bombs. Now, alone amongst his scientific peers, Teller told the Reagan administration that SDI was possible. He claimed that he could create X-ray beams in space by, naturally, detonating hydrogen bombs just so. These could be aimed at enemy missiles, zapping them out of the sky. The whole system could be researched, built, and put into service within five years. As evidence, he offered some inconclusive preliminary results derived from experimental underground explosions. It was all completely ridiculous; we still don’t know how to create such X-ray beams today, decades on. But it was also exactly the sort of superficially credible scientific endorsement — and from the father of the hydrogen bomb, no less! — that the Reagan administration needed.

Reagan coasted to reelection in 1984 in a campaign that felt more like a victory lap, buoyed by “Morning Again in America,” an energetic economy, and a military buildup that had SDI as one of its key components. The administration lobbied Congress to give the SDI project twice the inflation-adjusted funding that the Manhattan Project had received at the height of World War II. With no obviously promising paths at all to follow, SDI opted for the spaghetti approach, throwing lots and lots of stuff at the wall in the hope that something would stick. Thus it devolved into a whole lot of individual fiefdoms with little accountability and less coordination with one another. Dr. Ashton Carter of Harvard, a former Defense Department analyst with full security clearance tasked with preparing a study of SDI for the Congressional Office of Technology Assessment, concluded that the prospect for any sort of success was “so remote that it should not serve as the basis of public expectations of national policy.” Most of the press, seduced by Reagan’s own euphoria, paid little heed to such voices, instead publishing articles talking about the relative merits of laser and kinetic-energy weapons, battle stations in space, and whether the whole system should be controlled by humans or turned over to a supercomputer mastermind. With every notion as silly and improbable as every other and no direction in the form of a coherent plan from the SDI project itself, everyone could be an expert, everyone could build their own little SDI castle above the stratosphere. When journalists did raise objections, Reagan replied with more folksy homilies about how everyone thought Edison was crazy until he invented the light bulb, appealing to the good old American ingenuity that had got us to the Moon and could make anything possible. The administration’s messaging was framed so as to make objecting to SDI unpatriotic, downright un-American.

And yet even if you thought that American ingenuity would indeed save the day in the end, SDI had a more fundamental problem that made it philosophically as well as scientifically unsound. This most basic objection, cogently outlined at the time by the great astronomer, science popularizer, space advocate, and vocal SDI critic Carl Sagan, was a fairly simple one. Even the most fanciful predictions for SDI must have a capacity ceiling, a limit beyond which the system simply couldn’t shoot down any more missiles. And it would always be vastly cheaper to build a few dozen more missiles than it would be to build and launch and monitor another battle station (or whatever) to deal with them. Not only would SDI not bring an end to nuclear weapons, it was likely to actually accelerate the nuclear-arms race, as the Soviet Union would now feel the need to not only be able to destroy the United States ten times over but be able to destroy the United States ten times over while also comprehensively overwhelming any SDI system in place. Reagan’s public characterization of SDI as a “nuclear umbrella” under which the American public might live safe and secure had no basis in reality. Even if SDI could somehow be made 99 percent effective, a figure that would make it more successful than any other defense in the history of warfare, the 1 percent of the Soviet Union’s immense arsenal that got through would still be enough to devastate many of the country’s cities and kill tens or hundreds of millions. There may have been an argument to make for SDI research aimed at developing, likely decades in the future, a system that could intercept and destroy a few rogue missiles. As a means of protection against a full-on strategic strike, though… forget it. It wasn’t going to happen. Ever. As President Nixon once said, “With 10,000 of these damn things, there is no defense.”

As with his seeming confusion about Gorbachev’s objections to SDI at their summits, it’s hard to say to what degree Reagan grasped this reality. Was he living a fantasy like so many others in the press and public when he talked of SDI rendering ICBMs “impotent and obsolete”? Whatever the answer to that question, it seems pretty clear that others inside the administration knew perfectly well that SDI couldn’t possibly protect the civilian population as a whole to any adequate degree. SDI was in reality a shell game, not an attempt to do an end-run around the doctrine of mutually assured destruction but an attempt to make sure that mutually assured destruction stayed mutually assured when it came to the United States’s side of the equation. Cold War planners had fretted for decades about a nightmare scenario in which the Soviet Union launched a first strike and the United States, due to sabotage, Soviet stealth technology, or some failure of command and control, failed to detect and respond to it in time by launching its own missiles before they were destroyed in their silos by those of the Soviets. SDI’s immediate strategic purpose was to close this supposed “window of vulnerability.” The system would be given, not the impossible task of protecting the vast nation as a whole, but the merely hugely improbable one of protecting those few areas where the missile silos were concentrated. Asked point-blank under oath whether SDI was meant to protect American populations or American missile silos, Pentagon chief of research and engineering Richard DeLauer gave a telling non-answer: “What we are trying to do is enhance deterrence. If you enhance deterrence and your deterrence is credible and holds, the people are protected.” This is of course just a reiteration of the MAD policy itself, not a separate justification for SDI. MAD just kept getting madder.

The essential absurdity of American plans for SDI seems to have struck Gorbachev by the beginning of 1987. Soviet intelligence had been scrambling for a few years by then, convinced that there had to be some important technological breakthrough behind all of the smoke the Reagan administration was throwing. It seems that at about this point they may have concluded that, no, the whole thing really was as ridiculous as it seemed. At any rate, Gorbachev decided it wasn’t worth perpetuating the Cold War over. He backed away from his demands, offering the United States the opportunity to continue working on SDI if it liked, demanding only a commitment to inform the Soviet Union and officially back out of some relevant treaties (which might very possibly have to include the 1967 Outer Space Treaty that forbade nuclear weapons in space) if it decided to actually implement it. Coupled with Gorbachev’s soaring global popularity, it was enough to start getting deals done. Reagan and Gorbachev signed their first substantial agreement, to eliminate between them 2692 missiles, in December of 1987. More would follow, accompanied by shocking liberalization and reform behind the erstwhile Iron Curtain, culminating in the night of November 9, 1989, when the Berlin Wall, long the tangible symbol of division between East and West, came down. Just like that, almost incomprehensible in its suddenness, the Cold War was over. Trinity stands today as a cogent commentary on that strange shadow conflict, but it proved blessedly less than prescient about the way it would end. Whatever else is still to come, there will be no nuclear war between the United States of America and the Union of Soviet Socialist Republics.

If the end of the Cold War was shockingly unexpected, SDI played out exactly as you might expect. The program was renamed to the more modest Ballistic Missile Defense Organization and scaled back dramatically in 1993, by which time it had cost half again as much as the Manhattan Project — a staggering $30 billion, enough to make it the most expensive research program in history — and accomplished little. The old idea still resurfaces from time to time, but the fervor it once generated is all but forgotten now. SDI, like most of history, is now essentially a footnote.

A more inspiring closing subject is Mikhail Gorbachev. His Nobel Peace Prize notwithstanding, he strikes me as someone who hasn’t quite gotten his due yet from history. There are many reasons that the Cold War came to an end when it did. Prominent among them was the increasingly untenable Soviet economy, battered during the decade by “the Soviet Union’s Vietnam” (Gorbachev’s phrase) in Afghanistan, a global downturn in oil prices, and the sheer creaking inertia of many years of, as the old Soviet saying went, workers pretending to work while the state pretended to pay them for it. Nevertheless, I don’t agree with Marx that history is a compendium of economic forces. Many individuals across Eastern Europe stepped forward to end their countries’ totalitarian regimes — usually peacefully, sometimes violently, occasionally at the cost of their lives. But Gorbachev’s shadow overlays all the rest. Undaunted by the most bellicose Presidential rhetoric in two decades, he used politics, psychology, and logic to convince Reagan to sit down with him and talk, then worked with him to shape a better, safer world. While Reagan talked about ending MAD through his chimerical Star Wars, Gorbachev actually did it, by abandoning his predecessors’ traditional intransigence, rolling up his sleeves, and finding a way to make it work. Later, this was the man who didn’t choose to send in the tanks when the Warsaw Pact started to slip away, making him, as Victor Sebestyen put it, one of very few leaders in the history of the world to elect not to use force to maintain an empire. Finally, and although it certainly was never his intention, he brought the Soviet Union in for a soft landing, keeping the chaos to a minimum and keeping the missiles from flying. Who would have imagined Gorbachev was capable of such vision, such — and I don’t use this word lightly — heroism? Who would have imagined he could weave his way around the hardliners at home and abroad to accomplish what he did?
Prior to assuming office in 1985, he was just a smart, capable Party man who knew who buttered his bread, who, as he later admitted, “licked Brezhnev’s ass” alongside his colleagues. And then when he got to the top he looked around, accepted that the system just wasn’t working, and decided to change it. Gorbachev reminds us that the hero is often not the one who picks up a gun but the one who chooses not to.

(In addition to the sources listed in the previous article, Way Out There in the Blue by Frances FitzGerald is the best history I’ve found of SDI and its politics.)


Trinity


During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was, to the extent he was known at all, considered a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the National Association of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against which most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan had taken office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives. Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against humanity,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily lead to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became one of Infocom’s last substantial hits. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine that allowed twice the content of the normal system at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.

The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually if not inordinately long development cycle for an Infocom game, stretching from Moriarty’s first foray into Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim determination that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, I would guess that a majority of hardcore Infocom fans would likely name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored — when they’re explored at all — in all that thoughtful of a way. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me at an odd disconnect from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything the 18 games not named A Mind Forever Voyaging that preceded it had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all really that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity’s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a wandering passive observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As far as atomic history, I find I’ve amassed a library almost as extensive as Trinity’s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)


Wishbringer

Brian Moriarty, 1985


Brian Moriarty was the first of a second wave of Infocom authors from very different and more diverse backgrounds than the original Imps. Their fresh perspectives would be a welcome addition during the latter half of the company’s history. Some of the second wave all but stumbled through the doors of Infocom, but not Moriarty — not at all Moriarty. His arrival as an Imp in September of 1984 marked the fruition of a calculated “assault on Infocom” — his words, not mine — that had taken over two years to bring off.

Moriarty’s personal history is perfect for an Imp, being marked by a mix of technical and literary interests right from his grade-school years. After taking a degree in English Literature from Southeastern Massachusetts University in 1978, he found a job in a Radio Shack store, where on many days he spent hours playing with the TRS-80s. He didn’t buy a computer of his own, however, until after he had become a technical writer at Bose Corporation in Framingham, Massachusetts. It was there in 1981 that a colleague brought in his new Atari 800 to show off. Moriarty succumbed to the greatest Atari marketing weapon ever devised: the classic game Star Raiders. He soon headed out to buy an Atari system of his own.

Along with the computer and Star Raiders, Moriarty also brought home a copy of Scott Adams’s Strange Odyssey. He played it and the other Scott Adams games obsessively, thinking all the while of all the ways they could be better. Then one day he spotted Infocom’s Deadline on the shelf of his local Atari dealer. From its dossier-like packaging to its remarkable parser and its comparative reams of luxurious text, it did pretty much everything he had been dreaming about. Moriarty knew in an instant what he wanted to do, and where he wanted to do it. How great to learn that Infocom was located right there in the Boston area; that, anyway, was one less problem to deal with. Still, Infocom was a tiny, insular company at this point, and wasn’t exactly accepting resumes from eager Atari enthusiasts who’d never designed an actual game before.

So Moriarty put Infocom in his long-range planning folder and went for the time being somewhere almost as cool. Back at Radio Shack, he’d worked with a fellow named Lee Pappas, whom he’d been surprised to rediscover behind the counter of the local Atari dealer when he’d gone to buy his 800 system. Pappas and a friend had by then already started a little newsletter, A.N.A.L.O.G. (“Atari Newsletter and Lots of Games”). By the end of 1982 it had turned into a full-fledged glossy magazine. Pappas asked Moriarty, who’d already been a regular contributor for some months, if he’d like to come work full-time for him. Moriarty said yes, leaving his safe, comfortable job at Bose behind; it was “the best career move I ever made.”

A.N.A.L.O.G. was a special place, a beloved institution within and chronicler of the Atari 8-bit community in much the same way that Softalk was of the Apple II scene. Their articles were just a little bit more thoughtful, their type-in programs a little bit better, their reviews a little bit more honest than was the norm at other magazines. Moriarty, a graceful writer as well as a superb Atari hacker, contributed to all those aspects by writing articles and reviews and programs. Life there was pretty good: “It was a small group of nerdy guys in their 20s who loved computer games, ate the same junk foods, and went to see the same science-fiction movies together.”

Still, Moriarty didn’t forget his ultimate goal. Having advanced one step by getting himself employed in the same general industry as Infocom, he set about writing his first adventure game to prove his mettle to anyone — Infocom, perhaps? — who might be paying attention. Adventure in the Fifth Dimension appeared in A.N.A.L.O.G.’s April/May 1983 issue. A necessarily primitive effort written mostly in BASIC and running in 16 K, it nevertheless demonstrated some traits of Moriarty’s later work by mixing a real place, Washington D.C., with fantastic and surreal elements: a group of aliens have stolen the Declaration of Independence, and it’s up to you to track down an entrance to their alternate universe and get it back. A year later, Moriarty continued his campaign with another, more refined adventure written entirely in assembly language. Crash Dive! pits the player against a mutineer aboard a nuclear submarine, a scenario much more complex and plot-heavy than the typical magazine-type-in treasure hunt. It even included a set of Infocom-style feelies, albeit only via a photograph in the magazine.

Crash Dive!'s "feelies"

With two games under his belt, Moriarty applied for a position as a game designer at Infocom, but his resume came right back to him. Then a colleague showed him a posting he’d spotted on the online service CompuServe. It was from Dan Horn, manager of Infocom’s Micro Group, looking for an expert 6502 hacker to work on Z-Machine interpreters. It took Moriarty about “45 seconds” to answer. Horn liked what he saw of Moriarty, and in early 1984 the latter started working for the former in the building where the magic happened. His first project involved, as chance would have it, another submarine-themed game: he modified the Atari 8-bit, Commodore 64, and Apple II interpreters to support the sonar display in Seastalker. Later he wrote complete new interpreters for the Radio Shack Color Computer and the ill-fated Commodore Plus/4.

He was tantalizingly close to his goal. Having broken through the outer gates, he just needed to find a way into the inner keep of the Imps themselves. He took to telling Berlyn, Blank, Lebling, and the rest about his ambition every chance he got, while also sharing with them his big idea for a game: a grand “historical fantasy” that would deal with no less weighty a subject than the history of atomic weapons and their implications for humanity. It seemed the perfect subject for the zeitgeist of 1984, when the Cold War was going through its last really dangerous phase and millions of schoolchildren were still walking around with souls seared by the previous year’s broadcast of The Day After.

Moriarty got his shot at the inner circle when a certain pop-science writer whom Infocom had hired to write a game was allegedly found curled up beneath his desk in a little ball of misery, undone by the thorny syntax of ZIL. This moment marks the end of Marc Blank’s dream of being able to hire professional writers off the street, set them down with a terminal and a stack of manuals, and wait for the games to come gushing forth. From now on the games would be written by people already immersed in Infocom’s technology; the few outside collaborations to come would be just that, collaborations, with established programmers inside Infocom doing the actual coding.

That new philosophy was great news for a fellow like Brian Moriarty, skilled coder that he was. The Imps decided to reward his persistence and passion and give him a shot. Only thing was, they weren’t so sure about the big historical fantasy, at least not for a first game. What they really had in mind was a made-to-order game to fill a glaring gap in their product matrix: a gentle, modestly sized game to introduce newcomers to interactive fiction — an “Introductory”-level work. And it should preferably be a Zorkian fantasy, because that’s what sold best and what most people still thought of when they thought of Infocom. None of the current Imps were all that excited about such a project. Would Moriarty be interested? He wasn’t about to split hairs over theme or genre or anything else after dreaming of reaching this point for so long; he answered with a resounding “Absolutely!” And so Brian Moriarty became an Imp at last — to no small consternation from Dan Horn, who’d thought Moriarty had come to Infocom to do “great work for me.”

It’s kind of surprising that it took Infocom this long to perceive the need for a game like the one that Moriarty would now be taking on as his first assignment. Their original matrix had offered only games for children — “Interactive Fiction Junior” — below the “Standard” level. Considering that even the hard-as-nails Hitchhiker’s Guide to the Galaxy was labelled “Standard,” the leap from “Junior” to “Standard” could be a daunting one indeed. Clearly there was room for a work more suitable for adult novices, one that didn’t condescend in the way that Seastalker, solid as it is on its own terms, might be perceived to do. Infocom had now decided to make just such a game at last — although, oddly, the problematic conflations continued. Rather than simply add a fifth difficulty level to the matrix, they decided to dispense with the “Junior” category entirely, relabeling Seastalker an “Introductory” game. This might have made existing print materials easier to modify, but it lost track entirely of Seastalker‘s original target demographic. Infocom claimed in The New Zork Times that “adults didn’t want a kid’s game; in fact, kids didn’t want a kid’s game.” Which rather belied the claim in the same article that Seastalker had been a “success,” but there you go.

Moriarty was a thoughtful guy with a bit of a bookish demeanor, so much so that his inevitable nickname of “Professor” actually suited him really well. Now he started thinking about how he could make an introductory game that wouldn’t be too condescending or trivial to the Infocom faithful who would hopefully also buy it. He soon hit upon the idea of including a magic MacGuffin that would allow alternate, simpler solutions to many puzzles at a cost to the score — literally a Wishbringer. The hardcore could eschew its use from the start and have a pretty satisfying experience; beginners could, after the satisfaction and affirmation of solving the game the easy way, go back and play again the hard way to try to get a better score. It was brilliant, as was the choice not to make using the Wishbringer just a “solve this puzzle” button but rather an intriguing little puzzle nexus in its own right. First the player would have to find it; then she would have to apply it correctly by wishing for “rain,” “advice,” “flight,” “darkness,” “foresight,” “luck,” or “freedom” whilst having the proper material components for the spell on hand, a perfect primer for the spellcasting system in the Enchanter trilogy. The wishes would, as in any good fairy tale, be limited to one of each type, so that even this easier route to victory would remain a challenge in its own way.
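The design described above — a hard solution worth full points, a one-shot wish shortcut worth fewer, each wish gated on a material component — can be expressed as a toy sketch. This is emphatically not Infocom's actual ZIL implementation; the wish names come from the game, but the puzzle names, point values, and the “sea shell” component are invented here purely for illustration.

```python
# Toy sketch of Wishbringer's dual-solution design (NOT Infocom's ZIL):
# every puzzle has a "hard" solution worth full points, while the stone
# offers a shortcut worth fewer points; each wish type works only once
# and requires a material component to be in the player's inventory.

class WishbringerSketch:
    def __init__(self):
        self.score = 0
        self.used_wishes = set()           # each wish type is one-shot
        self.inventory = {"sea shell"}     # hypothetical component

    def solve_hard(self, puzzle, points):
        # The "real" solution: full score, no magic consumed.
        self.score += points
        return f"You solve {puzzle} the hard way. (+{points})"

    def wish(self, wish_type, component, puzzle, points):
        if wish_type in self.used_wishes:
            return "The stone stays cold. That wish is spent."
        if component not in self.inventory:
            return f"You need the {component} for that wish."
        self.used_wishes.add(wish_type)
        self.score += points // 2          # the shortcut costs you points
        return f"The stone glows! {puzzle} melts away. (+{points // 2})"

game = WishbringerSketch()
print(game.wish("rain", "sea shell", "the locked gate", 10))
print(game.wish("rain", "sea shell", "another puzzle", 10))  # wish already spent
```

The key design point the sketch captures is that the stone never trivializes the game: the shortcut is rationed (one wish per type) and itself puzzle-gated (the component requirement), so the beginner still has to think.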

At first Moriarty thought of making Wishbringer a magic ring, but what with The Lord of the Rings and a thousand knock-offs thereof, that felt too clichéd. Anyway, he wanted to include it in the box as a feelie, and, cost concerns being what they were, that meant the ring would have to be a gaudy plastic thing like the ones bubble-gum machines sometimes dispense in lieu of a gumball. Then he hit upon the idea of making Wishbringer a stone — “The Magick Stone of Dreams.” Maybe they could make the one in the package glow in the dark to give it that proper aura and distract from its plasticness? Marketing said it was feasible, and so the die (or stone) was cast. Thus did Wishbringer become the first and only Infocom game to be literally designed around a feelie. Moriarty spent some nine months — amidst all of the Hitchhiker’s and Cornerstone excitement, the high-water mark that was Christmas 1984, an office move, and the dawning of the realization that the company was suddenly in big, big trouble — learning the vagaries of ZIL and writing Wishbringer.

Wishbringer

For all that it’s a much subtler work lacking the “Gee whiz!” quality of Seastalker, Wishbringer does feel like a classic piece of children’s literature. It casts you as a postal carrier in the quietly idyllic village of Festeron, which is apparently located in the same world as Zork and shares with that series an anachronistic mixing of modernity with fantasy. (I’m sure someone has figured out a detailed historical timeline for Wishbringer‘s relation to Zork as well as geography and all the rest, but as usual with that sort of thing I just can’t be bothered.) You dream of adventure — in fact, you’re interrupted in the middle of such a daydream as the game begins — but you’re just a mail carrier with a demanding boss. Said boss, Mr. Crisp, gives you a letter to deliver to the old woman who is proprietor of Ye Olde Magick Shoppe up in the hills north of town. On your way there you should explore the town and enjoy the lovely scenery, because once you make the delivery everything changes. The letter turns out to be a ransom note for the old woman from “The Evil One,” demanding Wishbringer itself in return for the safe return of her cat: “And now, now it claims my only companion.”

"It's getting Dark outside," the old woman remarks, and you can almost hear the capital D. "Maybe you should be getting back to town."

The old woman hobbles over to the Magick Shoppe door and opens it. A concealed bell tinkles merrily.

"Keep a sharp eye out for my cat, won't you?" She speaks the words slowly and distinctly. "Bring her to me if you find her. She's black as night from head to tail, except for one little white spot... right HERE."

The old woman touches the middle of your forehead with her finger. The light outside dims suddenly, like a cloud passing over the sun.

So, Wishbringer is ultimately just a hunt for a lost cat, a quest I can heartily get behind. But as soon as you step outside you realize that everything has changed. The scenery becomes a darker, more surreal riot reminiscent in places of Mindwheel. Mailboxes have become sentient (and sometimes carnivorous); Mr. Crisp has turned into the town’s petty dictator; a pet poodle has turned into a vicious hellhound. The game flirts with vaguely fascistic imagery, as with the giant Boot Patrols that march around the town enforcing its nightly curfew. (This does lead to one glaring continuity flaw: why is the cinema still open if the whole city is under curfew?) There’s a creepy dread and a creepy allure to exploring the changed town, a reminder that, as the Brothers Grimm taught us long ago, ostensible children’s literature doesn’t necessarily mean all sunshine and lollypops.

Like so much of Roberta Williams’s work, Wishbringer plays with fairy-tale tropes. But Moriarty is a much better, more original writer than Williams, not to mention a more controlled one. (Witness the way that the opening text of Wishbringer foreshadows the climax, a literary technique unlikely to even occur to Williams.) Rather than appropriate characters and situations whole cloth, he nails the feeling, balancing sweetness and whimsy with an undercurrent of darkness and menace that soon becomes an overcurrent when day turns to night and the big Change happens. The closest analogue I can offer for the world of Wishbringer is indeed the Brothers Grimm — but perhaps also, crazy as this is going to sound, Mr. Rogers’s Neighborhood of Make-Believe. Wishbringer has that same mixing of playfulness with a certain gravitas. There are even some talking platypuses, one of the very few examples of direct borrowing from Moriarty’s inspirations.

The other examples almost all come from Zork, including a great cameo from the good old white house and mailbox. And of course every Zork game has to have grues somewhere. The grues’ refrigerator light is my favorite gag in the whole game; it still makes me chuckle every time I think about it.

You have stumbled into the nesting place of a family of grues. Congratulations. Few indeed are the adventurers who have entered a grue's nest and lived as long as you have.

Everything is littered with rusty swords of elvish workmanship, piles of bones and other debris. A closed refrigerator stands in one corner of the nest, and something... a small, dangerous-looking little beast... is curled up in the other corner.

The only exit is to the west. Hope you survive long enough to use it.

 

Snoring fitfully, the little beast turns away from the light of the small stone and faces the wall.

>open refrigerator
A light inside the refrigerator goes out as you open it.

Opening the refrigerator reveals a bottle and an earthworm.

The little beast is stirring restlessly. It looks as if it's about to wake up!

>close refrigerator
A light inside the refrigerator comes on as you close it.

Indeed, while Moriarty is generally thought of as Infocom’s “serious” author on the exclusive basis of his second game Trinity, Wishbringer is full of such funny bits.

Wishbringer is very solvable, but solving it is not trivial even if you let yourself use the stone; this is of course just as Moriarty intended it. You may not even find the stone until a good third or more of the way through the game, and it definitely won’t help you with everything thereafter. Played without using the stone, I’m not sure that Wishbringer is really all that much easier than the average mid-period Infocom game at all. The most objectionable aspects for the modern player as well as the most surprising to find in an “Introductory” game are the hard time limits; you’re almost certain to need to restart a few times to fully explore Festeron before the Change and still deliver the letter in time, and you may need a few restores to get everything done that you need to after the Change. An inventory limit also sometimes complicates matters; Infocom had been slowly losing interest in this sort of purely logistical problem for years, but Wishbringer demonstrates that even in an introductory game they weren’t quite there yet. Still, those are design sins worth forgiving in light of Wishbringer‘s charms — assuming you think them sins at all. Like the determination to make you work a bit for a solution even if you use the stone, they could be seen as a good thing. Wishbringer, we should remember, was meant to serve as an introduction to Infocom’s catalog as a whole, in which players would find plenty of other timers and inventory limits and puzzles that refuse to just disappear in a poof of magic. Wishbringer‘s refusal to trivialize its purpose is really quite admirable; there’s even a (thankfully painless) pseudo-maze.

Wishbringer was released in June of 1985, six full months after Infocom’s previous game Suspect. That gap would turn out to be the longest of Infocom’s productive middle years, and it left many fans worried about the company’s future and whether Cornerstone meant the end of games. Infocom’s idea that there were people potentially interested in interactive fiction but eager for a gentler version of the form turned out to be correct. Wishbringer turned into one of Infocom’s last genuine hits; Billboard software charts from the second half of 1985 show it and Hitchhiker’s regularly ensconced together inside the Top 20 or even Top 10, marking the last time Infocom would have a significant presence there. It sold almost 75,000 copies in its first six months, with a lifetime total perhaps as high as 150,000. To the best of my reckoning it stands as about Infocom’s fifth best-selling game overall.

Sales figures aside, Wishbringer‘s “Introductory” tag and its gentle, unassuming personality can make it an easy game amongst the Infocom canon to dismiss or overlook. That would be a shame, however; it’s one of the most likeable games Infocom ever did. While not one of Infocom’s more thematically or formally groundbreaking games and thus not one of their more discussed, it continues to be enjoyed by just about everyone who plays it. It’s the sort of game that may not come up that often when you ask people about their very favorites from Infocom, but mention it to any Infocom fan and you’ll almost always get back an “Oh, yes. I really liked that one.” Rather than bury its light charm under yet more leaden pontification, I’ll just suggest you play it if you haven’t already.

(Jason Scott’s interviews for Get Lamp informed much of this article. Interviews with Moriarty of various vintages can be found online at The IF Archive, Adventura CIA, Electron Dance, and Halcyon Days. Also useful was Moriarty’s “self-interview” in the January/February 1986 AmigaWorld; his picture above comes from that article. Adventure in the Fifth Dimension was published in the April/May 1983 A.N.A.L.O.G.; Crash Dive! in the May 1984 A.N.A.L.O.G., the last to which Moriarty contributed.)