

Brian Moriarty, 1985

Brian Moriarty was the first of a second wave of Infocom authors from very different and more diverse backgrounds than the original Imps. Their fresh perspectives would be a welcome addition during the latter half of the company’s history. Some of the second wave all but stumbled through the doors of Infocom, but not Moriarty — not at all Moriarty. His arrival as an Imp in September of 1984 marked the fruition of a calculated “assault on Infocom” — his words, not mine — that had taken over two years to bring off.

Moriarty’s personal history is perfect for an Imp, being marked by a mix of technical and literary interests right from his grade-school years. After taking a degree in English Literature from Southeastern Massachusetts University in 1978, he found a job in a Radio Shack store, where on many days he spent hours playing with the TRS-80s. He didn’t buy a computer of his own, however, until after he had become a technical writer at Bose Corporation in Framingham, Massachusetts. It was there in 1981 that a colleague brought in his new Atari 800 to show off. Moriarty succumbed to the greatest Atari marketing weapon ever devised: the classic game Star Raiders. He soon headed out to buy an Atari system of his own.

Along with the computer and Star Raiders, Moriarty also brought home a copy of Scott Adams’s Strange Odyssey. He played it and the other Scott Adams games obsessively, thinking all the while of all the ways they could be better. Then one day he spotted Infocom’s Deadline on the shelf of his local Atari dealer. From its dossier-like packaging to its remarkable parser and its comparative reams of luxurious text, it did pretty much everything he had been dreaming about. Moriarty knew in an instant what he wanted to do, and where he wanted to do it. How great to learn that Infocom was located right there in the Boston area; that, anyway, was one less problem to deal with. Still, Infocom was a tiny, insular company at this point, one that wasn’t exactly accepting resumes from eager Atari enthusiasts who’d never designed an actual game before.

So Moriarty put Infocom in his long-range planning folder and went for the time being somewhere almost as cool. Back at Radio Shack, he’d worked with a fellow named Lee Pappas, whom he’d been surprised to rediscover behind the counter of the local Atari dealer when he’d gone to buy his 800 system. Pappas and a friend had by then already started a little newsletter, A.N.A.L.O.G. (“Atari Newsletter and Lots of Games”). By the end of 1982 it had turned into a full-fledged glossy magazine. Pappas asked Moriarty, who’d already been a regular contributor for some months, if he’d like to come work full-time for him. Moriarty said yes, leaving his safe, comfortable job at Bose behind; it was “the best career move I ever made.”

A.N.A.L.O.G. was a special place, a beloved institution within and chronicler of the Atari 8-bit community in much the same way that Softalk was of the Apple II scene. Its articles were just a little bit more thoughtful, its type-in programs a little bit better, its reviews a little bit more honest than was the norm at other magazines. Moriarty, a graceful writer as well as a superb Atari hacker, contributed to all those aspects by writing articles and reviews and programs. Life there was pretty good: “It was a small group of nerdy guys in their 20s who loved computer games, ate the same junk foods, and went to see the same science-fiction movies together.”

Still, Moriarty didn’t forget his ultimate goal. Having advanced one step by getting himself employed in the same general industry as Infocom, he set about writing his first adventure game to prove his mettle to anyone — Infocom, perhaps? — who might be paying attention. Adventure in the Fifth Dimension appeared in A.N.A.L.O.G.‘s April/May 1983 issue. A necessarily primitive effort written mostly in BASIC and running in 16 K, it nevertheless demonstrated some traits of Moriarty’s later work by mixing a real place, Washington D.C., with fantastic and surreal elements: a group of aliens have stolen the Declaration of Independence, and it’s up to you to track down an entrance to their alternate universe and get it back. A year later, Moriarty continued his campaign with another, more refined adventure written entirely in assembly language. Crash Dive! pits the player against a mutineer aboard a nuclear submarine, a scenario much more complex and plot-heavy than the typical magazine-type-in treasure hunt. It even included a set of Infocom-style feelies, albeit only via a photograph in the magazine.

Crash Dive!'s "feelies"

With two games under his belt, Moriarty applied for a position as a game designer at Infocom, but his resume came right back to him. Then a colleague showed him a posting he’d spotted on the online service CompuServe. It was from Dan Horn, manager of Infocom’s Micro Group, looking for an expert 6502 hacker to work on Z-Machine interpreters. It took Moriarty about “45 seconds” to answer. Horn liked what he saw of Moriarty, and in early 1984 the latter started working for the former in the building where the magic happened. His first project involved, as chance would have it, another submarine-themed game: he modified the Atari 8-bit, Commodore 64, and Apple II interpreters to support the sonar display in Seastalker. Later he wrote complete new interpreters for the Radio Shack Color Computer and the ill-fated Commodore Plus/4.

He was tantalizingly close to his goal. Having broken through the outer gates, he just needed to find a way into the inner keep of the Imps themselves. He took to telling Berlyn, Blank, Lebling, and the rest about his ambition every chance he got, while also sharing with them his big idea for a game: a grand “historical fantasy” that would deal with no less weighty a subject than the history of atomic weapons and their implications for humanity. It seemed the perfect subject for the zeitgeist of 1984, when the Cold War was going through its last really dangerous phase and millions of schoolchildren were still walking around with souls seared by the previous year’s broadcast of The Day After.

Moriarty got his shot at the inner circle when a certain pop-science writer whom Infocom had hired to write a game was allegedly found curled up beneath his desk in a little ball of misery, undone by the thorny syntax of ZIL. This moment marks the end of Marc Blank’s dream of being able to hire professional writers off the street, set them down with a terminal and a stack of manuals, and wait for the games to come gushing forth. From now on the games would be written by people already immersed in Infocom’s technology; the few outside collaborations to come would be just that, collaborations, with established programmers inside Infocom doing the actual coding.

That new philosophy was great news for a fellow like Brian Moriarty, skilled coder that he was. The Imps decided to reward his persistence and passion and give him a shot. Only thing was, they weren’t so sure about the big historical fantasy, at least not for a first game. What they really had in mind was a made-to-order game to fill a glaring gap in their product matrix: a gentle, modestly sized game to introduce newcomers to interactive fiction — an “Introductory”-level work. And it should preferably be a Zorkian fantasy, because that’s what sold best and what most people still thought of when they thought of Infocom. None of the current Imps were all that excited about such a project. Would Moriarty be interested? He wasn’t about to split hairs over theme or genre or anything else after dreaming of reaching this point for so long; he answered with a resounding “Absolutely!” And so Brian Moriarty became an Imp at last — much to the consternation of Dan Horn, who’d thought Moriarty had come to Infocom to do “great work for me.”

It’s kind of surprising that it took Infocom this long to perceive the need for a game like the one that Moriarty would now be taking on as his first assignment. Their original matrix had offered only games for children — “Interactive Fiction Junior” — below the “Standard” level. Considering that even the hard-as-nails Hitchhiker’s Guide to the Galaxy was labelled “Standard,” the leap from “Junior” to “Standard” could be a daunting one indeed. Clearly there was room for a work more suitable for adult novices, one that didn’t condescend in the way that Seastalker, solid as it is on its own terms, might be perceived to do. Infocom had now decided to make just such a game at last — although, oddly, the problematic conflations continued. Rather than simply add a fifth difficulty level to the matrix, they decided to dispense with the “Junior” category entirely, relabeling Seastalker an “Introductory” game. This might have made existing print materials easier to modify, but it lost track entirely of Seastalker‘s original target demographic. Infocom claimed in The New Zork Times that “adults didn’t want a kid’s game; in fact, kids didn’t want a kid’s game.” Which rather belied the claim in the same article that Seastalker had been a “success,” but there you go.

Moriarty was a thoughtful guy with a bit of a bookish demeanor, so much so that his inevitable nickname of “Professor” actually suited him really well. Now he started thinking about how he could make an introductory game that wouldn’t be too condescending or trivial to the Infocom faithful who would hopefully also buy it. He soon hit upon the idea of including a magic MacGuffin which would allow alternate, simpler solutions to many puzzles at a cost to the score — literally a Wishbringer. The hardcore could eschew its use from the start and have a pretty satisfying experience; beginners could, after the satisfaction and affirmation of solving the game the easy way, go back and play again the hard way to try to get a better score. It was brilliant, as was the choice not to make using the Wishbringer just a “solve this puzzle” button but rather an intriguing little puzzle nexus in its own right. First the player would have to find it; then she would have to apply it correctly by wishing for “rain,” “advice,” “flight,” “darkness,” “foresight,” “luck,” or “freedom” whilst having the proper material components for the spell on hand, a perfect primer for the spellcasting system in the Enchanter trilogy. The wishes would, like in any good fairy tale, be limited to one of each type. So, even this route to victory would be easier but still in its own way a challenge.

At first Moriarty thought of making Wishbringer a magic ring, but what with The Lord of the Rings and a thousand knock-offs thereof that felt too clichéd. Anyway, he wanted to include it in the box as a feelie, and, cost concerns being what they were, that meant the ring would have to be a gaudy plastic thing like the ones bubble-gum machines sometimes dispensed in lieu of a gumball. Then he hit upon the idea of making Wishbringer a stone — “The Magick Stone of Dreams.” Maybe they could make the one in the package glow in the dark to give it that proper aura and distract from its plasticness? Marketing said it was feasible, and so the die (or stone) was cast. Thus did Wishbringer become the first and only Infocom game to be literally designed around a feelie. Moriarty spent some nine months — amidst all of the Hitchhiker’s and Cornerstone excitement, the high-water mark that was Christmas 1984, an office move, and the dawning of the realization that the company was suddenly in big, big trouble — learning the vagaries of ZIL and writing Wishbringer.


For all that it’s a much subtler work lacking the “Gee whiz!” quality of Seastalker, Wishbringer does feel like a classic piece of children’s literature. It casts you as a postal carrier in the quietly idyllic village of Festeron, which is apparently located in the same world as Zork and shares with that series an anachronistic mixing of modernity with fantasy. (I’m sure someone has figured out a detailed historical timeline for Wishbringer‘s relation to Zork as well as geography and all the rest, but as usual with that sort of thing I just can’t be bothered.) You dream of adventure — in fact, you’re interrupted in the middle of such a daydream as the game begins — but you’re just a mail carrier with a demanding boss. Said boss, Mr. Crisp, gives you a letter to deliver to the old woman who is proprietor of Ye Olde Magick Shoppe up in the hills north of town. On your way there you should explore the town and enjoy the lovely scenery, because once you make the delivery everything changes. The letter turns out to be a ransom note for the old woman from “The Evil One,” demanding Wishbringer itself in return for the safe return of her cat: “And now, now it claims my only companion.”

"It's getting Dark outside," the old woman remarks, and you can almost hear the capital D. "Maybe you should be getting back to town."

The old woman hobbles over to the Magick Shoppe door and opens it. A concealed bell tinkles merrily.

"Keep a sharp eye out for my cat, won't you?" She speaks the words slowly and distinctly. "Bring her to me if you find her. She's black as night from head to tail, except for one little white spot... right HERE."

The old woman touches the middle of your forehead with her finger. The light outside dims suddenly, like a cloud passing over the sun.

So, Wishbringer is ultimately just a hunt for a lost cat, a quest I can heartily get behind. But as soon as you step outside you realize that everything has changed. The scenery becomes a darker, more surreal riot reminiscent in places of Mindwheel. Mailboxes have become sentient (and sometimes carnivorous); Mr. Crisp has turned into the town’s petty dictator; a pet poodle has turned into a vicious hellhound. The game flirts with vaguely fascistic imagery, as with the giant Boot Patrols that march around the town enforcing its nightly curfew. (This does lead to one glaring continuity flaw: why is the cinema still open if the whole town is under curfew?) There’s a creepy dread and a creepy allure to exploring the changed town, a reminder that, as the Brothers Grimm taught us long ago, ostensible children’s literature doesn’t necessarily mean all sunshine and lollypops.

Like so much of Roberta Williams’s work, Wishbringer plays with fairy-tale tropes. But Moriarty is a much better, more original writer than Williams, not to mention a more controlled one. (Witness the way that the opening text of Wishbringer foreshadows the climax, a literary technique unlikely to even occur to Williams.) Rather than appropriate characters and situations whole cloth, he nails the feeling, balancing sweetness and whimsy with an undercurrent of darkness and menace that soon becomes an overcurrent when day turns to night and the big Change happens. The closest analogue I can offer for the world of Wishbringer is indeed the Brothers Grimm — but perhaps also, crazy as this is going to sound, Mr. Rogers’s Neighborhood of Make-Believe. Wishbringer has that same mixing of playfulness with a certain gravitas. There are even some talking platypuses, one of the very few examples of direct borrowing from Moriarty’s inspirations.

The other examples almost all come from Zork, including a great cameo from the good old white house and mailbox. And of course every Zork game has to have grues somewhere. The grues’ refrigerator light is my favorite gag in the whole game; it still makes me chuckle every time I think about it.

You have stumbled into the nesting place of a family of grues. Congratulations. Few indeed are the adventurers who have entered a grue's nest and lived as long as you have.

Everything is littered with rusty swords of elvish workmanship, piles of bones and other debris. A closed refrigerator stands in one corner of the nest, and something... a small, dangerous-looking little beast... is curled up in the other corner.

The only exit is to the west. Hope you survive long enough to use it.


Snoring fitfully, the little beast turns away from the light of the small stone and faces the wall.

>open refrigerator
A light inside the refrigerator goes out as you open it.

Opening the refrigerator reveals a bottle and an earthworm.

The little beast is stirring restlessly. It looks as if it's about to wake up!

>close refrigerator
A light inside the refrigerator comes on as you close it.

Indeed, while Moriarty is generally thought of as Infocom’s “serious” author on the exclusive basis of his second game Trinity, Wishbringer is full of such funny bits.

Wishbringer is very solvable, but solving it is not trivial even if you let yourself use the stone; this is of course just as Moriarty intended. You may not even find the stone until a good third or more of the way through the game, and it definitely won’t help you with everything thereafter. Played without using the stone, I’m not sure that Wishbringer is really all that much easier than the average mid-period Infocom game at all. The most objectionable aspects for the modern player as well as the most surprising to find in an “Introductory” game are the hard time limits; you’re almost certain to need to restart a few times to fully explore Festeron before the Change and still deliver the letter in time, and you may need a few restores to get everything done that you need to do after the Change. An inventory limit also sometimes complicates matters; Infocom had been slowly losing interest in this sort of purely logistical problem for years, but Wishbringer demonstrates that even in an introductory game they weren’t quite there yet. Still, those are design sins worth forgiving in light of Wishbringer‘s charms — assuming you think them sins at all. Like the determination to make you work a bit for a solution even if you use the stone, they could be seen as a good thing. Wishbringer, we should remember, was meant to serve as an introduction to Infocom’s catalog as a whole, in which players would find plenty of other timers and inventory limits and puzzles that refuse to just disappear in a poof of magic. Wishbringer‘s refusal to trivialize its purpose is really quite admirable; there’s even a (thankfully painless) pseudo-maze.

Wishbringer was released in June of 1985, six full months after Infocom’s previous game Suspect. That gap would turn out to be the longest of Infocom’s productive middle years, and it left many fans worried about the company’s future and whether Cornerstone meant the end of games. Infocom’s idea that there were people potentially interested in interactive fiction but eager for a gentler version of the form turned out to be correct. Wishbringer turned into one of Infocom’s last genuine hits; Billboard software charts from the second half of 1985 show it and Hitchhiker’s regularly ensconced together inside the Top 20 or even Top 10, marking the last time Infocom would have a significant presence there. It sold almost 75,000 copies in its first six months, with a lifetime total perhaps as high as 150,000. To the best of my reckoning it stands as about Infocom’s fifth best-selling game overall.

Sales figures aside, Wishbringer‘s “Introductory” tag and its gentle, unassuming personality can make it an easy game amongst the Infocom canon to dismiss or overlook. That would be a shame, however; it’s one of the most likeable games Infocom ever made. While not one of Infocom’s more thematically or formally groundbreaking games and thus not one of their more discussed, it continues to be enjoyed by just about everyone who plays it. It’s the sort of game that may not come up that often when you ask people about their very favorites from Infocom, but mention it to any Infocom fan and you’ll almost always get back an “Oh, yes. I really liked that one.” Rather than bury its light charm under yet more leaden pontification, I’ll just suggest you play it if you haven’t already.

(Jason Scott’s interviews for Get Lamp informed much of this article. Interviews with Moriarty of various vintages can be found online at The IF Archive, Adventura CIA, Electron Dance, and Halcyon Days. Also useful was Moriarty’s “self-interview” in the January/February 1986 AmigaWorld; his picture above comes from that article. Adventure in the Fifth Dimension was published in the April/May 1983 A.N.A.L.O.G.; Crash Dive! in the May 1984 A.N.A.L.O.G., the last to which Moriarty contributed.)




During 1983, the year that Brian Moriarty first conceived the idea of a text adventure about the history of atomic weapons, the prospect of nuclear annihilation felt more real, more terrifyingly imaginable to average Americans, than it had in a long, long time. The previous November had brought the death of longtime Soviet General Secretary Leonid Brezhnev and the ascension to power of Yuri Andropov. Brezhnev had been a corrupt, self-aggrandizing old rascal, but also a known, relatively safe quantity, content to pin medals on his own chest and tool around in his collection of foreign cars while the Soviet Union settled into a comfortable sort of stagnant stability around him. Andropov, however, was, to the extent he was known at all, considered a bellicose Party hardliner. He had enthusiastically played key roles in the brutal suppression of both the 1956 Hungarian Revolution and the 1968 Prague Spring.

Ronald Reagan, another veteran Cold Warrior, welcomed Andropov into office with two of the most famous speeches of his Presidency. On March 8, 1983, in a speech before the American Society of Evangelicals, he declared the Soviet Union “an evil empire.” Echoing Hannah Arendt’s depiction of Adolf Eichmann, he described Andropov and his colleagues as “quiet men with white collars and cut fingernails and smooth-shaven cheeks who do not need to raise their voice,” committing outrage after outrage “in clean, carpeted, warmed, and well-lighted offices.” Having thus drawn an implicit parallel between the current Soviet leadership and the Nazis against which most of them had struggled in the bloodiest war in history, Reagan dropped some big news on the world two weeks later. At the end of a major televised address on the need for engaging in the largest peacetime military buildup in American history, he announced a new program that would soon come to be known as the Strategic Defense Initiative, or Star Wars: a network of satellites equipped with weaponry to “intercept and destroy strategic ballistic missiles before they reach our own territory or that of our allies.” While researching and building SDI, which would “take years, probably decades, of effort on many fronts” with “failures and setbacks just as there will be successes and breakthroughs” — the diction was oddly reminiscent of Kennedy’s Moon challenge — the United States would in the meantime be deploying a new fleet of Pershing II missiles to West Germany, capable of reaching Moscow in less than ten minutes whilst literally flying under the radar of all of the Soviet Union’s existing early-warning systems. To the Soviet leadership, it looked like the Cuban Missile Crisis in reverse, with Reagan in the role of Khrushchev.

Indeed, almost from the moment that Reagan had taken office, the United States had begun playing chicken with the Soviet Union, deliberately twisting the tail of the Russian bear via feints and probes in the border regions. “A squadron would fly straight at Soviet airspace and their radars would light up and units would go on alert. Then at the last minute the squadron would peel off and go home,” remembers former Undersecretary of State William Schneider. Even as Reagan was making his Star Wars speech, one of the largest of these deliberate provocations was in progress. Three aircraft-carrier battle groups along with a squadron of B-52 bombers all massed less than 500 miles from Siberia’s Kamchatka Peninsula, home of many vital Soviet military installations. If the objective was to make the Soviet leadership jittery — leaving aside for the moment the issue of whether making a country with millions of kilotons of thermonuclear weapons at its disposal jittery is really a good thing — it certainly succeeded. “Every Soviet official one met was running around like a chicken without a head — sometimes talking in conciliatory terms and sometimes talking in the most ghastly and dire terms of real hot war — of fighting war, of nuclear war,” recalls James Buchan, at the time a correspondent for the Financial Times, of his contemporaneous visit to Moscow. Many there interpreted the speeches and the other provocations as setting the stage for premeditated nuclear war.

And so over the course of the year the two superpowers blundered closer and closer to the brink of the unthinkable on the basis of an almost incomprehensible mutual misunderstanding of one another’s national characters and intentions. Reagan and his cronies still insisted on taking the Marxist rhetoric to which the Soviet Union paid lip service at face value when in reality any serious hopes for fomenting a worldwide revolution of the proletariat had ended with Khrushchev, if not with Stalin. As the French demographer Emmanuel Todd wrote in 1976, the Soviet Union’s version of Marxism had long since been transformed “into a collection of high-sounding but irrelevant rhetoric.” Even the Soviet Union’s 1979 invasion of Afghanistan, interpreted by not just the Reagan but also the Carter administration as a prelude to further territorial expansion into the Middle East, was actually a reactionary move founded, like so much the Soviet Union did during this late era of its history, on insecurity rather than expansionist bravado: the new Afghan prime minister, Hafizullah Amin, was making noises about abandoning his alliance with the Soviet Union in favor of one with the United States, raising the possibility of an American client state bordering on the Soviet Union’s soft underbelly. To imagine that this increasingly rickety artificial construct of a nation, which couldn’t even feed itself despite being in possession of vast tracts of some of the most arable land on the planet, was capable of taking over the world was bizarre indeed. Meanwhile, to imagine that the people around him would actually allow Reagan to launch an unprovoked first nuclear strike even if he was as unhinged as some in the Soviet leadership believed him to be is to fundamentally misunderstand America and Americans.

On September 1, 1983, this mutual paranoia took its toll in human lives.  Korean Air Lines Flight 007, on its way from New York City to Seoul, drifted hundreds of miles off-course due to the pilot’s apparent failure to change an autopilot setting. It flew over the very same Kamchatka Peninsula the United States had been so aggressively probing. Deciding enough was enough, the Soviet air-defense commander in charge scrambled fighters and made the tragic decision to shoot the plane down without ever confirming that it really was the American spy plane he suspected it to be. All 269 people aboard were killed. Soviet leadership then made the colossally awful decision to deny that they had shot down the plane; then to admit that, well, okay, maybe they had shot it down, but it had all been an American trick to make their country look bad. If Flight 007 had been an American plot, the Soviets could hardly have played better into the Americans’ hands. Reagan promptly pronounced the downing “an act of barbarism” and “a crime against nature,” and the rest of the world nodded along, thinking maybe there was some truth to this Evil Empire business after all. Throughout the fall dueling search parties haunted the ocean around the Kamchatka Peninsula, sometimes aggressively shadowing one another in ways that could easily lead to real shooting warfare. The Soviets found the black box first, then quickly squirreled it away and denied its existence; it clearly confirmed that Flight 007 was exactly the innocent if confused civilian airliner the rest of the world was saying it had been.

The superpowers came as close to the brink of war as they ever would — arguably closer than during the much more famed Cold War flash point of the Cuban Missile Crisis — that November. Despite a “frenzied” atmosphere of paranoia in Moscow, which some diplomats described as “pre-war,” the Reagan administration made the decision to go ahead with another provocation in the form of Able Archer 83, an elaborately realistic drill simulating the command-and-control process leading up to a real nuclear strike. The Soviets had long suspected that the West might attempt to launch a real attack under the cover of a drill. Now, watching Able Archer unfold, with many in the Soviet military claiming that it likely represented the all-out nuclear strike the world had been dreading for so long, the leaderless Politburo squabbled over what to do while a dying Andropov lay in hospital. Nuclear missiles were placed on hair-trigger alert in their silos; aircraft loaded with nuclear weapons stood fueled and ready on their tarmacs. One itchy trigger finger or overzealous politician over the course of the ten-day drill could have resulted in apocalypse. Somehow, it didn’t happen.

On November 20, nine days after the conclusion of Able Archer, the ABC television network aired a first-run movie called The Day After. Directed by Nicholas Meyer, fresh off the triumph of Star Trek II, it told the story of a nuclear attack on the American heartland of Kansas. If anything, it soft-pedaled the likely results of such an attack; as a disclaimer in the end credits noted, a real attack would likely be so devastating that there wouldn’t be enough people left alive and upright to make a story. Still, it was brutally uncompromising for a program that aired on national television during the family-friendly hours of prime time. Viewed by more than 100 million shocked and horrified people, The Day After became one of the landmark events in American television history and a landmark of social history in its own right. Many of the viewers, myself among them, were children. I can remember having nightmares about nuclear hellfire and radiation sickness for weeks afterward. The Day After seemed a fitting capstone to such a year of brinksmanship and belligerence. The horrors of nuclear war were no longer mere abstractions. They felt palpably real.

This, then, was the atmosphere in which Brian Moriarty first conceived of Trinity, a text adventure about the history of atomic weaponry and a poetic meditation on its consequences. Moriarty was working during 1983 for A.N.A.L.O.G. magazine, editing articles and writing reviews and programs for publication as type-in listings. Among these were two text adventures, Adventure in the Fifth Dimension and Crash Dive!, that did what they could within the limitations of their type-in format. Trinity, however, needed more, and so it went unrealized during Moriarty’s time at A.N.A.L.O.G. But it was still on his mind during the spring of 1984, when Konstantin Chernenko was settling in as Andropov’s replacement — one dying, idea-bereft old man replacing another, a metaphor for the state of the Soviet Union if ever there was one — and Moriarty was settling in as the newest addition to Infocom’s Micro Group. And it was still there six months later, when the United States and the Soviet Union were agreeing to resume arms-control talks the following year — Reagan had become more open to the possibility following his own viewing of The Day After, thus making Meyer’s film one of the few with a real claim to having directly influenced the course of history — and Moriarty was agreeing to do an entry-level Zorkian fantasy as his first work as an Imp.

Immediately upon completion of his charming Wishbringer in May of 1985, Moriarty was back to his old obsession, which looked at last to have a chance of coming to fruition. The basic structure of the game had long been decided: a time-jumping journey through a series of important events in atomic history that would begin with you escaping a near-future nuclear strike on London and end with you at the first test of an atomic bomb in the New Mexico desert on July 16, 1945 — the Trinity test. In a single feverish week he dashed off the opening vignette in London’s Kensington Gardens, a lovely if foreboding sequence filled with mythic signifiers of the harrowing journey that awaits you. He showed it first to Stu Galley, one of the least heralded of the Imps but one possessed of a quiet passion for interactive fiction’s potential and a wisdom about its production that made him a favorite source of advice among his peers. “If you can sustain this, you’ll have something,” said Galley in his usual understated way.

Thus encouraged, Moriarty could lobby in earnest for his ambitious, deeply serious atomic-age tragedy. Here he caught a lucky break: Wishbringer became one of Infocom’s last substantial hits. While no one would ever claim that the Imps were judged solely on the commercial performance of their games, it certainly couldn’t hurt to have written a hit when your next proposal came up for review. The huge success of The Hitchhiker’s Guide to the Galaxy, for instance, probably had a little something to do with Infocom’s decision to green-light Steve Meretzky’s puzzleless experiment A Mind Forever Voyaging. Similarly, this chance to develop the commercially questionable Trinity can be seen, at least partially, as a reward to Moriarty for providing Infocom with one of the few bright spots of a pretty gloomy 1985. They even allowed him to make it the second game (after A Mind Forever Voyaging) written for the new Interactive Fiction Plus virtual machine that allowed twice the content of the normal system at the expense of abandoning at least half the platforms for which Infocom’s games were usually sold. Moriarty would need every bit of the extra space to fulfill his ambitions.

The marker at the site of the Trinity test, as photographed by Moriarty on his 1985 visit.

He plunged enthusiastically into his research, amassing a bibliography some 40 items long that he would eventually publish, in a first and only for Infocom, in the game’s manual. He also reached out personally to a number of scientists and historians for guidance, most notably Ferenc Szasz of the University of New Mexico in Albuquerque, who had just written a book about the Trinity test. That July he took a trip to New Mexico to visit Szasz as well as Los Alamos National Laboratory and other sites associated with early atomic-weapons research, including the Trinity site itself on the fortieth anniversary of that fateful day. His experience of the Land of Enchantment affected him deeply, and in turn affected the game he was writing. In an article for Infocom’s newsletter, he described the weird Strangelovean enthusiasm he found for these dreadful gadgets at Los Alamos with an irony that echoes that of “The Illustrated Story of the Atom Bomb,” the gung-ho comic that would accompany the game itself.

“The Lab” is Los Alamos National Laboratory, announced by a sign that stretches like a CinemaScope logo along the fortified entrance. One of the nation’s leading centers of nuclear-weapons research. The birthplace of the atomic bomb.

The Bradbury Museum occupies a tiny corner in the acres of buildings, parking lots, and barbed-wire fences that comprise the Laboratory. Its collection includes scale models of the very latest in nuclear warheads and guided missiles. You can watch on a computer as animated neutrons blast heavy isotopes to smithereens. The walls are adorned with spectacular color photographs of fireballs and mushroom clouds, each respectfully mounted and individually titled, like great works of art.

I watched a teacher explain a neutron-bomb exhibit to a group of schoolchildren. The exhibit consists of a diagram with two circles. One circle represents the blast radius of a conventional nuclear weapon; a shaded ring in the middle shows the zone of lethal radiation. The other circle shows the relative effects of a neutron bomb. The teacher did her best to point out that the neutron bomb’s “blast” radius is smaller, but its “lethal” radius is proportionally much larger. The benefit of this innovation was not explained, but the kids listened politely.

Trinity had an unusually, if not inordinately, long development cycle for an Infocom game, stretching from Moriarty’s first foray into Kensington Gardens in May of 1985 to his placing of the finishing touches on the game almost exactly one year later; the released story file bears a compilation datestamp of May 8, 1986. During that time, thanks to the arrival of Mikhail Gorbachev and Perestroika and a less belligerent version of Ronald Reagan, the superpowers crept back a bit from the abyss into which they had stared in 1983. Trinity, however, never wavered from its grim conviction that it’s only a matter of time until these Pandorean toys of ours lead to the apocalyptic inevitable. Perhaps we’re fooling ourselves; perhaps it’s still just a matter of time before the wrong weapon in the wrong hands leads, accidentally or on purpose, to nuclear winter. If so, may our current blissful reprieve at least stretch as long as possible.

I’m not much interested in art as competition, but it does feel impossible to discuss Trinity without comparing it to Infocom’s other most obviously uncompromising attempt to create literary Art, A Mind Forever Voyaging. If pressed to name a single favorite from the company’s rich catalog, a majority of hardcore Infocom fans would, I suspect, name one of these two games. As many of you probably know already, I’m firmly in the Trinity camp myself. While A Mind Forever Voyaging is a noble experiment that positively oozes with Steve Meretzky’s big old warm-and-fuzzy heart, it’s also a bit mawkish and one-note in its writing and even its themes. It’s full of great ideas, mind you, but those ideas often aren’t explored in all that thoughtful a way, when they’re explored at all. And I must confess that the very puzzleless design that represents its most obvious innovation presents something of a pacing problem for me. Most of the game is just wandering around under-implemented city streets looking for something to record, an experience that leaves me oddly disconnected from both the story and the world. Mileages of course vary greatly here (otherwise everyone would be a Trinity person), but I really need a reason to get my hands dirty in a game.

One of the most noteworthy things about Trinity, by contrast, is that it is — whatever else it is — a beautifully crafted traditional text adventure, full of intricate puzzles to die for, exactly the sort of game for which Infocom is renowned and which they did better than anyone else. If A Mind Forever Voyaging is a fascinating might-have-been, a tangent down which Infocom would never venture again, Trinity feels like a culmination of everything that the 18 games not named A Mind Forever Voyaging preceding it had been building toward. Or, put another way, if A Mind Forever Voyaging represents the adventuring avant garde, a bold if problematic new direction, Trinity is a work of classicist art, a perfectly controlled, mature application of established techniques. There’s little real plot to Trinity; little character interaction; little at all, really, that Infocom hadn’t been doing, albeit in increasingly refined ways, since the days of Zork. If we want to get explicit with the comparisons, we might note that the desolate magical landscape where you spend much of the body of Trinity actually feels an awful lot like that of Zork III, while the vignettes you visit from that central hub parallel Hitchhiker’s design. I could go on, but suffice to say that there’s little obviously new here. Trinity’s peculiar genius is to be a marvelous old-school adventure game while also being beautiful, poetic, and even philosophically profound. It manages to embed its themes within its puzzles, implicating you directly in the ideas it explores rather than leaving you largely a passive, wandering observer as does A Mind Forever Voyaging.

To my thinking, then, Trinity represents the epitome of Infocom’s craft, achieved some nine years after a group of MIT hackers first saw Adventure and decided they could make something even better. There’s a faint odor of anticlimax that clings to just about every game that would follow it, worthy as most of those games would continue to be on their own terms (Infocom’s sense of craft would hardly allow them to be anything else). Some of the Imps, most notably Dave Lebling, have occasionally spoken of a certain artistic malaise that gripped Infocom in its final years, one that was separate from and perhaps more fundamental than all of the other problems with which they struggled. Where to go next? What more was there to really do in interactive fiction, given the many things, like believable characters and character interactions and parsers that really could understand just about anything you typed, that they still couldn’t begin to figure out how to do? Infocom was never, ever going to be able to top Trinity on its own traditionalist terms and really didn’t know how, given the technical, commercial, and maybe even psychological obstacles they faced, to rip up the mold and start all over again with something completely new. Trinity is the top of the mountain, from which they could only start down the other side if they couldn’t find a completely new one to climb. (If we don’t mind straining a metaphor to the breaking point, we might even say that A Mind Forever Voyaging represents a hastily abandoned base camp.)

Given that I think Trinity represents Infocom’s artistic peak (you fans of A Mind Forever Voyaging and other games are of course welcome to your own opinions), I want to put my feet up here for a while and spend the first part of this new year really digging into the history and ideas it evokes. We’re going to go on a little tour of atomic history with Trinity by our side, a series of approaches to one of the most important and tragic — in the classical sense of the term; I’ll go into what I mean by that in a future article — moments of the century just passed, that explosion in the New Mexico desert that changed everything forever. We’ll do so by examining the same historical aftershocks of that “fulcrum of history” (Moriarty’s words) as does Trinity itself, like the game probing deeper and moving back through time toward their locus.

I think of Trinity almost as an intertextual work. “Intertextuality,” like many fancy terms beloved by literary scholars, isn’t really all that hard a concept to understand. It simply refers to a work that requires that its reader have a knowledge of certain other works in order to gain a full appreciation of this one. While Moriarty is no Joyce or Pynchon, Trinity evokes huge swathes of history and lots of heady ideas in often abstract, poetic ways, using very few but very well-chosen words. The game can be enjoyed on its own, but it gains so very much resonance when we come to it knowing something about all of this history. Why else did Moriarty include that lengthy bibliography? In lieu of that 40-item reading list, maybe I can deliver some of the prose you need to fully appreciate Moriarty’s poetry. And anyway, I think this stuff is interesting as hell, which is a pretty good justification in its own right. I hope you’ll agree, and I hope you’ll enjoy the little detour we’re about to make before we continue on to other computer games of the 1980s.

(This and the next handful of articles will all draw from the same collection of sources, so I’ll just list them once here.

On the side of Trinity the game and Infocom, we have, first and foremost as always, Jason Scott’s Get Lamp materials. Also the spring 1986 issue of Infocom’s newsletter, untitled now thanks to legal threats from The New York Times; the September/October 1986 and November 1986 Computer Gaming World; the August 1986 Questbusters; and the August 1986 Computer and Video Games.

As far as atomic history, I find I’ve amassed a library almost as extensive as Trinity’s bibliography. Standing in its most prominent place we have Richard Rhodes’s magisterial “atomic trilogy” The Making of the Atomic Bomb, Dark Sun, and Arsenals of Folly. There’s also Command and Control by Eric Schlosser; The House at Otowi Bridge by Peggy Pond Church; The Nuclear Weapons Encyclopedia; Now It Can Be Told by Leslie Groves; Hiroshima by John Hersey; The Day the Sun Rose Twice by Ferenc Morton Szasz; Enola Gay by Gordon Thomas; and Prompt and Utter Destruction by J. Samuel Walker. I can highly recommend all of these books for anyone who wants to read further in these subjects.)



T Plus 5: Bombs in Space


Earth Orbit, on a satellite

The satellite you're riding is about twenty feet long, and shaped like a beer can.

Time passes.

A red flash draws your eyes to the ground below, where the contrail of a missile is climbing into the stratosphere.

Time passes.

The maneuvering thrusters on the satellite fire, turning the nose until it faces the ascending missile.

Time passes.

The satellite erupts in a savage glare that lights up the ground below. A beam of violet radiation flashes downward, obliterating the distant missile. Unfortunately, you have little time to admire this triumph of engineering before the satellite's blast incinerates you.

Trinity aims, in 256 K of text adventure, to chronicle at least fifty years of humanity’s relationship with the atomic bomb, encapsulated in seven vignettes. Two of these, the one dealing with the long-dreaded full-on nuclear war that begins with you on vacation in London’s Kensington Gardens and the one you see above involving a functioning version of Ronald Reagan’s “Star Wars” Strategic Defense Initiative (a proposition that all by itself justifies Trinity’s “Fantasy” genre tag, as we’ll soon see), are actually speculative rather than historical, taking place at some point in the near future. The satirical comic that accompanies the game also reserves space for Reagan and his dream. It’s a bold choice to put Reagan himself in there, undisguised by pseudonymous machinations like A Mind Forever Voyaging’s “Richard Ryder” — even a brave one for a company that was hardly in a position to alienate potential players. Trinity, you see, was released at the absolute zenith of Reagan’s popularity. While the comic and the game it accompanies hardly add up to a scathing, sustained indictment a la A Mind Forever Voyaging, they do cast him as yet one more Cold Warrior in a conservative blue suit leading the world further along the garden path to the unthinkable. Today I’d like to look at this “orbiting ‘umbrella’ of high technology” that Trinity postulates — correctly — isn’t really going to help us all that much when the missiles start to fly. Along the way we’ll get a chance to explore some of the underpinnings of the nuclear standoff and also the way it came to an anticlimactically sudden end, so thankfully at odds with Trinity’s more dramatic predictions of the supposed inevitable.

In November of 1985, while Trinity was in development, Ronald Reagan and the new Soviet General Secretary Mikhail Gorbachev met for the first American/Soviet summit of Reagan’s Presidency. The fact that the summit took place at all was almost entirely down to the efforts of Gorbachev, who quite skillfully made it politically impossible for Reagan not to attend. It marked the first time Reagan had actually talked with his Soviet counterpart face to face in his almost five years as President. The two men, as contemporary press reports would have it, “took the measure of each other” and largely liked what they saw, but came to no agreements. The second summit, held in Reykjavik, Iceland, in October of the following year, came to within a hair’s breadth of a major deal that would have started the superpowers down the road to the complete elimination of nuclear armaments and effectively marked the beginning of the end of the Cold War. The only stumbling block was the Strategic Defense Initiative. Gorbachev was adamant that Reagan give it up, or at least limit it to “laboratory testing”; Reagan just as adamantly refused. He repeatedly expressed to both Gorbachev and the press his bafflement at this alleged intransigence. SDI, he said, was to be a technology of defense, a technology for peace. His favorite metaphor was SDI as a nuclear “gas mask.” The major powers of the world had all banned poison gas by treaty after World War I, and, rather extraordinarily, even kept to that bargain through all the other horrors of World War II. Still, no one had thrown away their gas-mask stockpiles, and the knowledge that other countries still possessed them had just possibly helped to keep everyone honest. SDI, Reagan said, could serve the same purpose in the realm of nuclear weapons. He even made an extraordinary offer: the United States would be willing to give SDI to the Soviets “at cost” — whatever that meant — as soon as it was ready, as long as the Soviets would also be willing to share any fruits of their own (largely nonexistent) research. That way everyone could have nuclear gas masks! How could anyone who genuinely hoped and planned not to use nuclear weapons anyway possibly object to terms like that?

Gorbachev had a different view of the matter. He saw SDI as an inherently destabilizing force that would effectively jettison not one but two of the tacit agreements of the Cold War that had so far prevented a nuclear apocalypse. Would any responsible leader easily accept such an engine of chaos in return for a vague promise to “share” the technology? Would Reagan? It’s very difficult to know what was behind Reagan’s seeming naivete. Certainly his advisers knew that his folksy analogies hardly began to address Gorbachev’s very real and very reasonable concerns. If the shoe had been on the other foot, they would have had the same reaction. Secretary of Defense Caspar Weinberger had demonstrated that in December of 1983, when he had said, “I can’t imagine a more destabilizing factor for the world than if the Soviets should acquire a thoroughly reliable defense against these missiles before we did.” As for Reagan himself, who knows? Your opinion on the matter depends on how you take this famous but enigmatic man whom conservatives have always found as easy to deify as liberals to demonize. Was he a bold visionary who saved his country from itself, or a Machiavellian schemer who used a genial persona to institute an uglier, more heartless version of America? Or was he just a clueless if good-natured and very, very lucky bumbler? Or was he still the experienced actor, out there hitting his marks and selling the policies of his handlers like he had once shilled for General Electric? Regardless, let’s try to do more justice to Gorbachev’s concerns about SDI than Reagan did at their summits.

It’s kind of amazing that the Cold War never led to weapons in space. It certainly didn’t have to be that way. Histories today note what a shock it was to American pride and confidence when the Soviet Union became the first nation to successfully launch a satellite on October 4, 1957. That’s true enough, but a glance at the newspapers from the time also reveals less abstract fears. Now that the Soviets had satellites, people expected them to weaponize them, to use them to start dropping atomic bombs on their heads from space. One rumor, which amazingly turned out to have a basis in fact, claimed the Soviets planned to nuke the Moon, leading to speculation on what would happen if their missile were to miss the surface, boomerang around the Moon, and come back to Earth — talk about being hoisted by one’s own petard! The United States’s response to the Soviets’ satellite was par for the course during the Cold War: panicked, often ill-considered activity in the name of not falling behind. Initial responsibility for space was given to the military. The Navy and the Air Force, who often seemed to distrust one another more than either did the Soviets, promptly started squabbling over who owned this new seascape or skyscape, which depending on how you looked at it and how you picked your metaphors could reasonably be assumed to belong to either. While the Naval Research Laboratory struggled to get the United States’s first satellite into space, the more ambitious dreamers at the Air Force Special Weapons Center made their own secret plans to nuke the Moon as a show of force and mulled the construction of a manned secret spy base there.

But then, on July 29, 1958, President Eisenhower signed the bill that would transform the tiny National Advisory Committee for Aeronautics into the soon-to-be massive National Aeronautics and Space Administration — NASA. While NASA’s charter duly charged the new agency with making any “discoveries” available for “national defense” and with “the preservation of the role of the United States as a leader in aeronautical and space science and technology,” those goals came only after more high-toned abstractions like “the expansion of human knowledge” and the use of space for “peaceful and scientific purposes.” NASA was something of an early propaganda coup at a time when very little seemed to be going right with astronautics in the United States. The Soviet leadership had little choice but to accept the idea, publicly at least, of space exploration as a fundamentally peaceful endeavor. In 1967 the United States and the Soviet Union became signatories along with many other nations to the Outer Space Treaty that enshrined the peaceful status quo into international law. By way of compensation, the first operational ICBMs had started to come online by the end of the 1950s, giving both superpowers a way of dealing impersonal death from the stratosphere without having to rely on wonky satellites.

This is not to say that the Cold War never made it into space in any form. Far from it. Apollo, that grandest adventure of the twentieth century, would never have happened without the impetus of geopolitics. The Apollo 11 astronauts may have left a message on the Moon saying they had “come in peace for all mankind,” may have even believed it at some level, but that was hardly the whole story. President Kennedy, the architect of it all, had no illusions about the real purpose of his Moon pronouncement. “Everything that we do ought to be really tied into getting onto the Moon ahead of the Russians,” he told NASA Administrator James Webb in 1962. “Otherwise we shouldn’t be spending this kind of money because I’m not that interested in space.” The Moon Race, like war, was diplomacy through other means. As such, the division between military and civilian was not always all that clear. For instance, the first Americans to fly into orbit, like the first Soviets, did so mounted atop repurposed ICBMs.

Indeed, neither the American nor the Soviet military had any interest in leaving space entirely to the civilians. If one of the goals of NASA’s formation had been to eliminate duplications of effort, it didn’t entirely succeed. The Air Force in particular proved very reluctant to give up on their own manned space efforts, pursuing during the 1960s the X-15 rocket plane that Neil Armstrong among others flew to the edge of space, the cancelled Dyna-Soar space plane, and even a manned space station that also never got off the drawing board. Planners in both the United States and the Soviet Union seemed to treat the 1967 Outer Space Treaty as almost a temporary accord, waiting for the other shoe to drop and for the militarization of space to begin in earnest. I’ve already described in an earlier article how, once the Moon Race was over, NASA was forced to make an unholy alliance with the Air Force to build the space shuttle, whose very flight profile was designed to allow it to avoid space-based weaponry that didn’t yet actually exist.

Yet the most immediate and far-reaching military application of space proved to be reconnaissance satellites. Well before the 1960s were out these orbiting spies had become vital parts of the intelligence apparatus of both the United States and the Soviet Union, as well as vital tools for the detection of ICBM launches by the other side — yet another component of the ever-evolving balance of terror. Still, restrained by treaty, habit, and concern over what it might make the other guys do, neither of the superpowers ever progressed to the logical step of trying to shoot down those satellites that were spying on their countries. If you had told people in 1957 that there would still be effectively no weapons in space almost thirty years later, that there would never have been anything even remotely resembling a battle in space, I think they would have been quite surprised.

But now SDI had come along and, at least in the Soviets’ view, threatened to undermine that tradition. They needed only take at face value early reports of SDI’s potential implementations, which were all over the American popular media by the time of Reagan’s 1984 reelection campaign, to have ample grounds for concern. One early plan, proposed in apparent earnest by a committee who may have seen The Battle of Britain (or Star Wars) a few too many times, would have had the United States and its allies protected by squadrons of orbiting manned fighter planes that would rocket to the rescue to shoot down encroaching ICBMs, their daring pilots presumably wearing dashing scarves and using phrases like “Tally ho!” A more grounded plan, relatively speaking, was the one for hundreds of “orbiting battle stations” equipped with particle-beam weapons or missiles of their own — hey, whatever works — to pick off the ICBMs. Of course, as soon as these gadgets came into being the Soviets would have to develop gadgets of their own to try to take them out. Thus a precious accord would be shattered forever. To the Soviets, SDI felt like a betrayal, a breaking of a sacred trust that had so far kept people and satellites in space from having to shoot at each other and in doing so had just possibly prevented the development of a new generation of horrific weaponry.

And yet this was if anything the more modest of the two outrages they saw being inflicted on the world by SDI. The biggest problem was that it could be both a symptom and a cause of the ending of the old MAD doctrine — Mutually Assured Destruction — that had been the guiding principle of both superpowers for over twenty years and that had prevented them from blowing one another up along with the rest of the world. On its surface, the MAD formulation is simplicity itself. I have enough nuclear weapons to destroy your country — or at least to do unacceptable damage to it — and a window of time to launch them at you between the time I realize that you’ve launched yours at me and the time that yours actually hit me. Further, neither of us has the ability to stop the missiles of the other — at least, not enough of them. Therefore we’d best find some way to get along and not shoot missiles at each other. One comparison, so favored by Reagan that he drove Gorbachev crazy by using it over and over again at each of their summits, is that of two movie mobsters with cocked and loaded pistols pointed at each other’s heads.

That well-intentioned comparison is also a rather facile one; the difference between two gangsters and two superpowers is a matter of degree, and here degree is everything. Many of us had MAD, that most fundamental doctrine of the Cold War, so deeply ingrained in us as schoolchildren that it might be hard for us to really think about its horribleness anymore. Nevertheless, I’d like for us to try to do so now. Let’s think in particular about its basic psychological prerequisites. In order for the threat of nuclear annihilation to be an effective deterrent, in order for it never to be carried out, it must paradoxically be a real threat, one which absolutely, unquestionably would be carried out if the order was given. If the other side was ever to suspect that we were not willing to destroy them, the deterrent would evaporate. So, we must create an entire military superstructure, a veritable subculture, of many thousands of people all willing to unquestioningly annihilate tens or hundreds of millions of people. Indeed, said annihilation is the entire purpose of their professional existence. They sit in their missile silos or in their ready rooms or cruise the world in their submarines waiting for the order to push that button or turn that key that will quite literally end existence as they know it, insulated from the incalculable suffering that action will cause inside the very same sorts of “clean, carpeted, warmed, and well-lighted offices” that Reagan once described as the domain of the Soviet Union’s totalitarian leadership alone. If the rise of this sort of antiseptic killing is the tragedy of the twentieth century, the doctrine of MAD represents it taken to its well-nigh incomprehensible extreme.

MAD, requiring as it did people to be always ready and able to carry out genocide so that they would not have to carry out genocide, struck a perilous psychological balance. Things had the potential to go sideways when one of these actors in what most people hoped would be Waiting for Godot started to get a little bit too ready and able — in short, when someone started to believe that he could win. See for example General Curtis LeMay, head of the Strategic Air Command from 1948 until 1957 and the inspiration for Dr. Strangelove’s unhinged General Jack D. Ripper. LeMay believed to his dying day that the United States had “lost” the Cuban Missile Crisis because President Kennedy had squandered his chance to finally just attack the Soviet Union and be done with it; talked of the killing of 100 million human beings as a worthwhile trade-off for the decapitation of the Soviet leadership; openly campaigned for and sought ways to covertly acquire the metaphorical keys to the nuclear arsenal, to be used solely at his own dubious discretion. “If I see that the Russians are amassing their planes for an attack, I’m going to knock the shit out of them before they take off the ground,” he once told a civil-defense committee. When told that such an action would represent insubordination to the point of treason, he replied, “I don’t care. It’s my policy. That’s what I’m going to do.” Tellingly, Dr. Strangelove itself was originally envisioned as a realistic thriller. The film descended into black comedy only when Stanley Kubrick started his research and discovered that so much of the reality was, well, blackly comic. Much in Dr. Strangelove that moviegoers of 1964 took as satire was in fact plain truth.

If the belief by a single individual that a nuclear war can be won is dangerous, an institutionalized version of that belief might just be the most dangerous thing in the world. And here we get to the heart of the Soviets’ almost visceral aversion to SDI, for it seemed to them and many others a product of just such a belief.

During the mid-1970s, when détente was still the watchword of the day, a group of Washington old-timers and newly arrived whiz kids formed something with the Orwellian name of The Committee on the Present Danger. Its leading light was one Paul Nitze. Though his is a name few Americans then or now are likely to recognize, Nitze had been a Washington insider since the 1940s and would remain a leading voice in Cold War policy for literally the entire duration of the Cold War. He and his colleagues, many of them part of a new generation of so-called “neoconservative” ideologues, claimed that détente was a sham, that “the Soviets do not agree with the Americans that nuclear war is unthinkable and unwinnable and that the only objective of strategic doctrine must be mutual deterrence.” On the contrary, they were preparing for “war-fighting, war-surviving, and war-winning.” Their means for accomplishing the latter two objectives would be an elaborate civil-defense program that was supposedly so effective as to reduce their casualties in an all-out nuclear exchange to about 10 percent of what the United States could expect. The Committee offered little or no proof for these assertions and many others like them. Many simply assumed that the well-connected Nitze must have access to secret intelligence sources which he couldn’t name. If so, they were secret indeed. When the CIA, alarmed by claims of Soviet preparedness in the Committee’s reports that were completely new to them, instituted a two-year investigation to get to the bottom of it all, they couldn’t find any evidence whatsoever of any unusual civil-defense programs, much less any secret plans to start and win a nuclear war. It appears that Nitze and his colleagues exaggerated wildly and, when even that wouldn’t serve their ends, just made stuff up. (This pattern of “fixing the intelligence” would remain with Committee veterans for decades, leading most notably to the Iraq invasion of 2003.)

Throughout the Carter administration the Committee lobbied anyone who would listen, using the same sort of paranoid circular logic that had led to the nuclear-arms race in the first place. The Soviets, they said, have secretly abandoned the MAD strategy and embarked on a nuclear-war-winning strategy in its place. Therefore we must do likewise. There could be no American counterpart to the magical Soviet civil-defense measures that could somehow protect 90 percent of their population from the blasts of nuclear weapons and the long years of radioactive fall-out that would follow. This was because civil defense was “unattractive” to an “open society” (“unattractiveness” being a strangely weak justification for not doing something in the face of what the Committee claimed was an immediate existential threat, but so be it). One thing the United States could and must do in response was to engage in a huge nuclear- and conventional-arms buildup. That way it could be sure to hammer the Soviets inside their impregnable tunnels — or wherever it was they would all be going — just as hard as possible. But in addition, the United States must come up with a defense of its own.

Although Carter engaged in a major military buildup in his own right, his was nowhere near big enough in the Committee’s eyes. But then came the 1980 election of Ronald Reagan. Reagan took all of the Committee’s positions to heart and, indeed, took most of its most prominent members into his administration. Their new approach to geopolitical strategy was immediately apparent, and immediately destabilizing. Their endless military feints and probes and aggressive rhetoric seemed almost to have the intention of starting a war with the Soviet Union, a war they seemed to welcome whilst being bizarrely dismissive of its potentially world-ending consequences. Their comments read like extracts from Trinity‘s satirically gung-ho accompanying comic. “Nuclear war is a destructive thing, but it is still in large part a physics problem,” said one official. “If there are enough shovels to go around, everybody’s going to make it. It’s the dirt that does it,” said another. Asked if he thought that a nuclear war was “winnable,” Caspar Weinberger replied, “We certainly are not planning to be defeated.” And then, in March of that fraught year of 1983 when the administration almost got the nuclear war it seemed to be courting, came Reagan’s SDI speech.

The most important thing to understand about SDI is that it was always a fantasy, a chimera chased by politicians and strategists who dearly wished it were possible. The only actual scientist amongst those who lobbied for it was Edward Teller, well known to the public as the father of the hydrogen bomb. One of the few participants in the Manhattan Project, which had built the first atomic bomb more than 35 years before, who was still active in public life when Reagan took office, Teller was a brilliant scientist when he wanted to be, but one whose findings and predictions were often tainted by his strident anti-communism and a passion for nuclear weapons that could leave him sounding as unhinged as General LeMay. Teller seldom saw a problem that couldn’t be solved just by throwing a hydrogen bomb or two at it. His response to Carter’s decision to return the Panama Canal to Panama, for instance, was to recommend quickly digging a new one across some more cooperative Central American country using hydrogen bombs. Now, alone amongst his scientific peers, Teller told the Reagan administration that SDI was possible. He claimed that he could create X-ray beams in space by, naturally, detonating hydrogen bombs just so. These could be aimed at enemy missiles, zapping them out of the sky. The whole system could be researched, built, and put into service within five years. As evidence, he offered some inconclusive preliminary results derived from experimental underground explosions. It was all completely ridiculous; we still don’t know how to create such X-ray beams today, decades on. But it was also exactly the sort of superficially credible scientific endorsement — and from the father of the hydrogen bomb, no less! — that the Reagan administration needed.

Reagan coasted to reelection in 1984 in a campaign that felt more like a victory lap, buoyed by “Morning Again in America,” an energetic economy, and a military buildup that had SDI as one of its key components. The administration lobbied Congress to give the SDI project twice the inflation-adjusted funding that the Manhattan Project had received at the height of World War II. With no obviously promising paths at all to follow, SDI opted for the spaghetti approach, throwing lots and lots of stuff at the wall in the hope that something would stick. Thus it devolved into a whole lot of individual fiefdoms with little accountability and less coordination with one another. Dr. Ashton Carter of Harvard, a former Defense Department analyst with full security clearance tasked with preparing a study of SDI for the Congressional Budget Office, concluded that the prospect for any sort of success was “so remote that it should not serve as the basis of public expectations of national policy.” Most of the press, seduced by Reagan’s own euphoria, paid little heed to such voices, instead publishing articles talking about the relative merits of laser and kinetic-energy weapons, battle stations in space, and whether the whole system should be controlled by humans or turned over to a supercomputer mastermind. With every notion as silly and improbable as every other and no direction in the form of a coherent plan from the SDI project itself, everyone could be an expert, everyone could build their own little SDI castle above the stratosphere. When journalists did raise objections, Reagan replied with more folksy homilies about how everyone thought Edison was crazy until he invented the light bulb, appealing to the good old American ingenuity that had got us to the Moon and could make anything possible. The administration’s messaging was framed so as to make objecting to SDI unpatriotic, downright un-American.

And yet even if you thought that American ingenuity would indeed save the day in the end, SDI had a more fundamental problem that made it philosophically as well as scientifically unsound. This most basic objection, cogently outlined at the time by the great astronomer, science popularizer, and space advocate Carl Sagan, one of SDI’s most prominent critics, was a fairly simple one. Even the most fanciful predictions for SDI must have a capacity ceiling, a limit beyond which the system simply couldn’t shoot down any more missiles. And it would always be vastly cheaper to build a few dozen more missiles than it would be to build and launch and monitor another battle station (or whatever) to deal with them. Not only would SDI not bring an end to nuclear weapons, it was likely to actually accelerate the nuclear-arms race, as the Soviet Union would now feel the need to not only be able to destroy the United States ten times over but be able to destroy the United States ten times over while also comprehensively overwhelming any SDI system in place. Reagan’s public characterization of SDI as a “nuclear umbrella” under which the American public might live safe and secure had no basis in reality. Even if SDI could somehow be made 99 percent effective, a figure that would make it more successful than any other defense in the history of warfare, the 1 percent of the Soviet Union’s immense arsenal that got through would still be enough to devastate many of the country’s cities and kill tens or hundreds of millions. There may have been an argument to make for SDI research aimed at developing, likely decades in the future, a system that could intercept and destroy a few rogue missiles. As a means of protection against a full-on strategic strike, though… forget it. It wasn’t going to happen. Ever. As President Nixon once said, “With 10,000 of these damn things, there is no defense.”
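Sagan’s capacity-ceiling objection is, at bottom, simple arithmetic. As a back-of-envelope sketch (the arsenal sizes and interception rate below are illustrative round numbers drawn from the discussion above, not official estimates):

```python
# Back-of-envelope sketch of the "capacity ceiling" objection to SDI.
# All figures are illustrative assumptions, not historical estimates.

def warheads_leaking(arsenal_size: int, interception_rate: float) -> int:
    """Number of warheads that get through a defense of the given effectiveness."""
    return round(arsenal_size * (1.0 - interception_rate))

# Even a fantastically effective 99-percent shield leaks about 100 warheads
# from a notional 10,000-warhead arsenal.
print(warheads_leaking(10_000, 0.99))  # 100

# And the offense can always restore its leakage more cheaply than the defense
# can raise its ceiling: doubling the arsenal doubles the warheads that arrive.
print(warheads_leaking(20_000, 0.99))  # 200
```

Since a single thermonuclear warhead can destroy a city, the 100 or so that leak through even this implausibly perfect defense are the entire argument.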

As with his seeming confusion about Gorbachev’s objections to SDI at their summits, it’s hard to say to what degree Reagan grasped this reality. Was he living a fantasy like so many others in the press and public when he talked of SDI rendering ICBMs “impotent and obsolete”? Whatever the answer to that question, it seems pretty clear that others inside the administration knew perfectly well that SDI couldn’t possibly protect the civilian population as a whole to any adequate degree. SDI was in reality a shell game, not an attempt to do an end-run around the doctrine of mutually assured destruction but an attempt to make sure that mutually assured destruction stayed mutually assured when it came to the United States’s side of the equation. Cold War planners had fretted for decades about a nightmare scenario in which the Soviet Union launched a first strike and the United States, due to sabotage, Soviet stealth technology, or some failure of command and control, failed to detect and respond to it in time by launching its own missiles before they were destroyed in their silos by those of the Soviets. SDI’s immediate strategic purpose was to close this supposed “window of vulnerability.” The system would be given, not the impossible task of protecting the vast nation as a whole, but the merely hugely improbable one of protecting those few areas where the missile silos were concentrated. Asked point-blank under oath whether SDI was meant to protect American populations or American missile silos, Pentagon chief of research and engineering Richard DeLauer gave a telling non-answer: “What we are trying to do is enhance deterrence. If you enhance deterrence and your deterrence is credible and holds, the people are protected.” This is of course just a reiteration of the MAD policy itself, not a separate justification for SDI. MAD just kept getting madder.

The essential absurdity of American plans for SDI seems to have struck Gorbachev by the beginning of 1987. Soviet intelligence had been scrambling for a few years by then, convinced that there had to be some important technological breakthrough behind all of the smoke the Reagan administration was throwing. It seems that at about this point they may have concluded that, no, the whole thing really was as ridiculous as it seemed. At any rate, Gorbachev decided it wasn’t worth perpetuating the Cold War over. He backed away from his demands, offering the United States the opportunity to continue working on SDI if it liked, demanding only a commitment to inform the Soviet Union and officially back out of some relevant treaties (which might very possibly have to include the 1967 Outer Space Treaty that forbade nuclear explosions in space) if it decided to actually implement it. Coupled with Gorbachev’s soaring global popularity, it was enough to start getting deals done. Reagan and Gorbachev signed their first substantial agreement, to eliminate between them 2692 missiles, in December of 1987. More would follow, accompanied by shocking liberalization and reform behind the erstwhile Iron Curtain, culminating in the night of November 9, 1989, when the Berlin Wall, long the tangible symbol of division between East and West, came down. Just like that, almost incomprehensible in its suddenness, the Cold War was over. Trinity stands today as a cogent commentary on that strange shadow conflict, but it proved blessedly less than prescient about the way it would end. Whatever else is still to come, there will be no nuclear war between the United States of America and the Union of Soviet Socialist Republics.

If the end of the Cold War was shockingly unexpected, SDI played out exactly as you might expect. The program was renamed to the more modest Ballistic Missile Defense Organization and scaled back dramatically in 1993, by which time it had cost half again as much as the Manhattan Project — a staggering $30 billion, enough to make it the most expensive research program in history — and accomplished little. The old idea still resurfaces from time to time, but the fervor it once generated is all but forgotten now. SDI, like most of history, is now essentially a footnote.

A more inspiring closing subject is Mikhail Gorbachev. His Nobel Peace Prize notwithstanding, he strikes me as someone who hasn’t quite gotten his due yet from history. There are many reasons that the Cold War came to an end when it did. Prominent among them was the increasingly untenable Soviet economy, battered during the decade by “the Soviet Union’s Vietnam” (Gorbachev’s phrase) in Afghanistan, a global downturn in oil prices, and the sheer creaking inertia of many years of, as the old Soviet saying went, workers pretending to work while the state pretended to pay them for it. Nevertheless, I don’t agree with Marx that history is a compendium of economic forces. Many individuals across Eastern Europe stepped forward to end their countries’ totalitarian regimes — usually peacefully, sometimes violently, occasionally at the cost of their lives. But Gorbachev’s shadow overlays all the rest. Undaunted by the most bellicose Presidential rhetoric in two decades, he used politics, psychology, and logic to convince Reagan to sit down with him and talk, then worked with him to shape a better, safer world. While Reagan talked about ending MAD through his chimerical Star Wars, Gorbachev actually did it, by abandoning his predecessors’ traditional intransigence, rolling up his sleeves, and finding a way to make it work. Later, this was the man who didn’t choose to send in the tanks when the Warsaw Pact started to slip away, making him, as Victor Sebestyen put it, one of very few leaders in the history of the world to elect not to use force to maintain an empire. Finally, and although it certainly was never his intention, he brought the Soviet Union in for a soft landing, keeping the chaos to a minimum and keeping the missiles from flying. Who would have imagined Gorbachev was capable of such vision, such — and I don’t use this word lightly — heroism? Who would have imagined he could weave his way around the hardliners at home and abroad to accomplish what he did?
Prior to assuming office in 1985, he was just a smart, capable Party man who knew who buttered his bread, who, as he later admitted, “licked Brezhnev’s ass” alongside his colleagues. And then when he got to the top he looked around, accepted that the system just wasn’t working, and decided to change it. Gorbachev reminds us that the hero is often not the one who picks up a gun but the one who chooses not to.

(In addition to the sources listed in the previous article, Way Out There in the Blue by Frances FitzGerald is the best history I’ve found of SDI and its politics.)



T Plus 4: Bombing Nevada



You're in a narrow underground chamber, illuminated by an open door in the east wall. The walls and ceiling are gouged with deep spiral ruts; they look as if they've been routed out with heavy machinery.

A large cylinder occupies most of the chamber. The maze of cables and pipes surrounding it trails west, into the depths of a tunnel.


The cables and pipes lining the tunnel's walls look like bloated veins and arteries in the splinter's flickering glow. Deep tunnels bend off to the east and west.

Some careless technician has left a walkie-talkie lying in the dirt.

>get walkie-talkie
Taken.

>turn on walkie-talkie
You turn on the rocker switch.

Time passes.

A tinny voice, half-buried in static, says "Two."

Time passes.


Time passes.

The walkie-talkie clicks and hisses.

Time passes.

For a brief moment, the tunnel is bathed in a raw white glare.

The most subtly chilling vista in Trinity is found not inside one of its real-world atomic vignettes, but rather in the magical land that serves as the central hub for your explorations. This landscape is dotted with incongruous giant toadstools, each of which, you eventually realize, represents a single atomic explosion.

As your eyes sweep the landscape, you notice more of the giant toadstools. There must be hundreds of them. Some sprout in clusters, others grow in solitude among the trees. Their numbers increase dramatically as your gaze moves westward, until the forest is choked with pale domes.

The scene is a representation of time, following the path of the sun from east to west. The toadstools choking the forest to the west presumably represent the nuclear apocalypse you’ve just escaped. If we subtract those toadstools along with the two somewhere far off to the east that must represent the Hiroshima and Nagasaki blasts, we’re left with those that represent not instances of atomic bombs used in anger, but rather tests. A few of these we know well as historical landmarks in their own right: the first hydrogen bomb; the first Soviet bomb; that original Trinity blast, off far to the southeast with the rising sun, from which the game takes its name and where its climax will play out. Like the bombs used in anger, these don’t interest us today; we’ll give them their due in future articles. What I do want to talk about today is some of the blasts we don’t usually hear so much about. As the landscape would indicate, there have been lots of them. Since the nuclear era began one summer morning in the New Mexico desert in 1945, there has been a verified total of 2119 tests of nuclear bombs. Almost half of that number is attributed to the United States alone. Yes, there have been a lot of bombs.

At the close of World War II, the big question for planners and politicians in the United States was that of who should be given control of the nation’s burgeoning nuclear arsenal. The Manhattan Project had been conducted under the ostensible auspices of the Army Air Force (the Air Force wouldn’t become its own independent service branch until 1947), but in reality had been something of a law unto itself. Now both Army and Navy were eager to lay claim to the bomb. The latter had dismissed the bomb’s prospects during the war years and declined to play any role in the Manhattan Project, but was nevertheless able to wrangle enough control now to be given responsibility for the first post-war tests of the gadgets, to be called Operation Crossroads. The tests’ announced objective was to determine the impact of the atomic bomb on military ships. Accordingly, the Navy assembled for atomic target practice around Bikini Atoll in the Marshall Islands a fleet made up of surplus American ships and captured German and Japanese vessels that would have been the envy of most other nations. Its 93 vessels included in their ranks 2 aircraft carriers, 5 battleships, and 4 cruisers. The 167 native residents of Bikini were shipped off to another, much less hospitable island, first stop in what would prove to be a long odyssey of misery. (Their sad story is best told in Operation Crossroads by Jonathan M. Weisgall.)

From the start, Operation Crossroads had more to do with politics than with engineering or scientific considerations. It was widely hyped as a “test” to see if the very idea of a fighting navy still had any relevance in this new atomic age. More importantly in the minds of its political planners, it would also be a forceful demonstration to the Soviet Union of just what this awesome new American weapon could do. Operation Crossroads was the hottest ticket in town during the summer of 1946. Politicians, bureaucrats, and journalists — everyone who could finagle an invitation — flocked to Bikini to enjoy the spectacle along with good wine and food aboard one of the Navy’s well-appointed host vessels, swelling the number of on-site personnel to as high as 40,000.

Unprotected sailors aboard the German cruiser Prinz Eugen just hours after it was irradiated by an atomic bomb.

The spectators would get somewhat less than they bargained for, many of the sailors considerably more. The first bomb was dropped from a borrowed Army Air Force B-29 because the Navy had no aircraft capable of carrying the gadget. Dropped on a hazy, humid morning from a high altitude at which the B-29 was notoriously inaccurate even under the best conditions, the bomb missed the center of the doomed fleet by some 700 yards. Only two uninteresting attack transports sank instantly in anything like the expected spectacular fashion, and only five ships sank in total, the largest of them a cruiser. As the journalists filed their reams of disappointed copy and the Navy’s leadership breathed a sigh of relief, some 5000 often shirtless sailors were dispatched to board the various vessels inside the hot zone to analyze their damage; as a safety precaution, they first scrubbed them down using water, soap, and lye to get rid of any lingering radiation. The operation then proceeded with the second bomb, an underwater blast that proved somewhat more satisfying, ripping apart the big battleship Arkansas and the aircraft carrier Saratoga amongst other vessels and tossing their pieces high into the air.

The second Operation Crossroads shot, July 25, 1946.

Operation Crossroads was emblematic of a Navy leadership that had yet to get their collective heads around just what a paradigm-annihilating device the atomic bomb actually was. Their insistence on dropping it on warships, as if the future was just going to bring more Battles of Midway with somewhat bigger explosions, shows that they still thought of the atomic bomb as essentially just a more powerful version of the bombs they were used to, a fundamentally tactical rather than strategic device. Their complete failure to take seriously the dangers of radioactive fallout, meanwhile, may be the reason that the sailors who took part in Operation Crossroads suffered an average life-span reduction of three months compared to others in their peer group. These were early days yet in atomic physics, but their state of denial is nevertheless difficult to understand. If the horrific photographs and films out of Hiroshima and Nagasaki — some of which apparently are shocking enough to still be classified — hadn’t been warning enough, there was always the case of Los Alamos physicist Louis Slotin. Less than six weeks before Operation Crossroads began, Slotin had accidentally started a chain reaction while experimenting with the atomic core of the same type of bomb used in the tests. He stopped the reaction through quick thinking and bravery, but not before absorbing a lethal dose of radiation. His slow, agonizing death — the second such to be experienced by a Los Alamos physicist — was meticulously filmed and documented, then made available to everyone working with atomic weapons. 
And yet the Navy chortled about the failure of the atomic bomb to do as much damage as expected whilst cheerfully sending in the boys to do some cleanup, ignoring both the slowly dying goats and other animals they had left aboard the various ships and the assessment of the Bulletin of Atomic Scientists of the likely fate of any individual ship in the target fleet: “The crew would be killed by the deadly burst of radiation from the bomb, and only a ghost ship would remain, floating unattended in the vast waters of the ocean.”

Just as President Eisenhower would take space exploration out from under the thumb of the military a decade later with the creation of NASA, President Truman did an end-run around the military’s conventional thinking about the atomic bomb on January 1, 1947, when the new, ostensibly civilian Atomic Energy Commission took over all responsibility for the development, testing, and deployment of the nation’s atomic stockpile. The Atomic Energy Commission would continue to conduct a steady trickle of tests in the remoter reaches of the Pacific for many years to come, albeit none with quite the bizarre spectator-sport qualities of Operation Crossroads. But the twin shocks of the first Soviet test of an atomic bomb on August 29, 1949, and the beginning of the Korean War in 1950, which came equipped with a raging debate about whether, how, and when the United States should again use its nuclear arsenal in anger, led weapons developers to agitate for a more local test site where they could regularly and easily set off smaller weapons than the blockbusters that tended to get earmarked for the Pacific. There were, they argued, plenty of open spaces in the American Southwest that would suit such a purpose perfectly well. On December 18, 1950, Truman therefore approved the allocation for this purpose of a 680-square-mile area inside the vast Nellis Air Force Gunnery and Bombing Range in the Nevada desert some 65 miles northwest of Las Vegas. The first test there, marking the first atomic bomb to be exploded on American soil since the original Trinity device, took place astonishingly soon thereafter, on January 27, 1951. By the end of the year sleeping quarters, mess halls, and laboratories had been built, creating a functioning, happy little community dedicated to making ever better bombs. The saga of the Nevada Test Site had begun. In the end no fewer than 928 of the 1032 nuclear tests ever conducted by the United States would be conducted right here.

One of the many test shots seen from the Las Vegas Strip during the 1950s.

The strangest years of this very strange enterprise were the earliest. With money plentiful and the need to keep ahead of the Soviets perceived as urgent, bombs were exploded at quite a clip — twelve during the first year alone. At first they were mostly dropped from airplanes, later more commonly hung from balloons or mounted atop tall temporary towers. The testing regime was, as test-site geophysicist Wendell Weart would later put it, very “free-form.” If someone at one of the nation’s dueling atomic-weapons laboratories of Lawrence Livermore and Los Alamos determined that he needed a “shot” to prove a point or answer a question, he generally got it in pretty short order. Whatever else the testing accomplished, it was also a hell of a lot of fun. “I guess little boys like fireworks and firecrackers,” Weart admits, “and this was the biggest set of fireworks you could ever hope to see.” Las Vegas residents grew accustomed to the surreal sight of mushroom clouds blooming over their cityscape, like scenes from one of the B-grade atomic-themed monster movies that filled the theaters of the era. When the bombs went off at night, they sometimes made enough light to read a newspaper by.

This era of free-form atmospheric testing at the Nevada Test Site coincided with the era of atomic mania in the United States at large, when nuclear energy of all stripes was considered the key to the future and the square-jawed scientists and engineers who worked on it veritable heroes. The most enduring marker of this era today is also one of the first. In 1946, not one but two French designers introduced risqué new women’s bathing suits that were smaller and more revealing than anything that had come before. Jacques Heim called his the “atome,” or atom, “the world’s smallest bathing suit.” Louis Réard named his the bikini after the recently concluded Operation Crossroads tests at Bikini Atoll. “Like the bomb,” he declared, “the bikini is small and devastating.” It was Réard’s chosen name that stuck. In addition to explosive swimwear, by the mid-1950s you could get a “Lone Ranger atomic-bomb ring” by sending in 15 cents plus a Kix cereal proof of purchase; buy a pair of atomic-bomb salt and pepper shakers; buy an “Atomic Disintegrator” cap gun. Trinity‘s accompanying comic book with its breathless “Atomic Facts: Stranger than Fiction!” and its hyperactive patriotism is a dead ringer for those times.

Showgirl Lee Merlin, Miss Atomic Bomb 1958.

Said times being what they were, Las Vegas denizens, far from being disturbed by the bombs going off so close by, embraced them with all of their usual kitschy enthusiasm. The test site helpfully provided an annual calendar of scheduled tests for civilians so they could make plans to come out and enjoy the shows. For children, it was a special treat to drive up to one of the best viewpoints on Mount Charleston early in the morning on the day of a shot, like an even better version of the Fourth of July; the budding connoisseurs cataloged and ranked the shots and compared notes with their friends in the schoolyard. Many adults, being connoisseurs of another stripe, preferred the “Miss Atomic Bomb” beauty pageants and revues that were all the rage along the Strip.

Showgirl Sally McCloskey does an “atomic ballet” in front of a shot.

The official government stance, at the time and to a large extent even today, is that the radioactive fallout from these explosions traveled little distance if at all and was in any case minor enough to present few to no health or environmental concerns. Nevertheless, ranchers whose sheep grazed in the vicinity of the test site saw their flocks begin to sicken and die very soon after the test shots began. They mounted a lawsuit, which was denied under somewhat questionable circumstances in 1956; the sheep, claimed the court, had died of “malnutrition” or some other unidentified sickness. That judgment, almost all of the transcripts from which have since been lost, was later overturned on the rather astonishing basis of outright “fraud on the court” by the government’s defense team. That judgment was in its turn vacated on appeal in 1985, more than thirty years after the events in question. Virtually all questions about the so-called “Downwinders” who were affected — or believe they were affected — by fallout from the test site seem to end up in a similarly frustrating tangle.

What does seem fairly clear amid the bureaucratic babble, from circumstantial evidence if nothing else, is that the government even in the 1950s had more awareness of and concerns about fallout from the site than they owned up to publicly. Radioactive debris from those very first tests in early 1951 was detected, according to test-site meteorologist Philip Wymer Allen, going “up over Utah and across the Midwest and Illinois, not too far south of Chicago, and out across the Atlantic Coast and was still easily measured as the cloud passed north of Bermuda. We didn’t track it any further than that.” Already in 1952 physical chemist Willard Libby, inventor of radiocarbon dating and later a member of the Atomic Energy Commission, was expressing concerns about radioactive cesium escaping the site and being absorbed into the bones of people, especially children. A potential result could be leukemia. Another, arguably even graver concern, was radioiodine particles, which could be carried a surprising distance downwind before settling to earth, potentially on the forage preferred by sheep, goats, and cows. Many people in rural communities, especially in those days, drank unprocessed milk straight from the cow, as it were. If enough milk containing radioiodine is ingested, it can lead to thyroid cancer. Children were, once again, both particularly big drinkers of milk and particularly prone to the effects of the radioiodine that might be within it. When environmental chemist Delbert Barth was hired in the 1960s to conduct studies of radioiodine dispersion patterns at the site, he was asked to also make historical projections for the atmospheric shots of the 1950s — a request that, at least on its surface, seems rather odd if everyone truly believed there was absolutely nothing to fear. Similarly odd seems a policy which went into effect very early: not to conduct shots if the winds were blowing toward Las Vegas.

The radioactive exposure — or lack thereof — of the Downwinders remains a major political issue inside Nevada and also Utah, which many claim also received its fair share of fallout. Most people who were associated with the site say, predictably enough, that the Downwinders are at best misguided and at worst would-be freeloaders. Studies have not established a clear causal link between incidences of cancer and proximity to the Nevada Test Site, although many, including Barth, have expressed concerns about the methodologies they’ve employed. What we’re left with, then, are lots of heartbreaking stories which may have been caused by the activities at the site or may represent the simple hand of fate. (For a particularly sad story, which I won’t go into here because I don’t want to sound exploitative, see this interview with Zenna Mae and Eugene Bridges.)

The first era of the Nevada Test Site came to an abrupt end in November of 1958, when the United States and the Soviet Union entered into a non-binding mutual moratorium on all nuclear testing. For almost three years, the bombs fell silent at the test site and at its Soviet equivalent near Semipalatinsk in Kazakhstan. But then, on September 1, 1961, the Soviets suddenly started testing again, prompting the Nevada Test Site to go back into action as well. Still, the public was growing increasingly concerned over what was starting to look like the reckless practice of atmospheric testing. While Las Vegas had continued to party hearty, even before the moratorium the doughty farmers and ranchers working still closer to the site had, as Lawrence Livermore physicist Clifford Olsen rather dismissively puts it, “started to grumble a bit” about the effect they believed the fallout was having on their animals and crops and possibly their own bodies and those of their children. And now an international environmentalist movement was beginning to arise in response to Rachel Carson’s Silent Spring. In one of his last major acts before his assassination, President Kennedy signed the United States’s ratification of the Partial Nuclear Test Ban Treaty, negotiated with Khrushchev’s Soviet Union, in October of 1963. The treaty banned tests in the atmosphere, underwater, and in outer space, thereby requiring all future nuclear tests to take place underground.

But never fear, the good times were hardly over at the Nevada Test Site. The scientists and engineers there had been experimenting with underground explosions for some years already in anticipation of this day that the more politically aware among them had begun to see as inevitable. Thus they were more than prepared to continue full-speed-ahead with a new regime of underground testing. The number of shots actually increased considerably during the 1960s, often clipping along at a steady average of one per week or more. Las Vegas, meanwhile, was still not allowed to forget about the presence of the test site. Residents grew accustomed to tremors that cracked plaster and made high-rises sway disconcertingly, phenomena that came to be known as “seismic fallout.” As the political mood shifted over the course of the decade, the number of complaints grew steadily, especially after a couple of big shots of well over 1 megaton in 1968 that caused serious structural damage to a number of buildings in Las Vegas. One of the most persistent and vociferous of the complainers was the eccentric billionaire recluse Howard Hughes, who was living at the time on the top two floors of the Desert Inn hotel. Hughes marshaled lots of money, employees, and political connections to his cause during the late 1960s, but was never able to stop or even slow the testing.

As for the environmental impact of this new breed of underground tests, the news is mixed. While neither alternative is exactly ideal, it’s obviously preferable from an environmental standpoint to be exploding atomic bombs underground rather than in the open air. A whole new applied sub-science of geophysics, the discipline of nuclear “containment,” evolved out of efforts to, well, contain the explosions — to keep any radioactive material at all from “venting” to the surface during an explosion or “seeping” to the surface during the hours, months, and years afterward. And yet the attitudes of the folks working on the shots can still sound shockingly cavalier today. About 30 percent of the underground tests conducted during the 1960s leaked radioactivity to the surface to one degree or another. Those working at the site considered this figure acceptable. Virtually everyone present there during the 1960s makes note of the positive, non-bureaucratic, “can-do” attitude that still persisted into this new era of underground testing. Linda Smith, an administrator at the site, characterizes the attitude thus: “There is such a strong bias to get it done that overrides everything. Is there any profound discussion of should we or shouldn’t we? Is this good for the country? Is it not? There’s no question. You are there to get it done.” Clifford Olsen says, “We were all pretty much sure we were doing the right thing.”

What to make of this lack of introspection? Whatever else we say about it, we shouldn’t condemn the people of the Nevada Test Site too harshly for it. There were heaps of brilliant minds among them, but their backgrounds were very different from those of the people who had worked on the Manhattan Project, many of whom had thought and agonized at length about the nature of the work they were doing and the unimaginable power they were unleashing on the world. The men and few women of the Nevada Test Site, by contrast, had mostly come of age during or immediately after World War II, and had been raised in the very bosom of the burgeoning military-industrial complex. Indeed, most had had their education funded by military or industrial backers for the singular purpose of designing and operating nuclear weapons. This set them apart from their predecessors, who before the Manhattan Project and to a large degree after it — many among that first generation of bomb-makers considered their work in this area essentially done once the first few bombs had been exploded — tended to focus more on “pure” science than on its practical application. A few Brits aside, the Nevada Test Site people were monolithically American; many on the Manhattan Project came from overseas, including lots of refugees from occupied Europe. The Nevada Test Site people were politically conservative, in favor of law and order and strong defense (how could they not be given the nature of their work?); the Manhattan Project people were a much more politically heterogeneous group, with a leader in Robert Oppenheimer who had worked extensively for communist causes. Someone with a background like his would never have been allowed past the front gate of the Nevada Test Site.

Whatever else it was, the Nevada Test Site was just a great place to work. Regarded as they were as the nation’s main bulwark against the Soviet Union, the atomic scientists and all of those who worked with and supported them generally got whatever they asked for. Even the chow was first-rate: at the cafeteria, a dollar would get you all the steaks — good steaks — that you could eat. When all the long hours spent planning and calculating got to be too much, you could always take in a movie or go bowling: a little self-contained all-American town called Mercury grew up with the test site there in the middle of the desert. Its population peaked at about 10,000 during the 1960s, by which time it included, in addition to a movie theater and bowling alley, a post office, schools, churches, a variety of restaurants, a library, a swimming pool, and hotels — including one named, inevitably, the Atomic Motel. Or you could always take a walk just outside of town amidst the splendid, haunting desolation of the Nevada desert. And for those not satisfied with these small-town pleasures, the neon of Las Vegas beckoned just an hour or so down the highway.

But just as importantly, the work itself was deeply satisfying. After the slide rules and the geological charts were put away, there still remained some of that old childlike pleasure in watching things go boom. Wendell Weart: “I would go back in a tunnel and see what happened to these massive structures that we had put in there, and to see how it manhandled them and just wadded them up into balls. That was impressive.” Nor was the Nevada Test Site entirely an exercise in nuclear nihilism. While weapons development remained always the primary focus, most working there believed deeply in the peaceful potential for nuclear energy — even for nuclear explosions. One of the most extended and extensive test series conducted at the site was known as Operation Plowshare, a reference to “beating swords into plowshares” from the Book of Isaiah. Operation Plowshare eventually encompassed 27 separate explosions, stretching from 1961 to 1973. Its major focus was on nuclear explosions as a means of carrying out grand earth-moving and digging operations, for the creation of trenches and canals among other things. (Such ideas formed the basis of the proposal Edward Teller bandied about during the Panama Canal controversy of the late 1970s: simply dig another canal using hydrogen bombs.) Serious plans were mooted at one point to dig a whole new harbor at Cape Thompson in Alaska, more as a demonstration of the awesome potential of hydrogen bombs for such purposes than out of any practical necessity. Thankfully for the delicate oceanic ecosystem thereabouts, cooler heads prevailed in the end.

So, the people who worked at the site weren’t bad people. They were in fact almost uniformly good friends, good colleagues, good workers who were at the absolute tops of their various fields. Almost any one of them would have made a great, helpful neighbor. Nor, as Operation Plowshare and other projects attest, were they bereft of their own certain brand of idealism. If they sound heartlessly dismissive of the Downwinders’ claims and needlessly contemptuous of environmentalists who fret over the damage their work did and may still be doing, well, it would be hard for any of us to even consider the notion that the work to which we dedicated our lives — work which we thoroughly enjoyed, which made us feel good about ourselves, around which many of our happiest memories revolve — was misguided or downright foolish or may have even killed children, for God’s sake. I tend to see the people who worked at the site as embodying the best and the worst qualities of Americans in general, charging forward with optimism and industry and that great American can-do spirit — but perhaps not always thinking enough about just where they’re charging to.

A plume of radioactive debris vents from the Baneberry shot.

The golden age of free-and-easy atomic testing at the Nevada Test Site ended at last on December 18, 1970. That was the day of Baneberry, a routine underground shot of just 10 kilotons. However, due to what the geophysicists involved claim was a perfect storm of factors, its containment model failed comprehensively. A huge cloud of highly radioactive particles burst to the surface and was blown directly over a mining encampment that was preparing the hole for another test nearby. By now the nature of radioactivity and its dangers was much better appreciated than it had been during the time of Operation Crossroads. All of the people at the encampment were put through extended, extensive decontamination procedures. Nevertheless, two heretofore healthy young men, an electrician and a security guard, died of leukemia within four years of the event. Their widows sued the government, resulting in another seemingly endless series of trials, feints, and legal maneuvers, culminating in yet another frustrating non-resolution in 1996: the government was found negligent and the plaintiffs awarded damages, but the deaths of the two men were paradoxically ruled not to have been a result of their radiation exposure. As many in the Downwinder community darkly noted at the time, a full admission of guilt in this case would have left the government open to a whole series of new lawsuits. Thus, they claimed, this strange splitting of the difference.

The more immediate consequence of Baneberry was a six-month moratorium on atomic testing at the Nevada Test Site while the accident was investigated and procedures were overhauled. When testing resumed, it did so in a much more controlled way, with containment calculations in particular required to go through an extended process of peer reviews and committee approvals. The Atomic Energy Commission also began for the first time to put pressure on the scientists and engineers to minimize the number of tests conducted by pooling resources and finding ways to get all the data they could out of each individual shot. The result was a slowdown from that high during the 1960s of about one shot per week to perhaps one or two per month. Old-timers grumbled about red tape and how the can-do spirit of the 1950s and 1960s had been lost, but, perhaps tellingly, there were no more Baneberrys. Of the roughly 350 shots at the Nevada Test Site after Baneberry, only 4 showed any detectable radiation leakage at all.

The site continued to operate right through the balance of the Cold War. The last bomb to be exploded there was also the last exploded to date by the United States: an anticlimactic little 5-kiloton shot on September 23, 1992. By this time, anti-nuclear activists had made the Nevada Test Site one of their major targets, and were a constant headache for everyone who worked there. Included among the ranks of those arrested for trespassing and disruption during the test site’s twilight years are Kris Kristofferson, Martin Sheen, Robert Blake, and Carl Sagan. Needless to say, the mood of the country and the public’s attitude toward nuclear weapons had changed considerably since those rah-rah days of atomic cap guns.

A tunnel waits in readiness, just in case.

Since the mid-1990s the United States, along with Russia and the other established nuclear powers, has observed a long-lasting if non-binding tacit moratorium on all types of nuclear testing (a moratorium which unfortunately hasn’t been observed by the nuclear club’s newer members: India, Pakistan, and North Korea). Stories of the days when mushroom clouds loomed over the Las Vegas Strip and the ground shook with the force of nuclear detonations are now something for long-time Nevada residents to share with their children or grandchildren. With its reason for existence in abeyance, the Nevada Test Site is in a state of largely deserted suspended animation today, Mercury a ghost town inhabited by only a few caretakers and esoteric researchers. One hopes that if Mercury should ever start to buzz with family life and commerce again it’s because someone has found some other, safer purpose for the desert landscape that surrounds it. In the meantime, the tunnels are still kept in readiness, just in case someone decides it’s time to start setting off the bombs again.

(The definitive resource on the history of the Nevada Test Site must be, now and likely forevermore, the University of Nevada at Las Vegas’s amazing Nevada Test Site Oral History Project. I could barely scratch the surface of the hundreds of lengthy interviews there when researching this article. And thanks to Duncan Stevens for his recommendation of Operation Crossroads by Jonathan M. Weisgall. I highly recommend the documentary The Atomic Cafe as a portrait of the era of atomic kitsch.)


T Plus 3: Edward Teller and His Superbomb



Whoever threw this place together wasn't too worried about permanence. Tin walls rise on flimsy studs to a ceiling that sags under its own weight. It reminds you of a prefab tool shed, several stories high.

You're standing beside a monstrous conglomeration of pipes, compressors and pressure valves that fills most of the building.

A stairway leads downward.

You descend the stairway.

Bottom of Scaffold

A maze of plumbing rises before you like the back of a giant refrigerator. Stairs lead up to a scaffold overlooking the equipment. Turning south, you see a closed set of sliding doors, with a small box and a loudspeaker mounted on the wall beside them.

>open box
You swing the box open.

A toggle switch and a red button are mounted inside.

>turn on toggle switch
You turn on the toggle switch.

The loudspeaker emits a burst of static, then a steady hiss.

"Zero minus two minutes."

>push red button
You push the red button.

With an electric whirr, the heavy doors slide open.

"Zero minus ninety seconds."

South Beach

The waters of a peaceful lagoon reflect the tropical dawn like a fiery mirror. A few stars are still visible in the rosy sky.

The glorified tool shed dominates this little island, leaving room only for a narrow strip of sand that curves to the northeast and northwest. A red button is mounted on the wall beside the open sliding doors of the shed.

"Zero minus one minute."

You follow the curve of the shore.

East Beach

Palm trees far across the lagoon stand in dark relief against the eastern sky. The shore continues northwest and southwest, around the equipment shed.

"Zero minus thirty seconds."

You follow the curve of the shore.

North Beach

A square wooden extension juts out of the side of the building, stretching away across the lagoon as far as you can see.

The beach continues around the equipment shed to the southeast and southwest.

"Five. Four. Three. Two. One."

Your tropical vacation is cut short by a multimegaton thermonuclear detonation, centered in the nearby equipment shed.

The first Trinity test of an atomic bomb in 1945 yielded an explosion equivalent to 20 kilotons of TNT. Barely seven years later, on November 1, 1952, the United States exploded the first thermonuclear bomb — known colloquially as the “hydrogen bomb” — on Enewetak Atoll, a member of the Marshall Islands group. That first hydrogen bomb yielded an explosion equivalent to 10.4 megatons of TNT, 520 times the force of the Trinity blast. Moore’s Law’s got nothing on the early days of atomic-bomb development.
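For the skeptical, the arithmetic behind that 520-times figure is easy to verify; here is a quick sketch using the yields as given above, converted to a common unit (the variable names are mine, not anything from the historical record):

```python
# Yields of the two explosions, as stated above, in kilotons of TNT.
trinity_kt = 20             # Trinity test, July 1945
ivy_mike_kt = 10.4 * 1000   # first hydrogen bomb, November 1952: 10.4 megatons

# Ratio of the two yields.
ratio = ivy_mike_kt / trinity_kt
print(ratio)  # → 520.0
```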

President Truman had likened the Trinity bomb to the wrath of the God of the Old Testament, comparing it to “the fire destruction prophesied in the Euphrates Valley Era, after Noah and his fabulous Ark.” What then to make of the hydrogen bomb? It was a destructive force beyond comprehension. That we got there so quickly was almost entirely down to the drive of one man who was there at the meeting that would lead to the Manhattan Project and the first atomic bomb and who would continue to be a major voice in both American politics and American weapons development through the entirety of the Cold War. Driven by scientific genius, patriotism, paranoia, and a titanic ego, he became the nation’s longest-serving Cold Warrior, perhaps the ultimate exemplar of the mentality that spawned and fueled that shadowy conflict and the lurking specter of nuclear apocalypse that accompanied it. His name was Edward Teller.

Born on January 15, 1908, in Budapest as the son of a prosperous Jewish attorney, Teller didn’t say a word until age three, leading his parents to believe he might be retarded. But then, when he did start to speak at last, he spoke in complete sentences. As a young boy his favorite author was Jules Verne: “His words carried me into an exciting world. The possibilities of man’s improvement seemed unlimited. The achievements of science were fantastic, and they were good.” But he wouldn’t be allowed much time for boyish dreams. During 1918 and 1919, amidst the end of the First World War, the breakup of the old Austro-Hungarian Empire, and the Russian Revolution that was taking place nearby, governments came and went quickly in Budapest. First there was the relatively benign if chaotic Hungarian Democratic Republic. Then came the Hungarian Soviet Republic, the second communist state in the world, built on Lenin’s model; it was less benign, and even more chaotic. And finally there was the proto-fascistic Kingdom of Hungary, accompanied by the White Terror, a series of bloody purges and brutal repressions aimed at scouring the country of communism and, it often seemed, the Jews who had disproportionately supported it. The institutionalized discrimination of the period may have been the source of the relentless competitive drive that would mark the rest of Teller’s life; his father told him that as a Jew “he would have to excel the average just to stay even.” His father also told him that he would have to emigrate if he wished to really make something of himself. Young Edward therefore worked like mad on his academics, and in 1926 he was accepted to study Chemical Engineering at Karlsruhe University in Germany.

As the political climate in Germany darkened, Teller completed his undergraduate studies at Karlsruhe, followed by a PhD in Physics at the University of Leipzig. He also lost most of his right foot in a streetcar accident in Munich; he would wear a prosthetic, and walk with a pronounced limp, for the rest of his life. He took up a research post at the University of Göttingen, where he published papers like mad. Setting a pattern that would hold throughout his career, he almost always worked with a coauthor, who would be responsible for sorting insights that sometimes came off more as feverish ravings than rigorous science into some manageable, organized form, and who would do the tedious but necessary work of calculating and verifying what seemed to come to Teller unbidden as intuitive truths. Teller was given to occasional fits of brooding, but at other times could be great fun, possessed of an easy, self-deprecating humor and a ready laugh. He was quite an accomplished classical pianist, but his approach to the art said much about an internal drive that he sometimes masked in casual contact with his peers: he played everything fortissimo, treating the composers whose works he played like personal challengers. On the whole, though, he was well-liked, and increasingly well-respected for his theoretical élan.

But soon it was, as Teller later put it, “a foregone conclusion I had to leave” Germany; Hitler had come to power, bringing with him an institutionalized antisemitism that would soon make Teller’s run-ins with bigotry in Hungary seem mild. Thankfully, in 1934 his burgeoning reputation won him an appointment to work with the great Danish physicist Niels Bohr at the University of Copenhagen, the very center of the universe of physics at that time. This was followed by a brief stay at University College London. The following year he accepted a professorship at George Washington University. In the company of his new wife, his childhood sweetheart Mici, whom he had returned to Budapest one last time to marry, he booked passage to the United States, not at all sure about his decision to leave Europe. His worries were unfounded; he quickly fell in love with the New World with all the patriotic passion an immigrant often musters, and never again wanted to live anywhere else.

In early 1939 a wave of excitement swept an international physics community still struggling to retain its dedication to the open sharing of knowledge in the face of the war clouds gathering over Europe. Two German scientists, Otto Hahn and Fritz Strassmann, had managed a feat most of their colleagues had heretofore considered impossible: they had split an atom of uranium. They had, in other words, achieved nuclear fission. This opened up a possibility that had been long discussed but also long dismissed by most physicists as a fantasy: to create a fission chain reaction capable of releasing almost inconceivable amounts of energy — energy with the potential to create an almost inconceivably powerful bomb. Teller’s fellow physicist and Hungarian émigré Leó Szilárd immediately began agitating for a top-secret crash program to build one of these “atomic bombs,” or failing that to prove definitively that it could not be done. Using logic that would become all too familiar over the decades of atomic history to come, he said that the democratic world had to have the bomb before Nazi Germany. And, having split the atom first, the Germans were obviously already ahead. (Szilárd apparently didn’t consider that the willingness of Hahn and Strassmann to publish their work in scientific journals probably meant that they weren’t, at least yet, thinking all that seriously about its potential as a weapon.) But other scientists, including Bohr, remained unwilling to sacrifice their traditional openness in the name of something which they thought was likely to be impossible anyway. The chief stumbling block was the need for comparatively huge quantities of uranium-235, which no one knew how to separate from natural uranium in any remotely efficient way. “It can never be done,” said Bohr, “unless you turn the United States into one huge factory.” Unconvinced, Szilárd kept insisting that everything had changed as soon as fission was proved to be possible, and that his colleagues denied it at their peril.

It’s at this point that Teller, heretofore a promising but hardly a major physicist, enters the history books for the first time — not as a great thinker in his own right but, as he himself would later dryly put it, as “Szilárd’s chauffeur.” Szilárd didn’t drive, and he needed to get out to Long Island for the second of two meetings with Albert Einstein, now also living in exile in the United States, that would change the course of history. Einstein was just about the only physicist American politicians were likely to be familiar with, the only one they were likely to listen to if he came to them with outlandish science-fictional hopes and fears of a futuristic “atomic bomb.” Thus, barely a month before Germany invaded Poland to touch off the Second World War, Szilárd, Teller, and Einstein sat in the latter’s comfortable sitting room — Einstein still in his slippers — sipping tea. Teller, having already served as chauffeur, now accepted the further indignity of being the secretary, writing down the letter to President Roosevelt that his two older colleagues dictated to him. Hand-delivered to Roosevelt by Alexander Sachs, a well-connected Jewish banker, the letter led to the formation of an “Advisory Committee on Uranium,” forefather of the Manhattan Project, in October of 1939. The Committee included both Szilárd and Teller amongst its members. It was in fact Teller himself who made the first request for funding: for $6000 to finance some early experiments to be conducted by the exiled Italian physicist Enrico Fermi. After considerable argument about the expense, the request was approved.

Still, progress was slow, the government’s support was halfhearted, and even Teller himself was uncertain that he wanted to abandon pure science for weapons research. Then came the German invasion of France and the Low Countries in May of 1940. Teller later claimed that the shocking success of the Wehrmacht convinced him that “Hitler would conquer the world unless a miracle happened.” A speech by Roosevelt galvanized him to action: “If the scientists in the free countries will not make weapons to defend the freedom of their countries, then freedom will be lost.” Teller’s duty as he saw it was clear: “My mind was made up, and it has not changed since.” Actually, records of Roosevelt’s speeches from the period reveal no such formulation as the one in Teller’s recollection. There’s merely an offering of absolution to scientists for having enabled so many technologies which Germany was now putting to such evil use, along with a paean to the search for knowledge, scientific and otherwise, which Germany was now so actively repressing. Nevertheless, Teller would soon believe the “miracle” the world so urgently needed to be within sight in the form of the atomic bomb. His insistence on seeing weapons of mass destruction in such quasi-religious terms would come to define the role he would play in many dramas to come.

In mid-1941, just a few months after he and Mici took the oath of American citizenship, Edward Teller moved to Columbia University to work more closely with Fermi and Szilárd, to whose cause he was now a complete convert. Soon after, he was party to yet another conversation that would change the world, this time with Teller himself as the active agent of that change. When Teller and Fermi were walking back from lunch one day, the latter mused “out of the blue” whether it might be possible to use the as-yet nonexistent atomic bomb as a mere catalyst for a much bigger bomb, one that fused rather than split atoms. Specifically, hydrogen might be fused to helium, like the process that powered the Sun. Fermi estimated that a fusion bomb could be made to explode with three orders of magnitude more force than a simple fission device. He considered the idea a throwaway; the numbers would start to get so big that you kind of had to ask what the point would really be. Teller, however, took it as a challenge.

This was vintage Teller. Already in 1941 he considered the fission bomb essentially a solved problem in theoretical physics. Just as he needed patient collaborators to clean up and finish his research papers, he was more than happy to turn over the practical work on the fission bomb to others while he swam after the next big fish. Within a year he thought he knew “precisely how to do it.” He broke the news to his colleague and best friend, exiled German physicist Hans Bethe:

Teller told me that the fission bomb was all well and good and, essentially, was now a sure thing. In reality, the work had hardly begun. Teller likes to jump to conclusions. He said that what we really should think about was the possibility of igniting deuterium [an isotope of hydrogen, sometimes known as “heavy hydrogen”] by a fission weapon — the hydrogen bomb.

Teller’s idea was soon christened “the Super.” He estimated that it should be able to “devastate an area of more than 100 square miles.”

Teller followed Fermi to the University of Chicago in 1942, where Fermi took charge of the project to build what became known as Chicago Pile 1, the world’s first nuclear reactor. When activated in November of that year, it proved once and for all that an atomic chain reaction, and thus an atomic bomb, was possible. With that proof, the newly christened Manhattan Project now ramped up in earnest under the stewardship of Army Corps of Engineers General Leslie Groves, who was placed in charge of infrastructure and practical and military concerns, and American physicist Robert Oppenheimer, who headed the scientific research. Teller was one of the first scientists to arrive at Los Alamos, the little community Groves and Oppenheimer constructed to finish the work of making the atomic bomb way out in the splendid isolation of the New Mexico desert. Teller liked and admired Oppenheimer with an enthusiasm that could sometimes verge on hero worship. Oppenheimer was, he said, a “bricklayer,” capable of seeing the whole puzzle and fitting its pieces together, as opposed to the “brick makers” around him who could only see their own small piece.

By now the Manhattan Project was working to develop not one but two types of fission bomb. The first would be a relatively crude device that used uranium-235. Niels Bohr’s words about “turning the United States into one huge factory” were proving to be prophetic, as Groves oversaw a massive industrial effort to enrich enough uranium to power it; this part of the Manhattan Project alone would eventually employ tens of thousands of people. The other bomb was a more elegant and efficient but also much more uncertain design that used the newly synthesized element of plutonium instead of uranium. It would have to be triggered by precisely placed and shaped explosive charges, which would implode its plutonium core into a supercritical mass and start the chain reaction. Teller worked for some time on this implosion process, the trickiest technical problem of all those that the Manhattan Project had to overcome.

Enrico Fermi and Edward Teller hiking near Los Alamos, circa 1944.

Teller, however, soon became aggrieving and aggrieved, building the foundation of yet another lifelong reputation: that of someone who just doesn’t play well with others. As the little community of Los Alamos grew around him, Teller had expected either to be placed in charge of all the theoretical physicists or to head an entirely new project devoted to the Super. He didn’t get either appointment. Instead, his old friend Hans Bethe took charge of the theoretical physicists, leaving Teller so resentful that it spoiled their friendship forever. The Super project, meanwhile, never got started at all; it was declared an idea maybe worth revisiting after the fission bombs were finished, but nothing to use resources on now. Teller began to neglect his assigned tasks in favor of working independently on the Super. Bethe, he recalls, “wanted me to work on calculational details at which I am not particularly good, while I wanted to continue not only on the hydrogen bomb, but on other novel subjects.” George Gamow, a Russian émigré physicist who had known Teller from his years in Germany, notes that “something changed” in Teller after he got to Los Alamos. Before, he had been “helpful, willing and able to work on other people’s ideas without insisting on everything having to be his own.” Now… well, not so much. “Since the theoretical division was very shorthanded,” says Bethe, “it was necessary to bring in new scientists to do the work that Teller declined to do.”

Teller took to prowling about, distracting scientists from other, more immediately useful work with his ideas and proposals, driving Bethe crazy. At last in the spring of 1944 Bethe, with Oppenheimer’s approval, relieved Teller “of further responsibility for work on the wartime development of the atomic bomb.” (The man who replaced Teller on the implosion team, Rudolf Peierls, brought with him an assistant named Klaus Fuchs who would share many details of the atomic bomb’s design — most importantly the tricky implosion process itself — with the Soviet Union.) Oppenheimer personally convinced an irate Teller not to leave. After all, he said, this was just what he wanted; now he could work on the Super full-time. And so Teller’s work on those first atomic bombs was largely done.

After the war was ended by the dropping of two examples of Los Alamos’s handiwork — one of uranium, the other of plutonium — on Japan, the little desert community began to disperse. Many of the most important minds behind the bomb, including Bethe, Fermi, and Oppenheimer himself, were eager to put weapons development behind them and return to either pure research or, in Oppenheimer’s case, increasing political engagement with the handling of their creation. Teller was deeply disturbed at this loss of brainpower, and even more disturbed that he still couldn’t get approval of his Super project. He wrote an urgent letter trying to convince his colleagues of the necessity of further weapons development, particularly on the Super, which he said was realizable within five years if they all put their minds to it. Deploying the same paranoid logic that had led to the development of the fission bombs, he said that the Soviet Union might very well be able to make a hydrogen bomb without even bothering with a fission-only bomb; the shadowy threat was now the Soviet Union rather than Nazi Germany, but the formulation was otherwise the same. He pronounced colleagues like Oppenheimer who would prefer to reach diplomatic accommodation with the Soviet Union, accommodation which might even entail sharing the atomic bomb with them, guilty of “fallacy.” And, sounding another thoroughgoing theme of his career, he pronounced thermonuclear explosions to be potentially useful for many peaceful purposes; they would “allow us to extend our power over natural phenomena far beyond anything we can at present imagine.”

Some of his colleagues were able to secure time on the ENIAC, by some reckonings the world’s first real computer, to do calculations which seemed to prove the Super feasible. An official conference held at Los Alamos in April of 1946 produced more general agreement that it should be possible, although by no means did everyone agree with all of Teller’s most optimistic predictions for its timetable. For the time being, though, those remaining at Los Alamos were busy preparing for Operation Crossroads, as well as improving the safety and reliability of the existing arsenal. Thus the Super remained firmly on the back burner. Teller himself had already departed in frustration by the time of the Super conference; he joined Fermi at the University of Chicago in February of 1946.

But thirty months later Teller, proclaiming himself increasingly disturbed by the Chinese Civil War and the by now blatant takeover of all of Eastern Europe by the Soviet Union, returned to Los Alamos. “I fully realize the menacing international situation,” he said, “and I believe that the United States must develop its military strength to the utmost if we are not to succumb to the danger of communism.” And, casting himself as a martyr to the cause, he proclaimed a sense of patriotic duty to be behind the move, “in spite of the fact that I cannot hope to work as happily and with as much immediate satisfaction in a field of applied science.” More quietly but perhaps more honestly, he admitted to a friend that it was “quite clear that I am needed in Los Alamos more than I am needed in Chicago,” and “being necessary is an extremely important thing for me.” For their part, many of his colleagues noted not so much a considered position behind his decision as a visceral hatred of the Soviet Union that could sometimes seem to verge on the ethnic. The Soviet Union had just completed its takeover of Hungary in June of 1948, when the Soviet-backed Hungarian Communist Party effectively outlawed the democratic opposition and cut Teller off from his remaining family in Budapest. His Hungarian friend and Los Alamos colleague John von Neumann notes that “Russia was traditionally the enemy” of Hungary, subject to “an emotional fear and dislike” among his countrypeople.

Teller’s return to Los Alamos coincided with increasingly urgent consideration of the Super in the halls of government, prompted by clear signs from intelligence sources that the Soviet Union was getting close to a fission bomb of its own. “It would be dreadful,” wrote a White House aide named William Golden, “if the Russians got it [the Super] first.” Teller was on holiday in England in September of 1949 when he got the news that the Soviet Union had just exploded its first atomic bomb, at least a year before the CIA’s most pessimistic predictions.

His advocacy now shifted into overdrive. Despite the fact that the Soviets were still very obviously playing catch-up, and largely using stolen American designs to do so (that first Soviet bomb was a virtual clone of the Trinity bomb), he announced that the United States was in “grave danger that we have lost or are losing the atomic armaments race.” “If the Russians demonstrate a Super before we possess one,” he declared, “our situation will be hopeless.” His logic was questionable at best. Yet he had at last an eager audience looking for any source of comfort in the face of the Soviet test. Oppenheimer, now increasingly at odds with Teller personally as well as professionally, wrote despairingly of “this miserable thing” that “appears to have caught the imagination, both of the Congressional and of the military people, as the answer to the problem posed by the Russian advance.” Seeing it as “the way to save the country and the peace,” he wrote, “appears to me full of dangers.” Teller took very, very personally Oppenheimer’s advocacy for diplomacy with the Soviet Union and his persistent skepticism about both the moral wisdom and the technical feasibility of the Super.

Advocates of reasoned diplomacy seldom won over advocates of nuclear armaments during the Cold War. On January 31, 1950, President Truman announced to the world that the United States was going forward with work “on all forms of atomic weapons, including the so-called hydrogen or super-bomb.” Announcing the Super publicly in this way made a marked contrast to the top-secret Manhattan Project. The move, driven largely by domestic political calculations on the part of Truman’s staff, explicitly defined future nuclear research as a race to the Super between the Americans and the Soviets, a sort of perverted forefather to the Moon Race in which both sides would seek to be first to unleash the most terrible destructive force in the history of humanity.

Some scientists declined to work on the project out of moral misgivings; others simply because they didn’t want to work with Teller. Future Nobel laureate Emilio Segrè, for example, pronounced Teller “dominated by irresistible passions much stronger than even his powerful rational intellect,” and turned his job offer down. The core of the team that was finally assembled included, in addition to Teller, two less visible European veterans of the Manhattan Project, Stanislaw Ulam and John von Neumann. They didn’t make for a very happy family. Within weeks Ulam was complaining about “Edward’s obstinacy, his single-mindedness, and his overwhelming ambition.” As Ulam and von Neumann worked through the sorts of tedious calculations that Teller always found beneath him, a painful reality slowly dawned on them: Teller’s plan for the Super, which he had first conceived even before the fission bomb was a reality, simply wouldn’t work. When they tried to demonstrate this to Teller, the latter, in the words of Stanislaw Ulam’s wife Françoise, “objected loudly and cajoled everyone around into disbelieving the results. What should have been the common examination of difficult problems became an unpleasant confrontation.” “Teller was not easily reconciled to our results,” says Stanislaw Ulam himself more laconically. “I learned that the bad news drove him once to tears of frustration, and he suffered great disappointment. I never saw him personally in that condition, but he certainly appeared glum in those days, and so were other enthusiasts of the H-bomb project.” Teller was soon engaging in conspiracy theorizing, believing that Ulam and von Neumann were deliberately biasing their findings to make him and his Super look bad. He demanded that virtually all of Los Alamos be placed at his disposal, but as 1950 ground on and his theories looked more and more flawed nobody, least of all Teller, seemed quite sure what they should actually be doing.

Then, one day in late January of 1951, Françoise Ulam found her husband staring vacantly into their back garden. “‘I found a way to make it work.’ ‘What work?’ I asked. ‘The Super,’ he replied. ‘It is a totally different scheme, and it will change the course of history.'” The technical details of Ulam’s new scheme, and of Teller’s original, we won’t go into here. Suffice to say that Teller immediately saw the new idea’s potential. “Edward is full of enthusiasm about these possibilities,” wrote Ulam to a colleague. In an indication of just how far their relationship had deteriorated, he then added a stinger: “This is perhaps an indication they will not work.”

There soon followed what Françoise Ulam remembered as a final, painful break between the two men:

From then on Teller pushed Stan aside and refused to deal with him any longer. He never met or talked meaningfully with Stan ever again. Stan was, I felt, more wounded than he knew by this unfriendly reception, although I never heard him express ill feelings toward Teller. (He rather pitied him instead.) Secure in his own mind that his input had been useful, he withdrew.

Teller would minimize Ulam’s contribution for the rest of his life. Ulam himself never seriously campaigned to be awarded his own proper share of the credit, perhaps because he was much more ambivalent about their accomplishment than Teller. He often compared the hydrogen bomb to the Jewish legend of the Golem, which, having been created as a means of protection, eventually gets out of its maker’s control and goes on a murderous rampage through Prague.

With the Super now looking feasible, the Korean War raging, and the knowledge that, thanks not least to Truman’s grand pronouncement, this was now a race with the Soviets, even the likes of Oppenheimer, Fermi, and Bethe now supported its development. Teller, however, still created chaos everywhere he went. He demanded to be placed in sole charge of the Super project, including not only the research but the logistics, the engineering, and the administration. Knowing that that way lay madness, Los Alamos director Norris Bradbury absolutely refused. On September 17, 1951, Teller quit in a huff. Many of his colleagues mumbled darkly about what seemed a developing pattern: Teller had quit on the fission-bomb project as well just when it needed him most. (Teller himself would likely have replied that, as a theoretical physicist through and through, he was neither terribly interested in nor terribly good at the engineering details of actually building either the fission bomb or the Super.) “Once Teller left Los Alamos,” Bethe remembers, “even though they were working on ‘his’ weapon, he found all sorts of reasons why it wouldn’t work. He tried to criticize it wherever possible.”

Nevertheless, Los Alamos soldiered on to shock the world and escalate the nuclear standoff to a potentially planet-wrecking scale when they detonated the first hydrogen bomb on November 1, 1952, a scene evocatively portrayed by Trinity in the vignette whose extracts open this article. It stripped not only Enewetak but every nearby island of all animal life and vegetation, as if someone had taken a giant potato peeler to their surfaces. It blew 80 million tons of highly radioactive material high into the air; parts of the fallout would travel to every corner of the globe. It vaporized birds in midair. It cooked nearby fish as if they had been dropped into a hot frying pan. (Yes, that cute, friendly dolphin that was so helpful to you in Trinity wasn’t long for this world.) Teller’s dubious dream had come to its fruition.

The world’s first hydrogen bomb explodes on November 1, 1952.

He should have been pleased, but he had other things on his mind. While Los Alamos worked to finish the Super, he was organizing an entirely new nuclear-weapons laboratory that would not be bound by what he saw as the carping pessimism of Los Alamos. The Radiation Laboratory at Livermore was founded on the site of a mothballed naval air station in Livermore, California, that summer of 1952. Teller claimed to be too busy setting it up to make the trip to Enewetak to witness the blast, but most of his old colleagues attributed his failure to appear to pique; they believed he had been secretly hoping to see them fail, so his new laboratory could sweep in and save the day. This alleged disappointment did not, however, keep him from claiming his paternity. “It’s a boy!” he announced.

On August 12, 1953, when the Soviet Union exploded its own inevitable first hydrogen bomb, the die for 35 more years of mutually assured destruction was irretrievably cast. On October 30, 1961, almost exactly twenty years after the idle lunch-time conversation that had spawned it, Teller’s baby reached terrifying adulthood when the Soviets detonated over the remote archipelago of Novaya Zemlya the largest nuclear bomb and the largest force of any sort ever triggered by humans, a 50-plus-megaton thermonuclear monster that was promptly dubbed the “Tsar Bomba.” It produced a mushroom cloud over seven times the height of Mount Everest, would have caused third-degree burns to someone standing 60 miles away, and broke windows over 500 miles away. Even by the standards of the institutionalized insanity of the Cold War this was madness. Neither the Soviets nor the Americans ever tested or built another bomb of anywhere close to that size for the simple reason that no one could quite imagine what to actually do with such a giant. Their 5- and 10-megaton warheads were less expensive, easier to make, and had more than enough megatonnage among them to destroy all life on the planet.

By the time of the Tsar Bomba Teller had largely abandoned the nuts and bolts of nuclear physics in favor of a career as an administrator of the military-industrial complex and as an increasingly visible political advocate for nuclear weapons and the strongest possible anti-communist stance. Just as Los Alamos seemed to have inherited some of its founder Robert Oppenheimer’s personality, being relatively cautious and pragmatic about the terrible weapons it developed, Teller’s Livermore laboratory developed a reputation for shooting from the hip and a damn-the-consequences drive for ever bigger and dirtier bombs. Teller characterized his transformation from physicist to advocate as a principled move that he made only sadly and reluctantly. He was, he claimed again, a martyr to his thankless cause: “I cannot just go back to physics because I believe that to prevent another war happens to be incomparably more important.” Others questioned whether Teller didn’t enjoy the limelight a lot more than he admitted. Robert Brownlee, a colleague who worked with him during the 1950s, makes this observation:

Edward was, in my experience, two entirely different people. When he was with scientists, just scientists, every idea was interesting and valuable and rational and so on. And the moment a certain kind of person would walk in the room, a person who was outside the family, and therefore might take tales back, a press person, Edward would become a wild man. He would be showing off for the press or for the visitor, would say things that would make you do this: This guy has absolutely lost it, he’s completely crazy. But it was an affectation which he put on when somebody came. So the press, whenever they interviewed him, carried away with them a strange view of Edward. When he was just with us kids, he was not that at all. So when you could talk with Edward with the people right there, it was entirely different than having a stranger in there, because the moment that stranger arrived, Edward became another person. And it had something to do with publicity—I don’t know a better word for it. There must be a better word for it. But I learned that despite what everybody else at the lab said, Edward’s value had to be determined independent of his personality. He was extremely valuable, but nobody liked him because he was, every so often, totally flaky.

It was apparently this “crazy” version of Teller that the American people at large came to know well by 1960. After Teller made headlines across the country through his strident opposition to the Partial Nuclear Test Ban Treaty of 1963 that moved all nuclear testing underground, Stanley Kubrick was inspired to make his caricature the eponymous star of Dr. Strangelove. He became for decades the favorite scientist of the American Right largely by telling them exactly what they wanted to hear. For instance, he played a major role, as we’ve already seen, in Ronald Reagan’s foolish SDI initiative of the 1980s, claiming to provide not only its technology but also its justification: “If we went into a nuclear war today,” he said in 1980, “there is practically no question that the Russians would win that war and the United States would not exist.” The similarity of this rhetoric to that which he had used to justify the Super 30 years before is not, I trust, lost on you. Even as the technologies of warheads and delivery systems evolved, the arguments employed in their justification always had this weird fly-in-amber consistency about them, leaving one to wonder when, if ever, enough would finally be enough. If anything, Teller’s rhetoric grew more extreme over the years; he once claimed that the United States had fallen so far behind the Soviet Union that he fully expected to be in a Soviet prisoner-of-war camp — if not dead — within five years. His unapologetic advocacy of nuclear weapons and nuclear power continued until his death at age 95 in 2003. After the Cold War ended, rather than being thrilled at having seemingly achieved the goal he had worked toward for so many years, he merely chose a new bogeyman to fear: Saddam Hussein.

Teller had by then been ostracized for decades from his old Manhattan Project colleagues, who, whilst Teller plunged into Cold War politics, had collected an impressive shelf of Nobel Prizes amongst themselves working with more peaceful applications of nuclear physics. He replaced those old relationships with new ones forged with a group of younger colleagues at the Livermore laboratory who, having had their educations largely funded by the military-industrial complex, saw themselves first and foremost not as scientists but as weapons designers. To them, Teller was a hero. To the old guard, he was nothing less than the traitor in their ranks. The source of their enduring enmity was not his questionable advocacy for the Super or even his slighting of Ulam, but rather another sequence of events involving a man he had once admired greatly: Robert Oppenheimer.

In May of 1952, the FBI questioned Teller on the subject of Oppenheimer, another in a seemingly endless string of pseudo-investigations born of Oppenheimer’s pre-war involvement with communist causes and his current less than gung-ho attitude toward the nation’s nuclear buildup. Teller, who believed Oppenheimer personally responsible for delaying his beloved Super program, laid into his old boss with a vengeance. The country, he claimed, could easily have had the hydrogen bomb a year ago if not for Oppenheimer’s obstructionism. While he stopped short of outright calling him a Soviet spy, he was careful not to exclude the possibility either. Otherwise, he conducted what amounted to a character assassination. Oppenheimer was motivated not by principle but by vanity and jealousy in his opposition to Teller’s plans, as he didn’t want to see Teller better his own fission bomb with the Super. He had “great ambitions in science and realizes that he is not as great a physicist as he would like to be.” (Ironically, many of Teller’s colleagues would have happily accused him of this exact deep-seated sense of insecurity and its resulting personal failings.) It would be better for the country, Teller said, if Oppenheimer was “separated” from the corridors of power.

Not quite two years later, with Joseph McCarthy’s communist witch hunt near its peak, Oppenheimer’s enemies pounced openly at last, initiating hearings to revoke all of Oppenheimer’s security clearances; doing so would end his time as a policy adviser since virtually all of the policy about which he advised involved classified weapons systems. In April of 1954, Robert Oppenheimer was effectively put on trial. A parade of hawks from inside the military, the FBI, and the Washington establishment testified against him; a parade of his old Manhattan Project colleagues testified strongly in his favor. Except for Edward Teller. Called to the stand on April 28, Teller was unwilling to support Oppenheimer but also seemingly too craven to repeat his accusations of two years before in the man’s presence. Asked point-blank if he believed Oppenheimer a security risk, he equivocated like mad:

In a great number of cases I have seen Dr. Oppenheimer acting —  I understood that Dr. Oppenheimer acted — in a way which for me was exceedingly hard to understand. I thoroughly disagreed with him in numerous issues and his actions frankly appeared to me confused and complicated. To this extent I feel that I would like to see the vital interests of this country in hands which I understand better, and therefore trust more. In this very limited sense I would like to express a feeling that I would feel personally more secure if public matters would rest in other hands.

I believe, and that is merely a question of belief and there is no expertness, no real information behind it, that Dr. Oppenheimer’s character is such that he would not knowingly and willingly do anything that is designed to endanger the safety of this country. To the extent, therefore, that your question is directed toward intent, I would say I do not see any reason to deny clearance.

If it is a question of wisdom and judgment, as demonstrated by actions since 1945, then I would say one would.

“I’m sorry,” said Teller to Oppenheimer as he left the courtroom. “After what you’ve just said, I don’t know what you mean,” replied Oppenheimer. On May 27, Oppenheimer’s security clearances were formally and permanently revoked. “I think it broke his spirit really,” says an old friend. “He was not the same person afterward,” says Bethe. He spent most of his remaining years sailing and puttering around his beach house in the Virgin Islands. The Kennedy and Johnson administrations made some efforts to rehabilitate his reputation, most notably awarding him the Enrico Fermi Award for his service in 1963, but his security clearances, and with them his political influence, were never restored. He died at age 62 in 1967.

Robert Oppenheimer and Edward Teller share an uncomfortable handshake on the occasion of the former being awarded the Enrico Fermi Award, 1963.

After the trial was done and gone, the scientists who had once worked with and admired Teller were left with the same question that we are: what the hell happened to him? How did this brilliant young scientist turn into the paranoid war-monger Americans soon got used to seeing on their television screens, opposing in his thickly accented English every effort at arms control ever mooted during the Cold War? How could a nuclear physicist, raised on science, talk about winning nuclear wars and dismiss the dangers of radioactive fallout as trivial?

There have been thousands of theories deployed in thousands of attempts to figure out Teller. Some have pointed back to that Munich street-car accident in his youth, which they claim — a bit melodramatically in my view — left him “in constant pain” for the rest of his life. Some have noted his deep-seated personal insecurity, which seemed to have its origins even earlier, to when he as a sheltered child with a doting mother suffered constant abuse and harassment at school for the crimes of being smart and being Jewish. Some have traced his hatred of communism to the chaos it brought to the Hungary of his youth — or, as noted previously, traced it to a Hungarian’s ethnic antipathy for Russia and the Russians. Enrico Fermi’s observation is amongst the most telling as well as the most witty: Teller was the only monomaniac he knew, he said, who had several manias.

Edward Teller (right) with Mikhail Gorbachev and Ronald Reagan.

Whatever made Teller the man he became, it wasn’t anything so simple as any of the above, taken in isolation or even in combination. For all his legendary arrogance and his willingness to hold grudges, he was also frequently described as a “warm” man and a good, true friend. When his former colleagues all cut him after his testimony at Oppenheimer’s security hearing, sometimes even publicly refusing to shake his hand, Teller reportedly spent hours “weeping” at the spoiling of most of the most important relationships in his life. He even tried desperately to recant his testimony, only to learn it was too late. No, none of us humans are that easy to figure out.

Yet there does seem to be a larger pattern that holds true not only for Teller but for many other architects of the nuclear-arms race: the sheer seductive allure of the Bomb itself. As Trinity‘s box copy proclaims, “The basic power of the universe has been unleashed.” To wield such unprecedented power is a heady drug indeed. The Bomb is the One Ring, the Dark Side of the Force. (Interesting that so many of the most enduring mythic fictions of the Cold War feature such powerful but corrupting temptations…) Some people, like Robert Oppenheimer, were Prosperos, unnerved by its power and eager to eliminate it from the world. Others, like Edward Teller, were Dr. Faustuses, ready to ride this unholy force right down to the depths of Hell. Dueling aphorisms coined by the two men sound like extracts from Paradise Lost. “Physicists have known sin,” says Oppenheimer, eyes downcast. Teller, his trademark bushy eyebrows twitching with passion, replies, “Physicists have known power!”

(For a good history of the relationship between Teller and Oppenheimer — and also Ernest Lawrence, a figure I didn’t have room for in this article — see Brotherhood of the Bomb by Gregg Herken. You can find Carl Sagan’s article on Teller in The Demon-Haunted World.)

