
Spycraft: The Great Game, Part 1 (or, Parallel Spies)

Police recover William Colby’s body on the coast of Maryland, May 6, 1996.

The last people known to have seen William Colby alive are a cottage caretaker and his sister. They bumped into the former head of the CIA early on the evening of April 27, 1996, as he watered the willow trees around his vacation home on Neale Sound in Maryland, about 60 miles south of Washington, D.C. The trio chatted together for a few minutes about the fine weather and about the repairs Colby had spent the day doing to his sailboat, which was moored in the marina on Cobb Island, just across the sound. Then the caretaker and his sister went on their way. Everything seemed perfectly normal to them.

The next morning, a local handyman, his wife, and their two children out on the water in their motorboat spotted a bright green canoe washed up against a spit of land that extended from the Maryland shore. The canoe appeared to be abandoned. Moving in to investigate, they found that it was full of sand. This was odd, thought the handyman; he had sailed past this same place the day before without seeing the canoe, and yet so much sand could hardly have collected in it naturally over the course of a single night. It was almost as if someone had deliberately tried to sink the canoe. Oh, well; finders keepers. It really was a nice little boat. He and his family spent several hours shoveling out the sand, then towed the canoe away with them.

In the meantime, Colby’s next-door neighbor was surprised not to see him out and about. The farthest thing from a layabout, the wiry 76-year-old was usually up early, puttering about with something or other around his cottage or out on the sound. Yet now he was nowhere to be seen outside and didn’t answer his door, even though his car was still in the driveway and the neighbor thought she could hear a radio playing inside the little house. Peeking around back, she saw that Colby’s green canoe was gone. At first, she thought the mystery was solved. But as the day wore on and he failed to return, she grew more and more concerned. At 7:00 that evening, she called the police.

When they arrived, the police found that both doors to the cottage were unlocked. The radio was indeed turned on, as was Colby’s computer. Even weirder, a half-eaten meal lay in the sink, surrounded by unwashed dishes and half a glass of white wine. It wasn’t at all like the man not to clean up after himself. And his wallet and keys were also lying there on the table. Why on earth would he go out paddling without them?

Inquiries among the locals soon turned up Colby’s canoe and the story of its discovery. Clearly something was very wrong here. The police ordered a search. Two helicopters, twelve divers, and 100 volunteers in boats pulling drag-lines behind them scoured the area, while CIA agents also arrived to assist the investigation into the disappearance of one of their own; their presence was nothing to be alarmed at, they assured everyone, just standard procedure. Despite the extent of the search effort, it wasn’t until the morning of May 6, nine days after he was last seen, that William Colby’s body was found washed up on the shore, just 130 feet from where the handyman had found his canoe, but on the other side of the same spit of land. It seemed that Colby must have gone canoeing on the sound, then fallen overboard and drowned. He was 76 years old, after all.

But the handyman who had found the canoe, who knew these waters and their currents as well as anyone, didn’t buy this. He was sure that the body could not have gotten so separated from the canoe as to wind up on the opposite side of the spit. And why had the body taken so long to wash up on shore? Someone must have gone out and planted it there later on, he thought. Knowing Colby’s background, and having seen enough spy movies to know what happened to inconvenient witnesses in cases like this one, he and his family left town and went into hiding.

The coroner noticed other oddities. Normally a body that has been in the water a week or more is an ugly, bloated sight. But Colby’s was bizarrely well-preserved, almost as if it had barely spent any time in the water at all. And how could the divers and boaters have missed it for so long, so close to shore as it was?

Nonetheless, the coroner concluded that Colby had probably suffered a “cardiovascular incident” while out in his canoe, fallen into the water, and drowned. This despite the fact that he had had no known heart problems, and was generally in the kind of physical shape that would have made him the envy of many a man 30 years younger. Nor could the coroner explain why he had chosen to go canoeing long after dark, something he was most definitely not wont to do. (It had been dusk already when the caretaker and his sister said goodbye to him, and he had presumably sat down to his dinner after that.) Why had he gone out in such a rush, leaving his dinner half-eaten and his wine half-drunk, leaving his radio and computer still turned on, leaving his keys and wallet lying there on the table? It just didn’t add up in the eyes of the locals and those who had known Colby best.

But that was that. Case closed. The people who lived around the sound couldn’t help but think about the CIA agents lurking around the police station and the morgue, and wonder at everyone’s sudden eagerness to put a bow on the case and be done with it…


Unusually for a septuagenarian retired agent of the security state, William Colby had also been a game developer, after a fashion at least. In fact, at the time of his death a major game from a major publisher that bore his name very prominently right on the front of the box had just reached store shelves. This article and the next will partly be the story of the making of that game. But they will also be the story of William Colby himself, and of another character who was surprisingly similar to him in many ways despite being his sworn enemy for decades — an enemy turned friend who consulted along with him on the game and appeared onscreen in it alongside him. Then, too, they will be an inquiry into some of the important questions the game raises but cannot possibly begin to answer.


Sierra’s Police Quest: Open Season, created with the help of controversial former Los Angeles police chief Daryl Gates, was one of the few finished products to emerge from a short-lived vision of games as up-to-the-minute, ripped-from-the-headlines affairs. Spycraft: The Great Game was another.

Activision’s Spycraft: The Great Game is the product of a very specific era of computer gaming, when “multimedia” and “interactive movies” were among the buzzwords of the zeitgeist. Most of us who are interested in gaming history today are well aware of the set of technical and aesthetic approaches these terms imply: namely, games built from snippets of captured digitized footage of real actors, with interactivity woven as best the creators can manage between these dauntingly large chunks of static content.

There was a certain ideology that sometimes sprang up in connection with this inclusion of real people in games, a belief that it would allow games to become relevant to the broader culture in a way they never had before, tackling stories, ideas, and controversies that ordinary folks were talking about around their kitchen tables. At the margins, gaming could almost become another form of journalism. Ken Williams, the founder and president of Sierra On-Line, was the most prominent public advocate for this point of view, as exemplified by his decision to make a game with Daryl F. Gates, the chief of police for Los Angeles during the riots that convulsed that city in the spring of 1992. Williams, writing during the summer of 1993, just as the Gates game was being released:

I want to find the top cop, lawyer, airline pilot, fireman, race-car driver, politician, military hero, schoolteacher, white-water rafter, mountain climber, etc., and have them work with us on a simulation of their world. Chief Gates gives us the cop game. We are working with Emerson Fittipaldi to simulate racing, and expect to announce soon that Vincent Bugliosi, the lawyer who locked up Charles Manson, will be working with us to do a courtroom simulation. My goal is that products in the Reality Role-Playing series will be viewed as serious simulations of real-world events, not games. If we do our jobs right, this will be the closest most of us will ever get to seeing the world through these people’s eyes.

It sounded good in theory, but would never get all that far in practice, for a whole host of reasons: a lack of intellectual bandwidth and sufficient diversity of background in the games industry to examine complex social questions in an appropriately multi-faceted way (the jingoistic Gates game is a prime case in point here); a lack of good ideas for turning such abstract themes into rewarding forms of interactivity, especially when forced to work with the canned video snippets that publishers like Sierra deemed an essential part of the overall vision; the expense of the games themselves, the expense of the computers needed to run them, and the technical challenges involved in getting them running, which in combination created a huge barrier to entry for newcomers from outside the traditional gamer demographics; and, last but not least, the fact that those existing gamers who did meet all the prerequisites were generally perfectly happy with more blatantly escapist entertainments, thank you very much. Tellingly, none of the other game ideas Ken Williams mentions above ever got made. And I must admit that this failure does not strike me as any great loss for world culture.

That said, Williams, being the head of one of the two biggest American game publishers, had a lot of influence on the smaller ones when he prognosticated on the future of the industry. Among the latter group was Activision, a toppled giant which had been rescued from the dustbin of bankruptcy in 1991 by a young wheeler-dealer named Bobby Kotick. His version of the company got fully back onto its feet the same year that Williams wrote the words above, thanks largely to Return to Zork, a cutting-edge multimedia evocation of the Infocom text adventures of yore, released at the perfect time to capitalize on a generation of gamers’ nostalgia for those bygone days of text and parsers (whilst not actually asking them to read much or to type out their commands, of course).

With that success under their belts, Kotick and his cronies thought about what to do next. Adventure games were hot — Myst, the bestselling adventure of all time, was released at the end of 1993 — and Ken Williams’s ideas about famous-expert-driven “Reality Role-Playing” were in the air. What might they do with that? And whom could they get to help them do it?

They hit upon espionage, a theme that, in contrast to many of those outlined by Williams, seemed to promise a nice balance of ripped-from-the-headlines relevance with interesting gameplay potential. Then, when they went looking for the requisite famous experts, they hit the mother lode with William Colby, the head of the CIA from September of 1973 to January of 1976, and Oleg Kalugin, who had become the youngest general in the history of the First Chief Directorate of the Soviet Committee for State Security, better known as the KGB, in 1974.

I’ll return to Spycraft itself in due course. But right now, I’d like to examine the lives of these two men, which parallel one another in some perhaps enlightening ways. Rest assured that in doing so I’m only following the lead of Activision’s marketers; they certainly wanted the public to focus first and foremost on the involvement of Colby and Kalugin in their game.


William Colby (center), looking every inch the dashing war hero in Norway just after the end of World War II.

William Colby was born in St. Paul, Minnesota on January 4, 1920. He was the only child of Elbridge Colby, a former soldier and current university professor who would soon rejoin the army as an officer and spend the next 40 years in the service. His family was deeply Catholic — his father thanks to a spiritual awakening and conversion while a student at university, his mother thanks to long family tradition. The son too absorbed the ethos of a stern but loving God and the necessity of serving Him in ways both heavenly and worldly.

The little family bounced around from place to place, as military families generally do. They wound up in China for three years starting in 1929, where young Bill learned a smattering of Chinese and was exposed for the first time to the often compromised ethics of real-world politics, in this case in the form of the United States’ support for the brutal dictatorship of Chiang Kai-shek. Colby’s biographer Randall Bennett Woods pronounces his time in China “one of the formative influences of his life.” It was, one might say, a sort of preparation for the many ugly but necessary alliances — necessary as Colby would see them, anyway — of the Cold War.

At the age of seventeen, Colby applied to West Point, but was rejected because of poor eyesight. He settled instead for Princeton, a university whose faculty included Albert Einstein among many other prominent thinkers. Colby spent the summer of 1939 holidaying in France, returning home just after the fateful declarations of war in early September, never imagining that the idyllic environs in which he had bicycled and picnicked and practiced his French on the local girls would be occupied by the Nazis well before another year had passed. Back at Princeton, he made the subject of his senior thesis the ways in which France’s weakness had allowed the Nazi threat on its doorstep to grow unchecked. This too was a lesson that would dominate his worldview throughout the decades to come. After graduating, Colby received his officer’s commission in the United States Army, under the looming shadow of a world war that seemed bound to engulf his own country sooner or later.

When war did come on December 7, 1941, he was working as an artillery instructor at Fort Sill in Oklahoma. To his immense frustration, the Army thought he was doing such a good job in that role that it was inclined to leave him there. “I was afraid the war would be over before I got a chance to fight,” he writes in his memoir. He therefore leaped at the opportunity when he saw an advertisement on a bulletin board for volunteers to become parachutists with the 82nd Airborne. He tried to pass the entrance physical by memorizing the eye chart. The doctor wasn’t fooled, but let him in anyway: “I guess your eyesight is good enough for you to see the ground.”

Unfortunately, he broke his ankle in a training jump, and was forced to watch, crestfallen, as his unit shipped out to Europe without him. Then opportunity came calling again, in a chance to join the new Office of Strategic Services (OSS), the forerunner of the CIA. Just as the CIA would later on, the OSS had two primary missions: foreign intelligence gathering and active but covert interference. Colby was to be dropped behind enemy lines, whence he would radio back reports of enemy troop movements and organize resistance among the local population. It would be, needless to say, an astonishingly dangerous undertaking. But that was the way Colby wanted it.

William Colby finally left for Britain in December of 1943, aboard the British luxury liner Queen Elizabeth, now refitted to serve as a troop transport. It was in a London bookstore that he first encountered another formative influence, the book Seven Pillars of Wisdom by T.E. Lawrence — the legendary Lawrence of Arabia, who had convinced the peoples of the Middle East to rise up against their Turkish overlords during the last world war. Lawrence’s book was, Colby would later say, an invaluable example of “an outsider operat[ing] within the political framework of a foreign people.” It promptly joined the Catholic Bible as one of the two texts Colby carried with him everywhere he went.

As it happened, he had plenty of time for reading: the weeks and then months passed in Britain, and still there came no orders to go into action. There was some talk of using Colby and his fellow American commandos to sow chaos during the run-up to D-Day, but this role was given to British units in the end. Instead Colby watched from the sidelines, seething, as the liberation of France began. Then, out of the blue, action orders came at last. On the night of August 14, 1944, Colby and two exiled French soldiers jumped out of a B-24 bomber flying over central France.

The drop was botched; the men landed fifteen miles away from the intended target, finding themselves smack dab in the middle of a French village instead of out in the woods. Luckily, there were no Germans about, and the villagers had no desire to betray them. There followed a hectic, doubtless nerve-wracking month, during which Colby and his friends made contact with the local resistance forces and sent back to the advancing Allied armies valuable information about German troop movements and dispositions. Once friendly armies reached their position, the commandos made their way back to the recently liberated Paris, thence to London. It had been a highly successful mission, with more than enough danger and derring-do to suffice for one lifetime in the eyes of most people. But for Colby it all felt a bit anticlimactic; he had never even discharged his weapon at the enemy. Knowing that his spoken German wasn’t good enough to carry out another such mission behind the rapidly advancing Western European front, Colby requested a transfer to China.

He got another offer instead. Being an accomplished skier, he was asked to lead 35 commandos into the subarctic region of occupied Norway, to interdict the German supply lines there. Naturally, he agreed.

The parachute drop that took place on the night of March 24, 1945, turned into another botched job. Only fifteen of the 35 commandos actually arrived; the other planes strayed far off course in the dark and foggy night, accidentally dropping their passengers over neutral Sweden, or giving up and not dropping them at all. But Colby was among the lucky (?) fifteen who made it to their intended destination. Living off the frigid land, he and his men set about dynamiting railroad tracks and tunnels. This time, he got to do plenty of shooting, as his actions frequently brought him face to face with the Wehrmacht.

On the morning of May 7, word came through on the radio that Adolf Hitler was dead and his government had capitulated; the war in Europe was over. Colby now set about accepting the surrender of the same German occupiers he had recently been harassing. While the operation he had led was perhaps of doubtful necessity in the big picture of a war that Germany was already well along the path to losing, no one could deny that he had demonstrated enormous bravery and capability. He was awarded the Silver Star.

Gung ho as ever, Colby proposed to his superiors upon returning to London that he lead a similar operation into Francisco Franco’s Spain, to precipitate the downfall of that last bastion of fascism in Europe. Having been refused this request, he returned to the United States, still seeming a bit disappointed that it had all ended so quickly. Here he asked for and was granted a discharge from the Army, asked for and was granted the hand in marriage of his university sweetheart Barbara Heinzen, and asked for and was granted a scholarship to law school. He wrote on his application that he hoped to become a lawyer in the cause of organized labor. (Far from the fire-breathing right-wing extremist that some of his later critics would make him out to be, Colby would vote Democrat throughout his life, maintaining a center-left orientation when it came to domestic politics at least.)


Oleg Kalugin at age seventeen, a true believer in Joseph Stalin and the Soviet Communist Party.

While the war hero William Colby was seemingly settling into a more staid time of life, another boy was growing up in the heart of the nation that Colby and most other Americans would soon come to regard as their latest great enemy. Born on September 6, 1934, in Leningrad (the once and future Saint Petersburg), Oleg Kalugin was, like Colby, an only child of a couple with an ethic of service, the son of a secret-police agent and a former factory worker, both of whose loyalty to communism was unimpeachable; the boy’s grandmother caused much shouting and hand-wringing in the family when she spirited him away to have him baptized in a furtive Orthodox ceremony in a dark basement. That piece of deviancy notwithstanding, little Oleg was raised to see Joseph Stalin as his god on earth, the one and only savior of his people.

On June 22, 1941, he was “hunting maybugs with a pretty girl,” as he writes, when he saw a formation of airplanes roar overhead and drop a load of bombs not far away. The war had come to his country, six months before it would reach that of William Colby. With the German armies nearing Leningrad, he and his mother fled to the Siberian city of Omsk while his father stayed behind to fight. They returned to a devastated hometown in the spring of 1944. Oleg’s father had survived the terrible siege, but the boy had lost all of his grandparents — including that gentle soul who had caused him to be baptized — along with four uncles to either starvation or enemy bullets.

Kalugin remained a true believer after the Great Patriotic War was over, joining the Young Communist League as soon as he was eligible at the age of fourteen. At seventeen, he decided to join the KGB; it “seemed like the logical place for a person with my academic abilities, language skills, and fervent desire to fight class enemies, capitalist parasites, and social injustice.” Surprisingly, his father, who had seen rather too much of what Soviet-style class struggle really meant over the last couple of decades, tried to dissuade him. But the boy’s mind was made up. He entered Leningrad’s Institute of Foreign Languages, a shallow front for the training of future foreign agents, in 1952.

When Stalin died in March of the following year, the young zealot wrote in his diary that “Stalin isn’t dead. He cannot die. His physical death is just a formality, one that needn’t deprive people of their faith in the future. The fact that Stalin is still alive will be proven by our country’s every new success, both domestically and internationally.” He was therefore shocked when Stalin’s successor, Nikita Khrushchev, delivered a speech that roundly condemned the country’s erstwhile savior as a megalomaniac and a mass-murderer who had cynically corrupted the ideology of Marx and Lenin to serve his own selfish ends. It was Kalugin’s initiation into the reality that the state he so earnestly served was less than incorruptible and infallible.

Nevertheless, he kept the faith, moving to Moscow for advanced training in 1956. In 1958, he was selected on the basis of his aptitude for English to go to the United States as a graduate student. “Just lay the foundation for future work,” his superiors told him. “Buy yourself good maps. Improve your English. Find out about their way of life. Communicate with people and make as many friends as possible.” Kalugin’s joyous reaction to this assignment reflects the ambivalence with which young Soviets like him viewed the United States. It was, they fervently believed, the epicenter of the imperialism, capitalism, racism, and classism they hated, and must ultimately be destroyed for that reason. Yet it was also the land of jazz and rock and roll, of fast cars and beautiful women, with a standard of living so different from anything they had ever known that it might as well have been Shangri-La. “I daydreamed constantly about America,” Kalugin admits. “The skyscrapers of New York and Chicago, the cowboys of the West…” He couldn’t believe he was being sent there, and on a sort of paid vacation at that, with few concrete instructions other than to experience as much of the American way of life as he could. Even his sadness about leaving behind the nice Russian girl he had recently married couldn’t overwhelm his giddy excitement.


William Colby in Rome circa 1955, with his son Carl and daughter Catherine.

As Oleg Kalugin prepared to leave for the United States, William Colby was about to return to that same country, where he hadn’t lived for the past seven years. He had become a lawyer as planned and joined the National Labor Relations Board to forward the cause of organized labor, but his tenure there had proved brief. In 1950, he was convinced to join the new CIA, the counterweight to the KGB on the world stage. He loved his new “band of brothers,” filled as he found it to be with “adventuresome spirits who believed fervently that the communist threat had to be met aggressively, innovatively, and courageously.”

In April of 1951, he took his family with him on his first foreign assignment, under the cover identity of a mid-level diplomatic liaison in Stockholm, Sweden. His real purpose was to build and run an intelligence operation there. (All embassies were nests of espionage in those days, as they still are today.) “The perfect operator in such operations is the traditional gray man, so inconspicuous that he can never catch the waiter’s eye in a restaurant,” Colby wrote. He was — or could become — just such a man, belying his dashing commando past. Small wonder that he proved very adept at his job. The type of spying that William Colby did was, like all real-world espionage, more John le Carré than Ian Fleming, an incrementalist milieu inhabited by just such quiet gray men as him. Dead-letter drops, secret codes, envelopes stuffed with cash, and the subtle art of recruitment without actually using that word — the vast majority of his intelligence contacts would have blanched at the label of “spy,” having all manner of other ways of defining to themselves and others what they did — were now his daily stock in trade.

In the summer of 1953, Colby and his family left Stockholm for Rome. Still riven by discontent and poverty that the Marshall Plan had never quite been able to quell, with a large and popular communist party that promised the people that it alone could make things better, Italy was considered by both the United States and the Soviet Union to be the European country most in danger of changing sides in the Cold War through the ballot box, making this assignment an unusually crucial one. Once again, Colby performed magnificently. Through means fair and occasionally slightly foul, he propped up Italy’s Christian Democratic Party, the one most friendly to American interests. His wife and five young children would remember these years as their happiest time together, with the Colosseum visible outside their snug little apartment’s windows, with the trappings of their Catholic faith all around them. The sons became altar boys, learning to say Mass in flawless Latin, and Barbara amazed guests with her voluble Italian, which was even better than her husband’s.

She and her children would gladly have stayed in Rome forever, but after five years there her husband was growing restless. The communist threat in Italy had largely dissipated by now, thanks to an improving economy that made free markets seem more of a promise than a threat, and Colby was itching to continue the shadowy struggle elsewhere. In 1958, he was recalled to the States to begin preparing for a new, more exotic assignment: to the tortured Southeast Asian country of Vietnam, which had recently won its independence from France, only to become a battleground between the Western-friendly government of Ngo Dinh Diem and a communist insurgency led by Ho Chi Minh.


Oleg Kalugin (center) at Columbia University, 1958.

While Colby was hitting the books at CIA headquarters in Langley, Virginia, in preparation for his latest assignment, Kalugin was doing the same as a philology student on a Fulbright scholarship to New York City’s Columbia University. (Fully half of the eighteen exchange students who traveled with him were also spies-in-training.) A natural charmer, he had no trouble ingratiating himself with the native residents of the Big Apple as he had been ordered to do.

He went home when his one-year scholarship expired, but returned to New York City one year after that, to work as a journalist for Radio Moscow. Now, however, his superiors expected a bit more from him. Despite the wife and young daughter he had left behind, he seduced a string of women who he believed could become valuable informants — so much so that American counter-espionage agents, who were highly suspicious of him, labeled him a “womanizer” and chalked it up as his most obvious weakness, should they ever be in need of one to exploit. (For his part, Kalugin writes that “I always told my officers, male and female, ‘Don’t be afraid of sex.’ If they found themselves in a situation where making love with a foreigner could help our work, I advised them to hop into bed.”)

Kalugin’s unlikely career as Radio Moscow’s foreign correspondent in New York City lasted almost four years in all. He covered — with a pro-Soviet spin, naturally — the election of President John F. Kennedy, the trauma of the Bay of Pigs Invasion and the Cuban Missile Crisis, and the assassination of Kennedy by a man with Soviet ties. He was finally called home in early 1964, his superiors having decided he was now attracting too much scrutiny from the Americans. He found returning to the dingy streets of Moscow from the Technicolor excitement of New York City to be rather dispiriting. “Worshiping communism from afar was one thing. Living in it was another thing altogether,” he writes wryly, echoing sentiments shared by many an idealistic Western defector for the cause. Shortly after his return, the reform-minded Nikita Khrushchev was ousted in favor of Leonid Brezhnev, a man who looked as tired as the rest of the Soviet Union was beginning to feel. It was hard to remain committed to the communist cause in such an environment as this, but Kalugin continued to do his best.


William Colby, looking rather incongruous in his typical shoe salesman’s outfit in a Vietnamese jungle.

William Colby might have been feeling similar sentiments somewhere behind that chiseled granite façade of his. For he was up to his eyebrows in the quagmire that was Vietnam, the place where all of the world’s idealism seemed to go to die.

When he had arrived in the South Vietnamese capital of Saigon in 1959, with his family in tow as usual, he had wanted to treat this job just as he had his previous foreign postings, to work quietly behind the scenes to support another basically friendly foreign government with a communist problem. But Southeast Asia was not Europe, as he learned to his regret — even if the Diem family were Catholic and talked among themselves in French. There were systems of hierarchy and patronage inside the leader’s palace that baffled Colby at every turn. Diem himself was aloof, isolated from the people he ruled, while Ho Chi Minh, who already controlled the northern half of the country completely and had designs on the rest of it, had enormous populist appeal. The type of espionage Colby had practiced in Sweden and Italy — all mimeographed documents and furtive meetings in the backs of anonymous cafés — would have been useless against such a guerilla insurgency even if it had been possible. Which it was not: the peasants fighting for and against the communists were mostly illiterate.

Colby’s thinking gradually evolved, to encompass the creation of a counter-insurgency force that could play the same game as the communists. His mission in the country became less an exercise in pure espionage and overt and covert influencing than one in paramilitary operations. He and his family left Vietnam for Langley in the summer of 1962, but the country was still to fill a huge portion of Colby’s time; he was leaving to become the head of all of the CIA’s Far Eastern operations, and there was no hotter spot in that hot spot of the world than Vietnam. Before departing, the entire Colby family had dinner with President Diem in his palace, whose continental cuisine, delicate furnishings, and manicured gardens could almost lead one to believe one was on the French Riviera rather than in a jungle in Southeast Asia. “We sat there with the president,” remembers Barbara. “There was really not much political talk. Yet there was a feeling that things were not going well in that country.”

Sixteen months later — in fact, just twenty days before President Kennedy was assassinated — Diem was murdered by the perpetrators of a military coup that had gone off with the tacit support of the Americans, who had grown tired of his ineffectual government and felt a change was needed. Colby was not involved in that decision, which came down directly from the Kennedy White House to its ambassador in the country. But, good soldier that he was, he accepted it after it had become a fait accompli. He even agreed to travel to Vietnam in the immediate aftermath, to meet with the Vietnamese generals who had perpetrated the coup and assure them that they had powerful friends in Washington. Did he realize in his Catholic heart of hearts that his nation had forever lost the moral high ground in Vietnam on the day of Diem’s murder? We cannot say.

The situation escalated quickly under the new President Lyndon Johnson, as more and more American troops were sent to fight a civil war on behalf of the South Vietnamese, a war which the latter didn’t seem overly inclined to fight for themselves. Colby hardly saw his family now, spending months at a stretch in the country. Lawrence of Arabia’s prescription for winning over a native population through ethical persuasion and cultural sensitivity was proving unexpectedly difficult to carry out in Vietnam, most of whose people seemed just to want the Americans to go away. It appeared that a stronger prescription was needed.

Determined to put down the Viet Cong — communist partisans in the south of the country who swarmed over the countryside, killing American soldiers and poisoning their relations with the locals — Colby introduced a “Phoenix Program” to eliminate them. It became without a doubt the biggest of all the moral stains on his career. The program’s rules of engagement were not pretty to begin with, allowing for the extra-judicial execution of anyone believed to be in the Viet Cong leadership in any case where arresting him was too “hard.” But it got entirely out of control in practice, as described by James S. Olson and Randy W. Roberts in their history of the war: “The South Vietnamese implemented the program aggressively, but it was soon laced with corruption and political infighting. Some South Vietnamese politicians identified political enemies as Viet Cong and sent Phoenix hit men after them. The pressure to identify Viet Cong led to a quota system that incorrectly labeled many innocent people the enemy.” Despite these self-evident problems, the Americans kept the program going for years, saying that its benefits were worth the collateral damage. Olson and Roberts estimate that at least 20,000 people lost their lives as a direct result of Colby’s Phoenix Program. A large proportion of them — possibly even a majority — were not really communist sympathizers at all.

In July of 1971, Colby was hauled before the House Committee on Government Operations by two prominent Phoenix critics, Ogden Reid and Pete McCloskey (both Republicans). It is difficult to absolve him of guilt for the program’s worst abuses on the basis of his circuitous, lawyerly answers to their straightforward questions.

Reid: Can you state categorically that Phoenix has never perpetrated the premeditated killing of a civilian in a noncombat situation?

Colby: No, I could not say that, but I do not think it happens often. Individual members of it, subordinate people in it, may have done it. But as a program, it is not designed to do that.

McCloskey: Did Phoenix personnel resort to torture?

Colby: There were incidents, and they were treated as an unjustifiable offense. If you want to get bad intelligence, you use bad interrogation methods. If you want to get good intelligence, you had better use good interrogation methods.


Oleg Kalugin (right) receives from Bulgarian security minister Dimitri Stoyanov the Order of the Red Star, thanks largely to his handling of John Walker. The bespectacled man standing between and behind the two is Yuri Andropov, then the head of the KGB, who would later become the fifth supreme leader of the Soviet Union.

During the second half of the 1960s, Oleg Kalugin spent far more time in the United States than did William Colby. He returned in July of 1965 to the nation that had begun to feel like more of a home than his own. This time, however, he went to Washington, D.C., instead of New York City. His new cover was that of a press officer for the Soviet Foreign Ministry; his real job was that of a deputy director in the KGB’s Washington operation. He was to be a spy in the enemy’s city of secrets. “By all means, don’t treat it as a desk job,” he was told.

Kalugin took the advice to heart. He had long since developed a nose for those who could be persuaded to share their country’s deepest secrets with him, long since recognized that the willingness to do so usually stemmed from weakness rather than strength. Like a lion on the hunt, he had learned to spot the weakest prey — the nursers of grudges and harborers of regrets; the sexually, socially, or professionally frustrated — and isolate them from the pack of their peers for one-on-one persuasion. At one point, he came upon a secret CIA document that purported to explain the psychology of those who chose to spy for that yin to his own service’s yang. He found it to be so “uncannily accurate” a description of the people he himself recruited that he squirreled it away in his private files, and quoted from it in his memoir decades later.

Acts of betrayal, whether in the form of espionage or defection, are almost in every case committed by morally or psychologically unsteady people. Normal, psychologically stable people — connected with their country by close ethnic, national, cultural, social, and family ties — cannot take such a step. This simple principle is confirmed by our experience of Soviet defectors. All of them were single. In every case, they had a serious vice or weakness: alcoholism, deep depression, psychopathy of various types. These factors were in most cases decisive in making traitors out of them. It would only be a slight exaggeration to say that no [CIA] operative can consider himself an expert in Soviet affairs if he hasn’t had the horrible experience of holding a Soviet friend’s head over the sink as he poured out the contents of his stomach after a five-day drinking bout.

What follows from that is that our efforts must mostly be directed against weak, unsteady members of Soviet communities. Among normal people, we should pay special attention to the middle-aged. People that age are starting their descent from their psychological peak. They are no longer children, and they suddenly feel the acute realization that their life is passing, that their ambitions and youthful dreams have not come true in full or even in part. At this age comes the breaking point of a man’s career, when he faces the gloomy prospect of pending retirement and old age. The “stormy forties” are of great interest to an [intelligence] operative.

It’s great to be good, but it’s even better to be lucky. John Walker, the spy who made Kalugin’s career, shows the truth in this dictum. He was that rarest of all agents in the espionage trade: a walk-in. A Navy officer based in Norfolk, Virginia, he drove into Washington one day in late 1967 with a folder full of top-secret code ciphers on the seat of his car next to him, looked up the address of the Soviet embassy in the directory attached to a pay phone, strode through the front door, plunked his folder down on the front desk, and said matter-of-factly, “I want to see the security officer, or someone connected with intelligence. I’m a naval officer. I’d like to make some money, and I’ll give you some genuine stuff in return.” Walker was hastily handed a down payment, ushered out of the embassy, and told never under any circumstances to darken its doors again. He would be contacted in other ways if his information checked out.

Kalugin was fortunate enough to be ordered to vet the man. The picture he filled in was sordid, but it passed muster. Thirty years old when his career as a spy began, Walker had originally joined the Navy to escape being jailed for four burglaries he committed as a teenager. A born reprobate, he had once tried to convince his wife to become a prostitute in order to pay off the gambling debts he had racked up. Yet he could also be garrulous and charming, and had managed to thoroughly conceal his real self from his Navy superiors. A fitness report written in 1972, after he had already been selling his country’s secrets for almost five years, calls him “intensely loyal, taking great pride in himself and the naval service, fiercely supporting its principles and traditions. He possesses a fine sense of personal honor and integrity, coupled with a great sense of humor.” Although he was only a warrant officer in rank, he sat on the communications desk at Norfolk, handling radio traffic with submarines deployed all over the world. It was hard to imagine a more perfect posting for a spy. And this spy required no counseling, needed no one to pretend to be his friend, to talk him down from crises of conscience, or to justify himself to himself. Suffering from no delusions as to who and what he was, all he required was cold, hard cash. A loathsome human being, he was a spy handler’s dream.

Kalugin was Walker’s primary handler for two years, during which he raked in a wealth of almost unbelievably valuable information without ever meeting the man face to face. Walker was the sort of asset who turns up “once in a lifetime,” in the words of Kalugin himself. He became the most important of all the spies on the Kremlin’s payroll, even recruiting several of his family members and colleagues to join his ring. “K Mart has better security than the Navy,” he laughed. He would continue his work long after Kalugin’s time in Washington was through. Throughout the 1970s and into the 1980s, Navy personnel wondered at how the Soviets always seemed to know where their ships and submarines were and where their latest exercises were planned to take place. Not until 1985 was Walker finally arrested. In a bit of poetic justice, the person who turned him in to the FBI was his ex-wife, whom he had physically and sexually abused for many years.

The luster which this monster shed on Kalugin led to the awarding of the prestigious Order of the Red Star, and then, in 1974, his promotion to the rank of KGB general while still just shy of his 40th birthday, making him the youngest general in the post-World War II history of the service. By that time, he was back in Moscow again, having been recalled in January of 1970, once again because it was becoming common knowledge among the Americans that his primary work in their country was that of a spy. He was too hot now to be given any more long-term foreign postings. Instead he worked out of KGB headquarters in Moscow, dealing with strategic questions and occasionally jetting off to far-flung trouble spots to be the service’s eyes and ears on the ground. “I can honestly say that I loved my work,” he writes in his memoir. “My job was always challenging, placing me at the heart of the Cold War competition between the Soviet Union and the United States.” As ideology faded, the struggle against imperialism had become more of an intellectual fascination — an intriguing game of chess — than a grand moral crusade.


William Colby testifies before Congress, 1975.

William Colby too was now back in his home country on a more permanent basis, having been promoted to executive director of the CIA — the third highest position on the agency’s totem pole — in July of 1971. Yet he was suffering through what must surely have been the most personally stressful period of his life since he had dodged Nazis as a young man behind enemy lines.

In April of 1973, his 23-year-old daughter Catherine died of anorexia. Her mental illness was complicated, as such illnesses always are, but many in the family believed it to have been aggravated by the burden of being the daughter of the architect of the Phoenix Program, a man who was, in the eyes of much of her hippie generation, Evil Incarnate. His marriage was now, in the opinion of his biographer Randall Bennett Woods, no more than a “shell.” Barbara blamed him not only for what he had done in Vietnam but for failing to be there with his family when his daughter needed him most, for forever skipping out on them with convenient excuses about duty and service on his lips.

Barely a month after Catherine’s death, Colby got a call from Alexander Haig, chief of staff in Richard Nixon’s White House: “The president wants you to take over as director of the CIA.” It ought to have been the apex of his professional life, but somehow it didn’t seem that way under current conditions. At the time, the slow-burning Watergate scandal was roiling the CIA almost more than the White House. Because all five of the men who had been arrested attempting to break into the Democratic National Committee’s headquarters the previous year had connections to the CIA, much of the press was convinced it had all been an agency plot. Meanwhile accusations about the Phoenix Program and other CIA activities, in Vietnam and elsewhere, were also flying thick and fast. The CIA seemed to many in Congress to be an agency out of control, ripe only for dismantling. And of course Colby was still processing the loss of his daughter amidst it all. It was a thankless promotion if ever there was one. Nevertheless, he accepted it.

Colby would later claim that he knew nothing of the CIA’s many truly dirty secrets before stepping into the top job. These were the ones that other insiders referred to as the “family jewels”: its many bungled attempts to assassinate Fidel Castro as well as various other sovereign foreign leaders; the coups it had instigated against lawfully elected foreign governments; its experiments with mind control and psychedelic drugs on unwilling and unwitting human subjects; its unlawful wiretapping and surveillance of scores of Americans; its longstanding practice of opening mail passing between the United States and less-than-friendly nations. That Colby could have risen so high in the agency without knowing these secrets and many more seems dubious on the face of it, but it is just possible; the CIA was very compartmentalized, and Colby had the reputation of being a bit of a legal stickler, just the type who might raise awkward objections to such delicate necessities. “Colby never became a member of the CIA’s inner club of mandarins,” claims the agency’s historian Harold Ford. But whether he knew about the family jewels or not beforehand, he was stuck with them now.

Perhaps in the hope that he could make the agency’s persecutors go away if he threw them just a little red meat, Colby came clean about some of the dodgy surveillance programs. But that only whetted the public’s appetite for more revelations. For as the Watergate scandal gradually engulfed the White House and finally brought down the president, as it became clear that the United States had invested more than $120 billion and almost 60,000 young American lives into South Vietnam only to see it go communist anyway, the public’s attitude toward institutions like the CIA was not positive; a 1975 poll placed the CIA’s approval rating at 14 percent. President Gerald Ford, the disgraced Nixon’s un-elected replacement, was weak and unable to protect the agency. Indeed, a commission chaired by none other than Vice President Nelson Rockefeller laid bare many of the family jewels, holding back only the most egregious incidents of meddling in foreign governments. But even those began to come out in time. Both major political parties had their sights set on future elections, and thus had a strong motivation to blame a rogue CIA for any and all abuses by previous administrations. (Attorney General Robert F. Kennedy, for example, had personally ordered and supervised some of the attempts on Fidel Castro’s life during the early 1960s.)

It was a no-win situation for William Colby. He was called up to testify in Congress again and again, to answer questions in the mold of “When did you stop beating your wife?”, as he put it to colleagues afterward. Everybody seemed to hate him: right-wing hardliners because they thought he was giving away the store (“It is an act of insanity and national humiliation,” said Secretary of State Henry Kissinger, “to have a law prohibiting the president from ordering assassinations”), left-wingers and centrists because they were sure he was hiding everything he could get away with and confessing only to that which was doomed to come out anyway — which was probably true. Colby was preternaturally cool and unflappable at every single hearing, which somehow only made everyone dislike him that much more. Some of his few remaining friends wanted to say that his relative transparency was a product of Catholic guilt — over the Phoenix Program, over the death of his daughter, perchance over all of the CIA’s many sins — but it was hard to square that notion with the rigidly composed, lawyerly presence that spoke in clipped, minimalist phrases before the television cameras. He seemed more like a cold fish than a repentant soul.

On November 1, 1975 — six months after Saigon had fallen, marking the humiliating final defeat of South Vietnam at the hands of the communists — William Colby was called into the White House by President Ford and fired. “There goes 25 years just like that,” he told Barbara when he came home in a rare display of bitterness. His replacement was George Herbert Walker Bush, an up-and-coming Republican politician who knew nothing about intelligence work. President Ford said such an outsider was the only viable choice, given the high crimes and misdemeanors with which all of the rank and file of the CIA were tarred. And who knows? Maybe he was right. Colby stayed on for three more months while his green replacement got up to speed, then left public service forever.


An Oleg Kalugin campaign poster from 1990, after he reinvented himself as a politician. “Let’s vote for Oleg Kalugin!” reads the caption.

Oleg Kalugin was about to suffer his own fall from grace. According to his account, his rising star flamed out when he ventured out on a limb to support a defector from the United States, one of his own first contacts as a spy handler, who was now accused of stealing secrets for the West. The alleged double agent was sent to a Siberian prison despite Kalugin’s advocacy. Suspected now of being a CIA mole himself, Kalugin was reassigned in January of 1980 to a dead-end job as deputy director of the KGB’s Leningrad branch, where he would be sure not to see too much valuable intelligence. You live by the sword, you die by the sword; duplicity begets suspicions of duplicity, such that spies always end up eating their own if they stay in the business long enough.

Again according to Kalugin himself, it was in Leningrad that his nagging doubts about the ethics and efficacy of the Soviet system — the same ones that had been whispering at the back of his mind since the early 1960s — rose to a roar which he could no longer ignore. “It was all an elaborately choreographed farce, and in my seven years in Leningrad I came to see that we had created not only the most extensive totalitarian state apparatus in history but also the most arcane,” he writes. “Indeed, the mind boggled that in the course of seven decades our communist leaders had managed to construct this absurd, stupendous, arcane ziggurat, this terrifyingly centralized machine, this religion that sought to control all aspects of life in our vast country.” We might justifiably wonder that it took him so long to realize this, and note with some cynicism that his decision to reject the system he had served all his life came only after that system had already rejected him. He even confesses that, when Leonid Brezhnev died in 1982 and was replaced by Yuri Andropov, a former head of the KGB who had always thought highly of Kalugin, he wasn’t above dreaming of a return to the heart of the action in the intelligence service. But it wasn’t to be. Andropov soon died, to be replaced by another tired old man named Konstantin Chernenko who died even more quickly, and then Mikhail Gorbachev came along to accidentally dismantle the Soviet Union in the name of saving it.

In January of 1987, Kalugin was given an even more dead-end job, as a security officer in the Academy of Sciences in Moscow. From here, he watched the extraordinary events of 1989, as country after country in the Soviet sphere rejected its communist government, until finally the Berlin Wall fell, taking the Iron Curtain down with it. Just like that, the Cold War was over, with the Soviet Union the undeniable loser. Kalugin must surely have regarded this development with mixed feelings, given what a loyal partisan he had once been for the losing side. Nevertheless, on February 26, 1990, he retired from the KGB. After picking up his severance check, he walked a few blocks to the Institute of History and Archives, where a group of democracy activists had set up shop. “I want to help the democratic movement,” he told them, in a matter-of-fact tone uncannily similar to that of John Walker in a Soviet embassy 22 years earlier. “I am sure that my knowledge and experience will be useful. You can use me in any capacity.”

And so Oleg Kalugin reinvented himself as an advocate for Russian democracy. A staunch supporter of Boris Yeltsin and his post-Soviet vision for Russia, he became an outspoken opponent of the KGB, which still harbored in its ranks many who wished to return the country to its old ways. He was elected to the Supreme Soviet in September of 1990, in the first wave of free and fair elections ever held in Russia. When some of his old KGB colleagues attempted a coup in August of 1991, he was out there manning the barricades for democracy. The coup was put down — just.


William Colby in his later years, enjoying his sailboat, one of his few sources of uncalculated joy.

William Colby too had to reinvent himself after the agency he served declared that it no longer needed him. He wrote a circumspect, slightly anodyne memoir about his career; its title of Honorable Men alone was enough to tell the world that it wasn’t the tell-all book from an angry, spurned spy that some might have been hoping for. He consulted for the government on various issues for larger sums than he had ever earned as a regular federal employee, appeared from time to time as an expert commentator on television, and wrote occasional opinion pieces for the national press, most commonly about the ongoing dangers posed by nuclear weapons and the need for arms-control agreements with the Soviet Union.

In 1982, at the age of 62, this stiff-backed avatar of moral rectitude fell in love with a pretty, vivacious 37-year-old, a former American ambassador to Grenada named Sally Shelton. It struck those who knew him as almost a cliché of a mid-life crisis, of the sort that the intelligence services had been exploiting for decades — but then, clichés are clichés for a reason, aren’t they? “I thought Bill Colby had all the charisma of a shoe clerk,” said one family friend. “Sally is a very outgoing woman, even flamboyant. She found him a sex object, and with her he was.” The following year, Colby asked his wife Barbara for a divorce. She was taken aback, even if their marriage hadn’t been a particularly warm one in many years. “People like us don’t get a divorce!” she exclaimed — meaning, of course, upstanding Catholic couples of the Greatest Generation who were fast approaching their 40th wedding anniversary. But there it was. Whatever else was going on behind that granite façade, it seemed that Colby felt he still had some living to do.

None of Colby’s family attended the marriage ceremony, or had much to do with him thereafter. He lost not only his family but his faith: Sally Shelton had no truck with Catholicism, and after he married her he went to church only for weddings and funerals. Was the gain worth the loss? Only Colby knew the answer.


Old frenemies: Oleg Kalugin and William Colby flank Ken Berris, who directed the Spycraft video sequences.

Oleg Kalugin met William Colby for the first time in May of 1991, when both were attending the same seminar in Berlin — appropriately enough, on the subject of international terrorism, the threat destined to steal the attention of the CIA and the Russian FSB (the successor to the KGB) as the Cold War faded into history. The two men had dinner together, then agreed to be jointly interviewed on German television, a living symbol of bygones becoming bygones. “What do you think of Mr. Colby as a leading former figure in U.S. intelligence?” Kalugin was asked.

“Had I had a choice in my earlier life, I would have gladly worked under Mr. Colby,” he answered. The two became friends, meeting up whenever their paths happened to cross in the world.

And why shouldn’t they be friends? They had led similar lives in so many ways. Both were ambitious men who had justified their ambition as a call to service, then devoted their lives to it, swallowing any moral pangs they might have felt in the process, until the people they served had rejected them. In many ways, they had more in common with one another than with the wives and children they had barely seen for long stretches of their lives.

And how are we to judge these two odd, distant men, both so adept at the art of concealment as to seem hopelessly impenetrable? “I am not emotional,” Colby said to a reporter during his turbulent, controversy-plagued tenure as director of the CIA. “I admit it. Oh, don’t watch me like that. You’re looking for something underneath which isn’t there. It’s all here on the surface, believe me.”

Our first instinct might be to scoff at such a claim; surely everyone has an inner life, a tender core they dare reveal only to those they love best. But maybe we should take Colby at his word; maybe doing so helps to explain some things. As Colby and Kalugin spouted their high-minded ideals about duty and country, they forgot those closest to them, the ones who needed them most of all, apparently believing that some undefined quality of character or special calling exempted them from all that. Journalist Neil Sheehan once said of Colby that “he would have been perfect as a soldier of Christ in the Jesuit order.” There is something noble but also something horrible about such devotion to an abstract cause. One has to wonder whether it is a crutch, a compensation for some piece of a personality that is missing.

Certainly there was an ultimate venality, an amorality to these two men’s line of work, as captured in the subtitle of the computer game they came together to make: “The Great Game.” Was it all really just a game to them? It would seem so, at least at the end. How else could Kalugin blithely state that he would have “gladly” worked with Colby, forgetting the vast gulf of ideology that lay between them? Tragically, the ante in their great game was all too often human lives. Looking back on all they did, giving all due credit to their courage and capability, it seems clear to me that the world would have been better off without their meddling. The institutions they served were full of people like them, people who thought they knew best, who thought they were that much cleverer than the rest of the world and had a right to steer its course from the shadows. Alas, they weren’t clever enough to see how foolish and destructive their arrogance was.

“My father lived in a world of secrets,” says William’s eldest son Carl Colby. “Always watching, listening, his eye on the door. He was tougher, smarter, smoother, and could be crueler than anybody I ever knew. I’m not sure he ever loved anyone, and I never heard him say anything heartfelt.” Was William Colby made that way by the organization he served, or did he join the organization because he already was that way? It’s impossible to say. Yet we must be sure to keep these things in mind when we turn in earnest to the game on which Colby and Kalugin allowed their names to be stamped, and find out what it has to say about the ethical wages of being a spy.

(Sources: the books Legacy of Ashes: The History of the CIA by Tim Weiner, The Sword and the Shield: The Mitrokhin Archive and the Secret History of the KGB by Christopher Andrew and Vasili Mitrokhin, Lost Crusader: The Secret Wars of CIA Director William Colby by John Prados, Spymaster: My Thirty-Two Years in Intelligence and Espionage against the West by Oleg Kalugin, Where the Domino Fell: America and Vietnam, 1945-2010, sixth edition by James S. Olson and Randy Roberts, Shadow Warrior: William Egan Colby and the CIA by Randall B. Woods, Honorable Men: My Life in the CIA by William Colby and Peter Forbath, and Lost Victory: A Firsthand Account of America’s Sixteen-Year Involvement in Vietnam by William Colby and James McCargar; the documentary film The Man Nobody Knew: In Search of My Father, CIA Spymaster William Colby; Sierra On-Line’s newsletter InterAction of Summer 1993; Questbusters of February 1994. Online sources include “Who Murdered the CIA Chief?” by Zalin Grant at Pythia Press.)

 


Sequels in Strategy Gaming, Part 3: Heroes of Might and Magic II

New World Computing’s Heroes of Might and Magic II: The Succession Wars is different from the strategy-game sequels we’ve previously examined in this series in a couple of important ways. For one thing, it followed much more quickly on the heels of its predecessor: the first Heroes shipped in September of 1995, this follow-up just over one year later. This means that it doesn’t represent as dramatic a purely technological leap as do Civilization II and Master of Orion II; Heroes I as well was able to take advantage of SVGA graphics, CD-ROM, and all the other transformations the average home computer underwent during the first half of the 1990s. But the Heroes series as a whole is also conceptually different from the likes of Civilization and Master of Orion. It’s a smaller-scale affair, built around human-crafted rather than procedurally-generated maps, with more overt, pre-scripted narrative elements. All of these factors cause Heroes II to blur the boundaries between the fiction-driven and the systems-driven sequel. Its campaign — which is, as we’ll see, only one part of what it has to offer — is presented as a direct continuation of the story, such as it was, of Heroes I. At the same time, though, it strikes me as safe to say that no one bought the sequel out of a burning desire to find out what happens to the sons of Lord Morglin Ironfist, the star of the first game’s sketchy campaign. They rather bought it because they wanted a game that did what Heroes I had done, only even better. And fortunately for them, this is exactly what they got.

Heroes II doesn’t revamp its predecessor to the point of feeling like a different game entirely, as Master of Orion II arguably does. The scant amount of time separating it from its inspiration wouldn’t have allowed for that even had its creators wished it. It would surely not have appeared so quickly — if, indeed, it had ever appeared at all — absent the new trend of strategy-game sequels. But as it was, Jon Van Caneghem, the founder of New World Computing and the mastermind of Heroes I and II, approached it as he had his earlier Might and Magic CRPG series, which had seen five installments by the time he (temporarily) shifted his focus to strategy gaming. “We weren’t making a sequel for the first time,” says the game’s executive producer Mark Caldwell. “So we did as we always did. Take ideas we couldn’t use or didn’t have time to implement in the previous game and work them into the next one. Designing a computer game, at least at [New World], was always about iterating. Just start, get something working, then get feedback and iterate.” In the case of Heroes II, it was a matter of capitalizing on the first game’s strengths — some of which hadn’t been entirely clear to its own makers until it was released and gamers everywhere had fallen in love with it — and punching up its relatively few weaknesses.

Ironically, the aspects of Heroes I that people seemed to appreciate most of all were those that caused it to most resemble the Might and Magic CRPGs, whose name it had borrowed more for marketing purposes than out of any earnest belief that it was some sort of continuation of that line. Strategy designers at this stage were still in the process of learning how the inclusion of individuals with CRPG-like names and statistics, plus a CRPG-like opportunity to level them up as they gained experience, could allow an often impersonal-feeling style of game to forge a closer emotional connection with its players. The premier examples before Heroes I were X-COM, which had the uncanny ability to make the player’s squad of grizzled alien-fighting soldiers feel like family, and Master of Magic, whose own fantasy heroes proved to be so memorable that they almost stole the show, much to the surprise of that game’s designer. Likewise, Jon Van Caneghem had never intended for the up to eight heroes you can recruit to your cause in Heroes of Might and Magic to fill as big a place in players’ hearts as they did. He originally thought he was making “a pure strategy game that was meant to play and feel like chess.” But a rudimentary leveling system along with names and character portraits for the heroes, all borrowed to some extent from his even earlier strategy game King’s Bounty, sneaked in anyway, and gamers loved it. The wise course was clearly to double down on the heroes in Heroes II.

Thus we get here a much more fleshed-out system for building up the capabilities of our fantasy subordinates, and building up our emotional bond with them in the process. The spells they can cast, in combat and elsewhere, constitute the single most extensively revamped part of the game. Not only are there many more of them, but they’ve been slotted into a magic system complex enough for a full-fledged CRPG, with spell books and spell points and all the other trimmings. And there are now fourteen secondary skills outside the magic system for the heroes to learn and improve as they level up, from Archery (increases the damage a hero’s minions do in ranged combat) to Wisdom (allows the hero to learn higher-level spells), from Ballistics (increases the damage done by the hero’s catapults during town sieges) to Scouting (lets the hero see farther when moving around the map).

Another thing that CRPGs had and strategy games usually lacked was a strong element of story. Heroes I did little more than gesture in that direction; its campaign was a set of generic scenarios that were tied together only by the few sentences of almost equally generic text in a dialog box that introduced each of them. The campaign in Heroes II, on the other hand, goes much further. It’s the story of a war between the brothers Roland and Archibald, good and evil respectively, the sons of the protagonist of Heroes I. You can choose whose side you wish to fight on at the outset, and even have the opportunity to switch sides midstream, among other meta-level decisions. Some effects and artifacts carry over from scenario to scenario during the campaign, giving the whole experience that much more of a sense of continuity. And the interstices between the scenarios are filled with illustrations and voice-over narration. The campaign isn’t The Lord of the Rings by any means — if you’re like me, you’ll have forgotten everything about the story roughly one hour after finishing it — but it more than serves its purpose while you’re playing.

Instead of just a few lines of text, each scenario in the campaign game is introduced this time by some lovely pixel art and a well-acted voice-over.

These are the major, obvious improvements, but they’re joined by a host of smaller ones that do just as much in the aggregate to make Heroes II an even better, richer game. When you go into battle, the field of action has been made bigger — or, put another way, the hexes on which the combatants stand have been made smaller, giving space for twice as many of them on the screen. This results in engagements that feel less cramped, both physically and tactically; ranged weapons especially come into their own when given more room to roam, as it were. The strategic maps too can be larger, up to four times so — or for that matter smaller, again up to four times so. This creates the potential for scenarios with wildly different personalities, from vast open-world epics to claustrophobic cage matches.

The tactical battlefields are larger now, with a richer variety of spells to employ in ingenious combinations.

And then, inevitably, there’s simply more stuff everywhere you turn. There are two new factions to play as or against, making a total of six in all; more types of humans, demi-humans, and monsters to recruit and fight against; more types of locations to visit on the maps; more buildings to construct in your towns; way more cool weapons and artifacts to discover and give to your heroes; and more (and more varied) standalone scenarios to play in addition to the campaign.

One of the two new factions is the necromancers, who make the returning warlocks seem cute and cuddly by comparison. Necromancer characters start with, appropriately enough, the “necromancy” skill, which gives them the potential to raise huge armies of skeletons from their opponents’ corpses after they’ve vanquished them. This has been called unbalancing, and it probably is in at least some situations, but it’s also a heck of a lot of fun, not to mention the key to beating a few of the most difficult scenarios.

The other new faction is the wizards. They can eventually recruit lightning-flinging titans, who are, along with the warlocks’ black dragons, the most potent single units in the game.

After Heroes II was released, New World delegated the task of making an expansion pack to Cyberlore Studios, an outfit with an uncanny knack for playing well with others’ intellectual property. (At the time, Cyberlore had just created a well-received expansion pack for Blizzard’s Warcraft II.) Heroes of Might and Magic II: The Price of Loyalty, the result of Cyberlore’s efforts, comes complete with not one but four new campaigns, each presented with the same lavishness as the chronicles of Roland and Archibald, along with still more new creatures, locations, artifacts, and standalone scenarios. All are welcome.

There are even riddles. Sigh… you can’t win them all, I guess. I feel about riddles in games the way Indiana Jones does about snakes in archaeological sites.

But wait, I can hear you saying: didn’t you just complain in those articles about Civilization II and Master of Orion II that just adding more stuff doesn’t automatically or even usually make a game better? I did indeed, and I’ve been thinking about why my reaction to Heroes II is so different. Some of it undoubtedly comes down to purely personal preferences. The hard truth is that I’ve always been more attracted to Civilization as an idea than as an actual game; impressive as Civilization I was in the context of its time, I’ll go to my grave insisting that there are even tighter, even more playable designs than that one in the Sid Meier canon, such as Pirates! and Railroad Tycoon. I’m less hesitant to proclaim Master of Orion I a near-perfect strategy masterpiece, but my extreme admiration for it only makes me unhappy with the sequel, which removed some of the things I liked best about the original in favor of new complexities that I find less innovative and less compelling. I genuinely love Heroes I as well — and thus love the sequel even more for not trying to reinvent this particular wheel, for trying only to make it glide along that much more smoothly.

I do think I can put my finger on some objective reasons why Heroes II manages to add so much to its predecessor’s template without adding any more tedium. It’s starting from a much sparser base, for one thing; Heroes I is a pretty darn simple beast as computer strategy games of the 1990s go. The places where Heroes II really slathers on the new features — in the realms of character development, narrative, and to some extent tactical combat — are precisely those where its predecessor feels most under-developed. The rest of the new stuff, for all its quantity, adds variety more so than mechanical complexity or playing time. A complete game of Heroes II doesn’t take significantly longer to play than a complete game of Heroes I (unless you’re playing on one of those new epic-sized maps, of course). That’s because you won’t even see most of the new stuff in any given scenario. Heroes II gives its scenario designers a toolbox with many more bits and pieces to choose from, so that the small subset of them you see as a player each time out is always fresh and surprising. You have to play an awful lot of scenarios to exhaust all this game has to offer. It, on the other hand, will never exhaust you with fiddly details.

Anyway, suffice to say that I love Heroes II dearly. I’ve without a doubt spent more hours with it than any other game I’ve written about on this site to date. One reason for that is that my wife, who would never be caught dead playing a game like Civilization II or Master of Orion II, likes this one almost as much as I do. We’ve whiled away many a winter evening in multiplayer games, sitting side by side on the sofa with our laptops. (If that isn’t a portrait of the modern condition, I don’t know what is…) Sure, Heroes II is a bit slow to play this way by contemporary standards, being turn-based, and with consecutive rather than simultaneous turns at that, but that’s what good tunes on the stereo are for, isn’t it? We don’t like to fight each other, so we prefer the scenarios that let us cooperate — another welcome new feature. Or, failing that, we just agree to play until we’re the only two factions left standing.

What makes Heroes II such a great game in the eyes of both of us, and such a superb example of an iterative sequel done well? Simply put, everything that was fun in the first game is even more fun in the sequel. It still combines military strategy with the twin joys of exploration and character development in a way that I’ve never seen bettered. (My wife, bless her heart, is more interested in poking her head into every nook and cranny of a map and accessorizing her heroes like they’ve won a gift certificate to Lord & Taylor than she is in actually taking out the enemy factions, which means that’s usually down to me…) The strengthened narrative elements, not only between but within scenarios — a system of triggers now allows the scenario designer to advance the story even as you play — only make the stew that much richer. Meanwhile the whole game is exquisitely polished, showing in its interface’s every nuance the hours and hours of testing and iterating that went into it before its release.

In this respect and many others, the strengths of Heroes II are the same as those of Heroes I. Both, for example, manage to dodge some of the usual problems of grand-strategy games by setting their sights somewhat lower than the 4X likes of Civilization and Master of Orion. There is no research tree here, meaning that the place where 4X strategizing has a tendency to become most rote is neatly sidestepped. Very, very little is rote about Heroes; many of its human-designed maps are consciously crafted to force you to abandon your conventional thinking. (The downside of this is a certain puzzle-like quality to some of the most difficult scenarios — a One True Way to Win that you must discover through repeated attempts and repeated failures — but even here, the thrill of figuring them out outweighs the pain if you ask me.) Although the problem of the long, anticlimactic ending — that stretch of time after you know you’re going to win — isn’t entirely absent, some scenarios do have alternative victory conditions, and the fact that most of them can be played in a few hours at most from start to finish helps as well. The game never gets overly bogged down by tedious micromanagement, thanks to some sagacious limits that have been put in place, most notably the maximum of eight heroes you’re allowed to recruit, meaning that you can never have more than eight armies in the field. (It’s notable and commendable that New World resisted the temptation that must surely have existed to raise this limit in Heroes II.) The artificial intelligence of your computer opponents isn’t great by any means, but somehow even that doesn’t feel so annoying here; if the more difficult scenarios must still become that way by pitting your human cleverness against silicon-controlled hordes that vastly outnumber you, there are at least always stated reasons to hand for the disparity in a narrative-driven game like this one.

Also like its predecessor, Heroes II is an outlier among the hit games of the late 1990s, being turn-based rather than real-time, and relying on 2D pixel art rather than 3D graphics. Jon Van Caneghem has revealed in interviews that he actually did come alarmingly close to chasing both of those trends, but gave up on them for reasons having more to do with budgetary and time constraints than any sort of purist design ideology. For my part, I can only thank the heavens that such practicalities forced him to keep it old-school in the end. Heroes II still looks great today, which would probably not be the case if it were presented in jaggy 1990s 3D. New World’s artists had a distinct style, one that also marked the Might and Magic CRPG series: light, colorful, whimsical, and unabashedly cartoon-like, in contrast to the darker-hued, ultra-violent aesthetic that marked so much of the industry in the post-DOOM era. Heroes II is perhaps slightly murkier in tone and tint than the first game, but it remains a warming ray of sunshine when stood up next to its contemporaries, who always seem to be trying too hard to be an epic saga, man. Its more whimsical touches never lose their charm: the vampires who go Blahhh! like Count Chocula when they make an attack, the medusae who slink around the battlefield like supermodels on the catwalk. Whatever else you can say about it, you can never accuse Heroes of Might and Magic of taking itself too seriously.

But there is one more way that Heroes II improves on Heroes I, and it is in some senses the most important of them all. The sequel includes a scenario construction kit, the very same tool that was used to build the official maps; the only thing missing is a way to make the cut scenes that separate the campaign scenarios. It came at the perfect time to give Heroes II a vastly longer life than it otherwise would have enjoyed, even with all of its other merits.

The idea of gaming construction kits was already a venerable one by the time of Heroes II‘s release. Electronic Arts made it something of their claim to fame in their early years, with products like Pinball Construction Set, Adventure Construction Set, and Racing Destruction Set. Meanwhile EA’s affiliated label Strategic Simulations had a Wargame Construction Set and Unlimited Adventures (the latter being a way of making new scenarios for the company’s beloved Gold Box CRPG engine). But all of these products were hampered somewhat by the problem of what you the buyer were really to do with a new creation into which you had poured your imagination, talent, and time. You could share it among your immediate circle of friends, assuming they had all bought (or pirated) the same game you had, maybe even upload it to a bulletin board or two or to a commercial online service like CompuServe, but doing so only let you reach a tiny cross-section of the people who might be able and willing to play it. And this in turn led to you asking yourself an unavoidable question: is this thing I want to make really worth the effort if it will hardly get played?

The World Wide Web changed all that at a stroke, as it did so much else in computing and gaming. The rise of a free and open, easily navigable Internet meant that you could now share your creation with everyone who owned the same base game you had bought. And so gaming construction kits of all stripes suddenly became much more appealing, and were allowed to begin fulfilling their potential at last. Heroes of Might and Magic II is a prime case in point.

A bustling community of amateur Heroes II designers sprang up on the Internet after the game’s release, to stretch it in all sorts of delightful ways that New World had never anticipated. The best of the thousands of scenarios they produced are so boldly innovative as to make the official ones seem a bit dull and workmanlike by comparison. For example, “Colossal Cavern” lives up to its classic text-adventure namesake by re-imagining Heroes II as a game of dungeon delving and puzzle solving rather than strategic conquest. “Go Ask Alice,” by contrast, turns it into a game of chess with living pieces, like the one in Lewis Carroll’s Through the Looking-Glass. “The Road Home” is a desperate chase across a sprawling map with enemy armies that outnumber you by an order of magnitude hot on your heels. And “Agent of Heaven” is a full campaign — one of a surprising number created by enterprising fans — that lets you live out ancient Chinese history, from the age of Confucius through the rise of the Qin and Han dynasties; it’s spread over seven scenarios, with lengthy journal entries to read between and within them as you go along.

The scenario editor has its limits as a vehicle for storytelling, but it goes farther than you might expect. Text boxes like these feature in many scenarios, and not only as a way of introducing them. The designer can set them to appear when certain conditions are fulfilled, such as a location being visited for the first time by the player or a given number of days having gone by. In practice, the most narratively ambitious scenarios tend to be brittle and to go off the rails from a storytelling perspective as soon as you do something in the wrong order, but one can’t help but be impressed by the lengths to which some fans went. Call it the triumph of hope over experience…

As the size and creative enthusiasm of its fan community will attest, Heroes II was hugely successful in commercial terms, leaving marketers everywhere shaking their heads at its ability to be so whilst bucking the trends toward real-time gameplay and 3D graphics. I can give you no hard numbers on its sales, but anecdotal and circumstantial evidence alone would place it not too far outside the ballpark of Civilization II‘s sales of 3 million copies. Certainly its critical reception was nothing short of rapturous; Computer Gaming World magazine pronounced it “nearly perfect,” “a five-star package that will suck any strategy gamer into [a] black hole of addictive fun.” The expansion too garnered heaps of justified praise and stellar sales when it arrived some nine months after the base game. The only loser in the equation was Heroes I, a charming little game in its own right that was rendered instantly superfluous by the superior sequel in the eyes of most gamers.

Personally, though, I’m still tempted to recommend that you start with Heroes I and take the long way home, through the whole of one of the best series in the history of gaming. Then again, time is not infinite, and mileages do vary. The fact is that this series tickles my sweet spots with uncanny precision. Old man that I’m fast becoming, I prefer its leisurely turn-based gameplay to the frenetic pace of real-time strategy. At the same time, though, I do appreciate that it plays quickly in comparison to a 4X game. I love its use of human-crafted scenarios, which I almost always prefer to procedurally-generated content, regardless of context. And of course, as a dyed-in-the-wool narratological gamer, I love the elements of story and character-building that it incorporates so well.

So, come to think of it, this might not be such a bad place to start with Heroes of Might and Magic after all. Or to finish, for that matter — if only it weren’t for Heroes III. Now there’s a story in itself…

(Sources: Retro Gamer 239; Computer Gaming World of February 1997 and September 1997; XRDS: The ACM Magazine for Students of Summer 2017. Online sources include Matt Barton’s interview with Jon Van Caneghem.

Heroes of Might and Magic II is available for digital purchase on GOG.com, in a “Gold” edition that includes the expansion pack.

And here’s a special treat for those of you who’ve made it all the way down here to read the fine print. I’ve put together a zip file of all of the Heroes II scenarios from a “Millennium” edition of the first three Heroes games that was released in 1999. It includes a generous selection of fan-made scenarios, curated for quality. You’ll also find the “Agent of Heaven” campaign mentioned above, which, unlike the other three fan-made scenarios just named, wasn’t a part of the Millennium edition. To access the new scenarios, rename the folder “MAPS” in your Heroes II installation directory to something else for safekeeping, then unzip the downloaded archive into the installation directory. The next time you start Heroes II, you should find all of the new scenarios available through the standard “New Game” menu. Note that some of the more narratively ambitious new scenarios feature supplemental materials, found in the “campaigns” and “Journals” folders. Have fun!)
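For those who would rather script that folder shuffle than do it by hand, here’s a minimal Python sketch of the same steps. The two paths are placeholders of my own invention; point them at your actual Heroes II installation directory and at wherever you saved the downloaded zip file.

```python
import zipfile
from pathlib import Path

# Placeholder paths -- adjust both to match your own setup.
install_dir = Path("C:/Games/HeroesII")        # your Heroes II installation directory
archive = Path("heroes2_extra_scenarios.zip")  # the downloaded zip file

# Rename the original MAPS folder for safekeeping, as described above.
(install_dir / "MAPS").rename(install_dir / "MAPS_original")

# Unpack the new scenarios into the installation directory; the supplemental
# "campaigns" and "Journals" folders come along for the ride.
with zipfile.ZipFile(archive) as zf:
    zf.extractall(install_dir)
```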

 

Sequels in Strategy Gaming, Part 2: Master of Orion II

MicroProse had just published Master of Magic, the second grand-strategy game from the Austin, Texas-based studio SimTex, when SimCity 2000 made the world safe for numbered strategy sequels. After a quick palate cleanser in the form of a computerized version of the Avalon Hill board game 1830: Railroads & Robber Barons, Steve Barcia and the rest of the SimTex crew turned their attention to a sequel to Master of Orion, their 1993 space opera that was already widely revered as one of the finest ever examples of its breed.

Originally announced as a product for the Christmas of 1995, the sequel took one full year longer than that to actually appear. And this was, it must be said, all for the better. Master of Magic had been a rather brilliant piece of game design whose commercial prospects had been all but destroyed by its premature release in a woefully buggy state. To their credit, SimTex patched it, patched it, and then patched it some more in the months that followed, until it had realized most of its immense potential as a game. But by then the damage had been done, and what might have been an era-defining strategy game like Civilization — or, indeed, the first Master of Orion — had been consigned to the status of a cult classic. On the bright side, MicroProse did at least learn a lesson from this debacle: Master of Orion II: Battle at Antares was given the time it needed to become its best self. The game that shipped just in time for the Christmas of 1996 was polished on a surface level, whilst being relatively well-balanced and mostly bug-free under the hood.

Gamers’ expectations had changed in some very significant ways in the three years since its predecessor’s release, and not generally to said predecessor’s benefit. The industry had now completed the transition from VGA graphics, usually running at a resolution of 320 × 200, to SVGA, with its resolutions of 640 × 480 or even more. The quantitative difference belies the qualitative one. Seen from the perspective of today, the jump to SVGA strikes me as the moment when game graphics stop looking undeniably old, when they can, in the best cases at any rate, look perfectly attractive and even contemporary. Unfortunately, Master of Orion I was caught on the wrong side of this dividing line; a 1993 game like it tended to look far uglier in 1996 than, say, a 1996 game would in 1999.

So, the first and most obvious upgrade in Master of Orion II was a thoroughgoing SVGA facelift. The contrast is truly night and day when you stand the two games up side by side; the older one looks painfully pixelated and blurry, the newer one crisp and sharp, so much so that it’s hard to believe that only three years separate them. But the differences at the interface level are more than just cosmetic. Master of Orion II‘s presentation also reflects the faster processor and larger memory of the typical 1996 computer, as well as an emerging belief in this post-Windows 95 era that the interface of even a complex strategy game aimed at the hardcore ought to be welcoming, intuitive, and to whatever extent possible self-explanatory. The one we see here is a little marvel, perfectly laid out, with everything in what one intuitively feels to be its right place, with a helpful explanation never any farther away than a right click on whatever you have a question about. It takes advantage of all of the types of manipulation that are possible with a mouse — in particular, it sports some of the cleverest use of drag-and-drop yet seen in a game. In short, everything just works the way you think it ought to work, which is just about the finest compliment you can give to a user interface. Master of Orion I, for all that it did the best it could with the tools at its disposal in 1993, feels slow, jerky, and clumsy by comparison — not to mention ugly.

The home screen of Master of Orion I

…and its equivalent in Master of Orion II. One of the many benefits of a higher resolution is that even the “Huge” galaxy I’ve chosen to play in here now fits onto a single screen.

If Master of Orion II had attempted to be nothing other than a more attractive, playable version of its antecedent, plenty of the original game’s fans would doubtless have welcomed it on that basis alone. In fact, one is initially tempted to believe that this is where its ambitions end. When we go to set up a new game, what we find is pretty much what we would imagine seeing in just such a workmanlike upgrade. Once again, we’re off to conquer a procedurally generated galaxy of whatever size we like, from Small to Huge, while anywhere from two to eight other alien races are attempting to do the same. Sure, there are a few more races to play as or against this time, a new option to play as a custom race with strengths and weaknesses of our own choosing, and a few other new wrinkles here and there, but nothing really astonishing. For example, we do have the option of playing against other real people over a network now, but that was becoming par for the course in this post-DOOM era, when just about every game was expected to offer some sort of networked multiplayer support, and could expect to be dinged by the critics if it didn’t. So, we feel ourselves to be in thoroughly familiar territory when the game proper begins, greeting us with that familiar field of stars, representing yet another galaxy waiting to be explored and conquered.

Master of Orion II‘s complete disconnection from the real world can be an advantage: it can stereotype like crazy when it comes to the different races, thereby making each of them very distinct and memorable. None of us have to feel guilty for hating the Darloks for the gang of low-down, backstabbing, spying blackguards they are. If Civilization tried to paint its nationalities with such a broad brush, it would be… problematic.

But when we click on our home star, we get our first shock: we see that each star now has multiple planets instead of the single one we’re used to being presented with in the name of abstraction and simplicity. Then we realize that the simple slider bars governing each planetary colony’s output have been replaced by a much more elaborate management screen, where we decide what proportion of our population will work on food production (a commodity we never even had to worry about before), on industrial production, and on research. And we soon learn that now we have to construct each individual upgrade we wish our colony to take advantage of by slotting it into a build queue that owes more to Master of Magic — and by extension to that game’s strong influence Civilization — than it does to Master of Orion I.

By the middle and late game, your options for building stuff can begin to overwhelm; by now you’re managing dozens (or more) of individual colonies, each with its own screen like this. The game does offer an “auto-build” option, but it rarely makes smart choices; you can kiss your chances of winning goodbye if you use it on any but the easiest couple of difficulty levels. It would be wonderful if you could set up default build queues of your own and drag and drop them onto colonies, but the game’s interest in automation doesn’t extend this far.

This theme of superficial similarities obscuring much greater complexity will remain the dominant one. The mechanics of Master of Orion II are actually derived as much from Master of Magic and Civilization as from Master of Orion I. It is, that is to say, nowhere near such a straightforward extension of its forerunner as Civilization II is. It’s rather a whole new game, with whole new approaches in several places. Whereas the original Master of Orion was completely comfortable with high-level abstraction, the sequel’s natural instinct is to drill down into the details of everything it can. Does this make it better? Let’s table that question for just a moment, and look at some of the other ways in which the game has changed and stayed the same.

The old research system, which allowed you to make progress in six different fields at once by manipulating a set of proportional sliders, has been replaced by one where you can research just one technology at a time, like in Civilization. It’s one of the few places where the second game is less self-consciously “realistic” than the first; the scientific establishment of most real space-faring societies will presumably be able to walk and chew gum at the same time. But, in harking back so clearly to Civilization rather than to its own predecessor, it says much about where Steve Barcia’s head was at as he was putting this game together.

Master of Orion I injected some entropy into its systems by giving you the opportunity to research only a randomized subset of the full technology tree, forcing you to think on your feet and play the hand you were given. The sequel divides the full ladder of Progress into groupings of one to three technologies that are always the same, and lets you choose one technology — and only one — from each group for yourself rather than choosing for you. You still can’t research everything, in other words, but now it’s you who decides what does get researched. (This assumes that you aren’t playing a race with the “Creative” ability, which lets you gain access to all available technologies each step of the way, badly unbalancing the game in the process.)

The research screen in a game that’s pretty far along. We can choose to research in just one of the eight categories at a time, and must choose just one technology within that category. The others are lost to us, unless we can trade for or steal them from another race.

We’re on more familiar ground when it comes to our spaceships and all that involves them. Once again, we can design our own ships using all of the fancy technologies our scientists have recently invented, and once again we can command them ourselves in tactical battles that don’t depart all that much from what we saw in the first game. That said, even here there are some fresh complications. There’s a new “command point” system that makes the number of fleets we can field dependent on the communications infrastructure we’ve built in our empire, while now we also need to build “freighters” to move food from our bread-basket planets to those focused more on industry or research. Another new wrinkle here is the addition of “leaders,” individuals who come along to offer us their services from time to time. They’re the equivalent of Master of Magic‘s heroes, to the extent that they even level up CRPG-style over time, although they wind up being vastly less consequential and memorable than they were in that game.

Leaders for hire show up from time to time, but you never develop the bonds with them that you do with Master of Magic‘s heroes. That’s a pity; done differently, leaders might have added some emotional interest to a game that can feel a bit dry.

The last major facet of the game after colony, research, and ship management is your relationship with the other aliens you eventually encounter. Here again, we’re on fairly familiar ground, with trade treaties, declarations of war and peace and alliance, and spying for purposes of information theft or sabotage all being possible and, on the more advanced difficulty levels, necessary. We have three ways of winning the game, which is one more than in Master of Orion I. As before, we can simply exterminate all of the other empires, or we can win enough of them over through friendship or intimidation that they vote to make us the supreme leader of a Galactic Council. But we can now also travel to a different dimension and defeat a mysterious alien race called the Antarans that live there, whereupon all of the races back in our home dimension will recognize us as the superior beings we’ve just proved ourselves to be. Here there are more echoes of Master of Magic — specifically, of that game’s two planes of Arcanus and Myrror and the dimensional gates that link them together.

The workings of the Galactic Council vote are virtually unchanged from Master of Orion I.

What to make of this motley blend, which I would call approximately 50 percent Master of Orion I, 25 percent Civilization, and 25 percent Master of Magic? First, let me tell you what most fans of grand strategy think. Then, I’ll give you my own contrarian take on it.

The verdict of the masses is clear: Master of Orion II is one of the most beloved and influential strategy games of all time. As popular in the latter 1990s as any grand-strategy game not called Civilization, it’s still widely played today — much more so, I would reckon, than the likes of its contemporary Civilization II. (Certainly Master of Orion II looks far less dated today by virtue of not running under Windows and thus not using the Windows 3 widgets that Civilization II did — to say nothing of those oh-so-1990s live-action video clips the latter featured.) It’s often described as the archetypal strategic space opera, the Platonic ideal which every new space-based grand-strategy game must either imitate or kick against (or a little of both). And why not? Having received several patches back in the day to correct the few issues in its first release, it’s finely balanced (that “Creative” ability aside — and even it has been made more expensive than it used to be), rich in content, and reasonably attractive to look at even today. And on top of all that there’s a gob-smackingly good interface that hardly seems dated at all. What’s not to like?

Well… a few things, in this humble writer’s opinion. For me, the acid test for additional complexity in a game is partially whether it leads to more “interesting choices,” as Sid Meier would put it, but even more whether it makes the fiction come more alive. (I am, after all, very much an experiential player, very much in tune with Meier’s description of the ideal game of Civilization as “an epic story.”) Without one or preferably both of these qualities, added complexity just leads to added tedium in my book. In the beginning, when I’m developing only one or two planets, I can make a solid case for Master of Orion II‘s hands-on approach to colony management using these criteria. But when one or two colonies become one or two dozen, then eventually one or two hundred, the negatives rather outweigh the positives for me. Any benefits you get out of dragging all those little colonists around manually live only at the margins, as it were. For the reality is that you’ll quickly come up with a standard, rote approach to building up each new planet, and see it through as thoughtlessly as you put your shirt on each morning. At most, you might have just a few default approaches, depending on whether you want the colony to focus on agriculture, industry, or research. Only in a rare crisis, or maybe in the rare case of a truly exceptional planet, will you mix it up all that much.

Master of Orion II strikes me as emblematic of a very specific era in strategy gaming, when advances in computing hardware weren’t redounding entirely to the benefit of game design. During the 1980s and early 1990s, designs were brutally constrained by slow processors and small memories; games like the first Master of Orion (as well as such earlier space operas as the 1983 SSG classic Reach for the Stars) were forced by their circumstance to boil things down to their essentials. By 1996, however, with processor speeds starting to be measured in the hundreds of megahertz and memory in the tens of megabytes, there was much more space for bells, whistles, and finicky knob-twiddling. We can see this in Civilization II, and we can see it even more in Master of Orion II. The problem, I want to say, was that computing technology had fallen into a sort of uncanny valley: the latest hardware could support a lot more mechanical, quantitative complexity, but wasn’t yet sufficient to implement more fundamental, qualitative changes, such as automation that allows the human player to intervene only where and when she will and improved artificial intelligence for the computer players. Tellingly, this last is the place where Master of Orion II has changed least. You still have the same tiny set of rudimentary diplomatic options, and the computer players remain as simple-minded and manipulable as ever. As with so many games of this era, the higher difficulty levels don’t make the computer players smarter; they only let them cheat more egregiously, giving them ever greater bonuses to all of the relevant numbers.

There are tantalizing hints that Steve Barcia had more revolutionary ambitions for Master of Orion II at one point in time. Alan Emrich, the Computer Gaming World scribe who coined the term “4X” (“Explore, Expand, Exploit, Exterminate”) for the first game and did so much to shape it as an early play-tester that a co-designer credit might not have been out of order, was still in touch with SimTex while they worked on the second. He states that Barcia originally “envisioned a ‘layered’ design approach so that people could focus on what they wanted to play. Unfortunately, that goal wasn’t reached.” Perhaps the team fell back on what was relatively easy to do when these ambitions proved too hard to realize, or perhaps at least part of the explanation lies in another event: fairly early in the game’s development, Barcia sold his studio to his publisher MicroProse, and accepted a more hands-off executive role at the parent company. From then on, the day-to-day design work on Master of Orion II largely fell to one Ken Burd, previously the lead programmer.

For whatever reason, Master of Orion II not only fails to advance the conceptual state of the art in grand strategy, but actually backpedals on some of the important innovations of its predecessor, which had already addressed some of the gameplay problems of the then-nascent 4X genre. I lament most of all the replacement of the first game’s unique approach to research with something much more typical of the genre. By giving you the possibility of researching only a limited subset of technologies, and not allowing you to dictate what that subset consists of, Master of Orion I forced you to improvise, to build your strategy around what your scientific establishment happened to be good at. (No beam-weapon technologies? Better learn to use missiles! Weak on spaceship-range-extending technologies to colonize faraway star systems? Better wring every last bit of potential out of those closer to home!) In doing so, it ensured that every single game you played was different. Master of Orion II, by contrast, strikes me as too amenable to rote, static strategizing that can be written up almost like an adventure-game walkthrough: set up your race like this, research this, this, and this, and then you have this, which will let you do this… every single time. Once you’ve come up with a set of standard operating procedures that works for you, you’ve done so forever. After that point, “it’s hard to lose Master of Orion II,” as the well-known game critic Tom Chick admitted in an otherwise glowing 2000 retrospective.

In the end, then, the sequel is a peculiar mix of craft and complacency. By no means can one call it just a re-skinning; it does depart significantly from its antecedent. And yet it does so in ways that actually make it stand out less rather than more from other grand-strategy games of its era, thanks to the anxiety of influence.

For influence, you see, can be a funny thing. Most creative pursuits should be and are a sort of dialog. Games especially have always built upon one another, with each worthy innovation — grandly conceptual or strictly granular, it really doesn’t matter — finding its way into other games that follow, quite possibly in a more evolved form; much of what I’ve written on this very site over the past decade and change constitutes an extended attempt to illustrate that process in action. Yet influence can prove a double-edged sword when it hardens into a stultifying conventional wisdom about how games ought to be. Back in 1973, the literary critic Harold Bloom coined the term “anxiety of influence” in reference to the gravitational pull that the great works of the past can exert on later writers, convincing them to cast aside their precious idiosyncrasies out of a perceived need to conform to the way things ought to be done in the world of letters. I would argue that Civilization‘s set of approaches have cast a similar pall over grand-strategy-game design. The first Master of Orion escaped its long shadow, having been well along already by the time Sid Meier’s own landmark game was released. But it’s just about the last grand-strategy game about which that can be said. Master of Orion II reverts to what had by 1996 become the mean: a predictable set of bits and bobs for the player to busy herself with, arranged in a comfortably predictable way.

When I think back to games of Master of Orion I, I remember the big events, the lightning invasions and deft diplomatic coups and unexpected discoveries. When I think back to games of Master of Orion II, I just picture a sea of data. When there are too many decisions, it’s hard to call any of them interesting. Then again, maybe it’s just me. I know that there are players who love complexity for its own sake, who see games as big, fascinating systems to tweak and fiddle with — the more complicated the better. My problem, if problem it be, is that I tend to see games as experiences — as stories.

Ah, well. Horses for courses. If you’re one of those who love Master of Orion II — and I’m sure that category includes many of you reading this — rest assured that there’s absolutely nothing wrong with that. As for me, all this time spent with the sequel has only given me the itch to fire up the first one again…



Although I’ve never seen any hard sales numbers, all indications are that Master of Orion II was about as commercially successful as a game this time-consuming, slow-paced, and cerebral — and not named Civilization — could possibly be, most likely selling well into the hundreds of thousands of units. Yet its success didn’t lead to an especially bright future for SimTex — or MicroProse Austin, as it had now become known. In fact, the studio never managed to finish another game after it. Its last years were consumed by an expensive boondoggle known as Guardians: Agents of Justice, another brainchild of Steve Barcia: an “X-COM in tights,” with superheroes and supervillains instead of soldiers and aliens. That sounds like a pretty fantastic idea to me. But sadly, a turn-based tactical-combat game was at odds with all of the prevailing trends in an industry increasingly dominated by first-person shooters and real-time strategy; one frustrated MicroProse executive complained loudly that Barcia’s game was “slow as a pig.” It was accordingly forced through redesign after redesign, without ever arriving at anything that both satisfied the real or perceived needs of the marketers and was still fun to play. At last, in mid-1998, MicroProse pulled the plug on the project, shutting down the entirety of its brief-lived Austin-based subsidiary at the same time. And so that was that for SimTex; Master of Orion III, when it came, would be the work of a completely different group of people.

Guardians: Agents of Justice was widely hyped over the years. MicroProse plugged it enthusiastically at each of the first four E3 trade shows, and a preview was the cover story of Computer Games Strategy Plus‘s December 1997 issue. “At least Agents never graced a CGW cover,” joshed Terry Coleman of the rival Computer Gaming World just after Guardians‘s definitive cancellation.

Steve Barcia never took up the design reins of another game after conceiving Guardians: Agents of Justice, focusing instead on his new career in management, which took him to the very different milieu of the Nintendo-exclusive action-games house Retro Studios after his tenure at MicroProse ended. Some might consider this an odd, perchance even vaguely tragic fate for the designer of three of the most respected and beloved grand-strategy games of all time. On the other hand, maybe he’d just said all he had to say in game design, and saw no need to risk tarnishing his stellar reputation. Either way, his creative legacy is more than secure.

(Sources: the book The Anxiety of Influence: A Theory of Poetry by Harold Bloom; Computer Gaming World of October 1995, December 1996, March 1997, June 1997, July 1997, and October 1998; Computer Games Strategy Plus of December 1997. Online sources include Alan Emrich’s retrospective on the old Master of Orion III site and Tom Chick’s piece on Master of Orion II for IGN.

Master of Orion I and II are available as a package from GOG.com. So, you can compare and contrast, and decide for yourself whether I’m justified in favoring the original.)

 
 


Sequels in Strategy Gaming, Part 1: Civilization II

How do you make a sequel to a game that covers all of human history?

— Brian Reynolds

At the risk of making a niche website still more niche, allow me to wax philosophical for a moment on the subject of those Roman numerals that have been appearing just after the names of so many digital games almost from the very beginning. It seems to me that game sequels can be divided into two broad categories: the fiction-driven and the systems-driven.

Like so much else during gaming’s formative years, fiction-driven sequels were built off the example of Hollywood, which had already discovered that no happily ever after need ever be permanent if there was more money to be made by getting the old gang of heroes back together and confronting them with some new threat. Game sequels likewise promised their players a continuation of an existing story, or a new one that took place in a familiar setting with familiar characters. Some of the most iconic names in 1980s and early 1990s gaming operated in this mode: Zork, Ultima, Wizardry, King’s Quest, Carmen Sandiego, Leisure Suit Larry, Wing Commander. As anyone who has observed the progress of those series will readily attest, their technology did advance dramatically over the years. And yet this was only a part of the reason people stayed loyal to them. Gamers also wanted to get the next bit of story out of them, wanted to do something new in their comfortingly recognizable worlds. Unsurprisingly, the fiction-driven sequel was most dominant among games that foregrounded their fictions — namely the narrative-heavy genres of the adventure game and the CRPG.

But there was another type of sequel, which functioned less like a blockbuster Hollywood franchise and more like the version numbers found at the end of other types of computer software. It was the domain of games that were less interested in their fictions. These sequels rather promised to do and be essentially the same thing as their forerunner(s), only to do and be it even better, taking full advantage of the latest advances in hardware. Throughout the 1980s and well into the 1990s, the technology- or systems-driven sequel was largely confined to the field of vehicular simulations, a seemingly fussily specific pursuit that was actually the source in some years of no less than 25 percent of the industry’s total revenues. The poster child for the category is Microsoft’s Flight Simulator series, the most venerable in the entire history of computer gaming, being still alive and well as I write these words today, almost 43 years after it debuted on the 16 K Radio Shack TRS-80 under the imprint of its original publisher subLogic. If you were to follow this franchise’s evolution through each and every installment, from that monochrome, character-graphic-based first specimen to today’s photo-realistic feast for the senses, you’d wind up with a pretty good appreciation of the extraordinary advances personal computing has undergone over the past four decades and change. Each new Flight Simulator didn’t so much promise a new experience as the same old one perfected, with better graphics, better sound, a better frame rate, better flight modeling,  etc. When you bought the latest Flight Simulator — or F-15 Strike Eagle, or Gunship, or Falcon — you did so hoping it would take you one or two steps closer to that Platonic ideal of flying the real thing. (The fact that each installment was so clearly merely a step down that road arguably explains why these types of games have tended to age more poorly than others, and why you don’t find nearly as many bloggers and YouTubers rhapsodizing about old simulations today as you do games in most other genres.)

For a long time, the conventional wisdom in the industry held that strategy games were a poor fit with both of these modes of sequel-making. After all, they didn’t foreground narrative in the same way as adventures and CRPGs, but neither were they so forthrightly tech-centric as simulations. As a result, strategy games — even the really successful ones — were almost always standalone affairs.

But all that changed in a big way in 1993, when Maxis Software released SimCity 2000, a sequel to its landmark city-builder of four years earlier. SimCity 2000 was a systems-driven sequel in the purest sense. It didn’t attempt to be anything other than what its predecessor had been; it just tried to be a better incarnation of that thing. Designer Will Wright had done his level best to incorporate every bit of feedback he had received from players of his original game, whilst also taking full advantage of the latest hardware to improve the graphics, sound, and interface. “Is SimCity 2000 a better program than the original SimCity?” asked Computer Gaming World magazine rhetorically. “It is without question a superior program. Is it more fun than the original SimCity? It is.” Wright was rewarded for his willingness to revisit his past with another huge hit, even bigger than his last one.

Other publishers greeted SimCity 2000‘s success as something of a revelation. At a stroke, they realized that the would-be city planners and generals among their customers were as willing as the would-be pilots and submarine captains to buy a sequel that enhanced a game they had already bought before, by sprucing up the graphics, addressing exploits, incongruities, and other weaknesses, and giving them some additional complexity to sink their teeth into. For better or for worse, the industry’s mania for franchises and sequels thus came to encompass strategy games as well.

In the next few articles, I’d like to examine a few of the more interesting results of this revelation — not SimCity 2000, a game about which I have oddly little to say, but another trio that would probably never have come to be without it to serve as a commercial proof of concept. All of the games I’ll write about are widely regarded as strategy classics, but I must confess that I can find unreserved love in my heart for only one of them. As for which one that is, and the reasons for my slight skepticism about the others… well, you’ll just have to read on and see, won’t you?


Civilization, Sid Meier’s colossally ambitious and yet compulsively playable strategy game of everything, was first released by MicroProse Software just in time to miss the bulk of the Christmas 1991 buying season. That would have been the death knell of many a game, but not this one. Instead Civilization became the most celebrated computer game since SimCity in terms of mainstream-media coverage, even as it also became a great favorite with the hardcore gamers. Journalists writing for newspapers and glossy lifestyle magazines were intrigued by it for much the same reason they had been attracted to SimCity, because its sweeping, optimistic view of human Progress writ large down through the ages marked it in their eyes as something uniquely high-toned, inspiring, and even educational in a cultural ghetto whose abiding interest in dwarfs, elves, and magic spells left outsiders like them and their readers nonplussed. The gamers loved it, of course, simply because it could be so ridiculously fun to play. Never a chart-topping hit, Civilization became a much rarer and more precious treasure: a perennial strong seller over months and then years, until long after it had begun to look downright crude in comparison to all of the slick multimedia extravaganzas surrounding it on store shelves. It eventually sold 850,000 copies in this low-key way.

Yet neither MicroProse nor Sid Meier himself did anything to capitalize on its success for some years. The former turned to other games inside and outside of the grand-strategy tent, while the latter turned his attention to C.P.U. Bach, a quirky passion project in computer-generated music that wasn’t even a game at all and didn’t even run on conventional computers. (Its home was the 3DO multimedia console.) The closest thing to a Civilization sequel or expansion in the three years after the original game’s release was Colonization, a MicroProse game from designer Brian Reynolds that borrowed some of Civilization‘s systems and applied them to the more historically grounded scenario of the European colonization of the New World. The Colonization box sported a blurb declaring that “the tradition of Civilization continues,” while Sid Meier’s name became a possessive prefix before the new game’s title. (Reynolds’s own name, by contrast, was nowhere to be found on the box.) Both of these were signs that MicroProse’s restless marketing department felt that the legacy of Civilization ought to be worth something, even if it wasn’t yet sure how best to make use of it.

Colonization hit the scene in 1994, one year after SimCity 2000 had been accorded such a positive reception, and proceeded to sell an impressive 300,000 copies. These two success stories together altered MicroProse’s perception of Civilization forever, transforming what had started as just an opportunistic bit of marketing on Colonization‘s box into an earnest attempt to build a franchise. Not one but two new Civilization games were quickly authorized. The one called CivNet was rather a stopgap project, which transplanted the original game from MS-DOS to Windows and added networked or hot-seat multiplayer capabilities to the equation. The other Civilization project was also to run under Windows, but was to be a far more extensive revamping of the original, making it bigger, prettier, and better balanced than before. Its working title of Civilization 2000 made clear its inspiration. Only at the last minute would MicroProse think better of making SimCity 2000‘s influence quite so explicit, and rename it simply Civilization II.

Unfortunately for MicroProse’s peace of mind, Sid Meier, a designer who always followed his own muse, said that he had no interest whatsoever in repeating himself at this point. Thus the project devolved to Brian Reynolds as the logical second choice: he had acquitted himself pretty well with Colonization, and Meier liked him a lot and would at least be willing to serve as his advisor, as he had for Reynolds’s first strategy game. “They pitched it to me as if [they thought] I was probably going to be really upset,” laughs Reynolds. “I guess they thought I had my heart set on inventing another weird idea like Colonization. ‘Okay, will he be too mad if we tell him that we want him to do Civilization 2000?’ Which of course to me was the ultimate dream job. You couldn’t have asked me to do something I wanted to do more than make a version of Civilization.”

Like his mentor Meier, Reynolds was an accomplished programmer as well as game designer. This allowed him to do the initial work of hammering out a prototype on his own — from, of all locations, Yorkshire, England, where he had moved to be with his wife, an academic who was there on a one-year Fulbright scholarship. While she went off to teach and be taught every day, he sat in their little flat putting together the game that would transform Civilization from a one-off success into the archetypal strategy franchise.

Brian Reynolds

As Reynolds would be the first to admit, Civilization II is more of a nuts-and-bolts iteration on what came before than any wild flight of fresh creativity. He approached his task as a sacred trust. Reynolds:

My core vision for Civ II was not to be the guy that broke Civilization. How can I make each thing a little bit better without breaking any of it? I wanted to make the AI better. I wanted to make it harder. I wanted to add detail. I wanted to pee in all the corners. I didn’t have the idea that we were going to change one thing and everything else would stay the same. I wanted to make everything a little bit better. So, I both totally respected [Civilization I] as an amazing game, and thought, I can totally do a better job at every part of this game. It was a strange combination of humility and arrogance.

Reynolds knew all too well that Civilization I could get pretty wonky pretty quickly when you drilled down into the details. He made it his mission to fix as many of these incongruities as possible — both the ones that could be actively exploited by clever players and the ones that were just kind of weird to think about.

At the top of his list was the game’s combat system, the source of much hilarity over the years, what with the way it made it possible — not exactly likely, mind you, but possible — for a militia of ancient spearmen to attack and wipe out a modern tank platoon. This was a result of the game’s simplistic “one hit and done” approach to combat. Let’s consider our case of a militia attacking tanks. A militia has an attack strength of one, a tank platoon a defense strength of five. The outcome of the confrontation is determined by adding these numbers together, then taking each unit’s strength as a share of that total to arrive at its chance of destroying the other unit rather than being destroyed itself. In this case, then, our doughty militiamen have a one-in-six chance of annihilating the tanks rather than vice versa — not great odds, to be sure, but undoubtedly better than those they would enjoy in any real showdown.
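For the programmers in the audience, that arithmetic is easy to make concrete. Here is a minimal Python sketch of the rule as just described — my own illustration, it goes without saying, not MicroProse’s actual code:

    import random

    def civ1_combat(attack: int, defense: int) -> bool:
        # One-round, winner-take-all combat: the attacker destroys the
        # defender with probability attack / (attack + defense).
        return random.random() < attack / (attack + defense)

    # A militia (attack 1) attacks a tank platoon (defense 5).
    trials = 100_000
    wins = sum(civ1_combat(1, 5) for _ in range(trials))
    print(f"Militia win rate: {wins / trials:.3f}")  # roughly 0.167, or one in six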

It was economic factors that made this state of affairs truly unbalancing. A very viable strategy for winning Civilization every single time was the “barbarian hordes” approach: forgo virtually all technological and social development, flood the map with small, primitive cities, then use those cities to pump out huge numbers of primitive units. A computer opponent diligently climbing the tech tree and developing its society over a broader front would in time be able to create vastly superior units like tanks, but would never come close to matching your armies in quantity. So, you could play the law of averages: you might have to attack a given tank platoon five times or more with different militias, but you knew that you would eventually destroy it, as you would the rest of your opponent’s fancy high-tech military with your staggering numbers of bottom feeders. The barbarian-horde strategy made for an unfun way to play once the joy of that initial eureka moment of discovering it faded, yet many players found the allure of near-certain victory on even the highest difficulty levels hard to resist. Part of a game designer’s job is to save players like this from themselves.

This was in fact the one area of Civilization II that Sid Meier himself dived into with some enthusiasm. He’d been playing a lot of Master of Magic, yet another MicroProse game that betrayed an undeniable Civilization influence, although unlike Colonization it was never marketed on the basis of those similarities. When two units met on the world map in Master of Magic, a separate tactical-battle screen opened up for you to manage the fight. Meier went so far as to prototype such a system for Civilization II, but gave up on it in the end as a poor fit with the game’s core identity. “Being king is the heart of Civilization,” he says. “Slumming as a lowly general puts the player in an entirely different story (not to mention violates the Covert Action rule). Win-or-lose battles are not the only interesting choice on the path to good game design, but they’re the only choice that leads to Civ.”

With his mentor having thus come up empty, Brian Reynolds addressed the problem via a more circumspect complication of the first game’s battle mechanics. He added a third and fourth statistic to each unit: firepower and hit points. Now, instead of being one-and-done, each successful “hit” would merely subtract the one unit’s firepower from the other’s total hit points, and then the battle would continue until one or the other reached zero hit points. The surviving unit would quite possibly exit the battle “wounded” and would need some time to recuperate, adding another dimension to military strategy. It was still just barely possible that a wildly inferior unit could defeat its better — especially if the latter came into a battle already at less than its maximum hit points — but such occurrences became the vanishingly rare miracles they ought to be. Consider: Civilization II‘s equivalent of a militia — renamed now to “warriors” — has ones across the board for all four statistics; a tank platoon, by contrast, has an attack strength of ten, a defense strength of five, a firepower of one, and three hit points when undamaged. This means that a group of ancient warriors needs to roll the same lucky number three times in a row on a simulated six-sided die in order to attack an undamaged tank platoon and win. A one-in-six chance has become one chance in 216 — odds that we can just about imagine applying in the real world, where freak happenstances really do occur from time to time.
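Sketched in the same fashion as before — the statistics are the ones just given, while the function itself is only my illustration of the described rules, not the game’s code:

    import random

    def civ2_combat(att, dfn, att_fp, att_hp, dfn_fp, dfn_hp):
        # Round-based combat: each round the attacker wins with probability
        # att / (att + dfn); the loser of the round has the winner's firepower
        # subtracted from its hit points, until one side reaches zero.
        while att_hp > 0 and dfn_hp > 0:
            if random.random() < att / (att + dfn):
                dfn_hp -= att_fp  # attacker wins this round
            else:
                att_hp -= dfn_fp  # defender wins this round
        return dfn_hp <= 0  # True if the attacker prevailed

    # Warriors (1/1/1/1) attack an undamaged tank platoon (10/5/1/3).
    trials = 1_000_000
    wins = sum(civ2_combat(1, 5, 1, 1, 1, 3) for _ in range(trials))
    print(f"Warrior win rate: {wins / trials:.5f}")  # roughly 0.00463, or one in 216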

This change was of a piece with those Reynolds introduced at every level of the game — pragmatic and judicious, evolutionary rather than revolutionary in spirit. I won’t enumerate them exhaustively here, but will just note that they were all very defensible if not always essential in this author’s opinion.

Civilization II was written for Windows 3, and uses that operating system’s standard interface.

The layers of the program that were not immediately visible to the player got an equally judicious sprucing up — especially diplomacy and artificial intelligence, areas where the original had been particularly lacking. The computer players became less erratic in their interactions with you and with one another; no longer would Mahatma Gandhi go to bed one night a peacenik and wake up a nuke-spewing madman. Combined with other systemic changes, such as a rule making it impossible for players to park their military units inside the city boundaries of their alleged allies, these improvements made it much less frustrating to pursue a peaceful, diplomatic path to victory — made it less likely, that is to say, that the other players would annoy you into opening a can of Gandhi-style whoop-ass on them just to get them out of your hair.

In addition to the complications that were introduced to address specific weaknesses of the first game, Civilization II got a whole lot more stuff for the sake of it: more nationalities to play and play against (21 instead of 14); more advances to research (89 instead of 71); more types of units to move around the map (51 instead of 28); a bewildering variety of new geological, biological, and ecological parameters to manipulate to ensure that the game built for you just the sort of random world that you desired to play in; even a new, ultra-hard “Deity” difficulty level to address Reynolds’s complaint that Meier’s Civilization was just too easy. There was also a new style of government added to the original five: “Fundamentalism” continued the tradition of mixing political, economic, and now religious ideologies indiscriminately, with all of them seen through a late-twentieth-century American triumphalist lens that might have been offensive if it wasn’t so endearingly naïve in its conviction that the great debates down through history about how human society can be most justly organized had all been definitively resolved in favor of American-style democracy and capitalism. And then the game got seven new Wonders of the World to add to the existing 21. Like their returning stablemates, they were a peculiar mix of the abstract and the concrete, from Adam Smith’s Trading Company (there’s that triumphalism again!) in the realm of the former to the Eiffel Tower in that of the latter.

Reynolds’s most generous move of all was to crack open the black box of the game for its players, turning it into a toolkit that let them try their own hands at strategy-game design. Most of the text and vital statistics were stored in plain-text files that anyone could open up in an editor and tinker with. Names could be changed, graphics and sounds could be replaced, and almost every number in the game could be altered at will. MicroProse encouraged players to incorporate their most ambitious “mods” into set-piece scenarios, which replaced the usual randomized map and millennia-spanning timeline with a more focused premise. Scenarios dealing with Rome during the time of transition from Republic to Empire and World War II in Europe were included with the game to get the juices flowing. In shrinking the timeline so dramatically and focusing on smaller goals, scenarios did tend to bleed away some of Civilization‘s high-concept magic and turn it into more of a typical strategic war game, but that didn’t stop the hardcore fans from embracing them. They delivered scenarios of their own about everything from Egyptian, Greek, and Norse mythology to the recent Gulf War against Iraq, from a version of Conway’s Game of Life to a cut-throat competition among Santa’s elves to become the dominant toy makers.
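To give a flavor of the kind of editing those plain-text files invited — and I stress that what follows is an invented illustration, not the actual layout of Civilization II‘s data files — a unit’s vital statistics might be laid out as lines of comma-separated values that anyone could change in Notepad:

    ; hypothetical unit entries: name, attack, defense, hit points, firepower, movement
    Warriors,  1, 1, 1, 1, 1
    Armor,    10, 5, 3, 1, 3

Bump those warriors’ attack up to 99, save the file, and your next game would be a very different affair.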

The ultimate expression of Brian Reynolds’s toolkit approach can be seen right there on the menu every time you start a new game of Civilization II, under the heading of simply “Cheat.” You can use it to change anything you want any time you want, at the expense of not having your high score recorded, should you earn one. At a click of the mouse, you can banish an opposing player from the game, research any advance instantly, give yourself infinite money… you name it. More importantly in the long run, the Cheat menu lets you peek behind the curtain to find out exactly what is going on at any given moment, almost like a programmer sitting in front of a debugging console. Sid Meier was shocked the first time he saw it.

Cheating was an inherent part of the game now, right on the main screen? This was not good. Like all storytelling, gaming is about the journey, and if you’re actively finding ways to jump to the end, then we haven’t made the fantasy compelling enough. A gripping novel would never start with an insert labeled, “Here’s the Last Page, in Case You Want to Read It Now.” Players who feel so inclined will instinctively find their own ways to cheat, and we shouldn’t have to help them out. I could not be convinced this was a good idea.

But Reynolds stuck to his guns, and finally Meier let him have it his way. It was, he now acknowledges, the right decision. The Cheat menu let players rummage around under the hood of the game as it was running, until some of them came to understand it practically as well as Reynolds himself. This was a whole new grade of catnip for the types of mind that tend to be attracted by big, complex strategy games like this one. Meanwhile the loss of a high score to boast about was enough to ensure that gamers weren’t unduly tempted to use the Cheat menu when playing for keeps, as it were.

Of course, the finished Civilization II is not solely a creation of Brian Reynolds. After he returned from Britain with his prototype in hand, two other MicroProse designers named Doug Kaufman and Jeff Briggs joined him for the hard work of polishing, refining, and balancing. Ditto a team of artists and even a film crew.

Yes, a film crew: the aspect of Civilization II that most indelibly dates it to the mid-1990s — even more so than its Windows 3 interface — must surely be your “High Council,” who pop up from time to time to offer their wildly divergent input on the subject of what you should be doing next. They’re played by real actors, hamming it up gleefully in video clips, changing from togas to armor to military uniforms to business suits as the centuries go by. Most bizarre of all is the entertainment advisor, played by… an Elvis Presley impersonator. What can one say? This sort of thing was widely expected to be the future of gaming, and MicroProse didn’t want to be left completely in the cold when the much-mooted merger of Silicon Valley and Hollywood finally became a reality.


Civilization II was released in the spring of 1996 to glowing reviews. Computer Gaming World gave it five stars out of five, calling it “a spectacularly addictive and time-consuming sequel.” Everything I’ve said in this article and earlier ones about the appeal, success, and staying power of Civilization I applies threefold to Civilization II. It sold 3 million copies over the five years after its release, staying on store shelves right up to the time that the inevitable Civilization III arrived to replace it. Having now thoroughly internalized the lesson that strategy games could become franchises too, MicroProse sustained interest in the interim with two scenario packs, a “Multiplayer Gold Edition” that did for Civilization II what CivNet had done for Civilization I, and another reworking called Civilization II: Test of Time that extended the timeline of the game into the distant future. Civilization as a whole thus became one of gaming’s most inescapable franchises, the one name in the field of grand strategy that even most non-gamers know.

Given all of this, and given the obvious amount of care and even love that was lavished on Civilization II, I feel a bit guilty to admit that I struggled to get into it when I played it in preparation for this article. Some of my lack of enthusiasm may be down to purely proximate causes. I played a lot of Civilization I in preparation for the long series of articles I wrote about it and the Progress-focused, deeply American worldview it embodies, and the sequel is just more of the same from this perspective. If I’d come to Civilization II cold, as did the majority of those 3 million people who bought it, I might well have had a very different experience with it.

Still, I do think there’s a bit more to my sense of vague dissatisfaction than just a jaded player’s ennui. I miss one or two bold leaps in Civilization II to go along with all of the incrementalist tinkering. Its designers made no real effort to address the big issues that dog games of this ilk: the predictable tech tree that lends itself to rote strategies, the ever more crushing burden of micromanagement as your empire expands, and an anticlimactic endgame that can go on for hours after you already know you’re going to win. How funny to think that Master of Orion, another game published by MicroProse, had already done a very credible job of addressing all of these problems three years before Civilization II came to be!

Then, too, Civilization II may be less wonky than its predecessor, but I find that I actually miss the older game’s cock-eyed jeu d’esprit, of which those ancient militias beating up on tanks was part and parcel. Civilization II‘s presentation, using the stock Windows 3 menus and widgets, is crisper and cleaner, but only adds to the slight sense of sterility that dogs the whole production. Playing it can feel rather like working a spreadsheet at times — always a danger in these kinds of big, data-driven strategy games. Those cheesy High Council videos serve as a welcome relief from the austerity of it all; if you ask me, the game could have used some more of that sort of thing.

I do appreciate the effort that went into all the new nationalities, advances, units, and starting parameters. In the end, though, Civilization II only provides further proof for me — as if I needed it — that shoehorning more stuff into a game doesn’t always or even usually make it better, just slower and more ponderous. In this sense too, I prefer its faster playing, more lovably gonzo predecessor. It strikes me that Civilization II is more of a gamer’s game, emphasizing min-maxing and efficient play above all else, at the expense of the original’s desire to become a flight of the imagination, letting you literally write your own history of a world. Sid Meier liked to call his game first and foremost “an epic story.” I haven’t heard any similar choice of words from Brian Reynolds, and I’ve definitely never felt when playing Civilization I that it needed to be harder, as he did.

I hasten to emphasize, however, that mine is very much a minority opinion. Civilization II was taken up as a veritable way of life by huge numbers of strategy gamers, some of whom have refused to abandon it to this day, delivering verdicts on the later installments in the series every bit as mixed as my opinions about this one. Good for them, I say; there are no rights or wrongs in matters like these, only preferences.


Postscript: The Eternal War

In 2012, a fan with the online handle of Lycerius struck a chord with media outlets all over the world when he went public with a single game of Civilization II which he had been playing on and off for ten years of real time. His description of it is… well, chilling may not be too strong a word.

The world is a hellish nightmare of suffering and devastation. There are three remaining super nations in AD 3991, each competing for the scant resources left on the planet after dozens of nuclear wars have rendered vast swaths of the world uninhabitable wastelands.

The ice caps have melted over 20 times, due primarily to the many nuclear wars. As a result, every inch of land in the world that isn’t a mountain is inundated swampland, useless to farming. Most of which is irradiated anyway.

As a result, big cities are a thing of the distant past. Roughly 90 percent of the world’s population has died either from nuclear annihilation or famine caused by the global warming that has left absolutely zero arable land to farm. Engineers are busy continuously building roads so that new armies can reach the front lines. Roads that are destroyed the very next turn. So, there isn’t any time to clear swamps or clean up the nuclear fallout.

Only three massive nations are left: the Celts (me), the Vikings, and the Americans. Between the three of us, we have conquered all the other nations that have ever existed and assimilated them into our respective empires.

You’ve heard of the 100 Year War? Try the 1700 Year War. The three remaining nations have been locked in an eternal death struggle for almost 2000 years. Peace seems to be impossible. Every time a ceasefire is signed, the Vikings will surprise-attack me or the Americans the very next turn, often with nuclear weapons. So, I can only assume that peace will come only when they’re wiped out. It is this that perpetuates the war ad infinitum.

Because of SDI, ICBMs are usually only used against armies outside of cities. Instead, cities are constantly attacked by spies who plant nuclear devices which then detonate. Usually the downside to this is that every nation in the world declares war on you. But this is already the case, so it’s no longer a deterrent to anyone, myself included.

The only governments left are two theocracies and myself, a communist state. I wanted to stay a democracy, but the Senate would always overrule me when I wanted to declare war before the Vikings did. This would delay my attack and render my turn and often my plans useless. And of course the Vikings would then break the ceasefire like clockwork the very next turn. I was forced to do away with democracy roughly a thousand years ago because it was endangering my empire. But of course the people hate me now, and every few years since then, there are massive guerrilla uprisings in the heart of my empire that I have to deal with, which saps resources from the war effort.

The military stalemate is airtight, perfectly balanced because all remaining nations already have all the technologies, so there is no advantage. And there are so many units at once on the map that you could lose twenty tank units and not have your lines dented because you have a constant stream moving to the front. This also means that cities are not only tiny towns full of starving people, but that you can never improve the city. “So you want a granary so you can eat? Sorry! I have to build another tank instead. Maybe next time.”

My goal for the next few years is to try to end the war and use the engineers to clear swamps and fallout so that farming may resume. I want to rebuild the world. But I’m not sure how.

One can’t help but think about George Orwell’s Oceania, Eurasia, and Eastasia when reading of Lycerius’s three perpetually warring empires. Like Nineteen Eighty-Four, his after-action report has the uncanny feel of a dispatch from one of our own world’s disturbingly possible futures. Many people today would surely say that recent events have made his dystopia seem even more probable than ten years ago.

But never fear: legions of fans downloaded the saved game of the “Eternal War” which Lycerius posted and started looking for a way to end the post-apocalyptic paralysis. A practical soul who called himself “stumpster” soon figured out how to do so: “I opted for a page out of MacArthur’s book and performed my own Incheon landing.” In the game of Civilization, there is always a way. Let us hope the same holds true in reality.

(Sources: the book Sid Meier’s Memoir! by Sid Meier; Computer Gaming World of April/May 1985, November 1987, March 1993, June 1996, July 1996, and August 1996; Retro Gamer 86, 112, and 219. Online sources include Soren Johnson’s interviews with Sid Meier and Brian Reynolds, PC Gamer‘s “Complete History of Civilization,” and Huffington Post‘s coverage of Lycerius’s game of Civilization and stumpster’s resolution of the stalemate. The original text of Lycerius’s Reddit message is posted on the Civilization II wiki.

Civilization II is not currently available for online purchase. You can, however, find it readily enough on any number of abandonware archives; some are dodgier than others, so be cautious. I recommend that you avoid the Multiplayer Gold Edition in favor of the original unless you really, really want to play with your mates. For, in a rather shocking oversight, MicroProse released the Gold Edition with bugged artificial intelligence that makes all of the computer-controlled players ridiculously aggressive and will keep you more or less constantly at war with everyone. If perpetual war is your thing, on the other hand, go for it…

Update: See Blake’s comment below for information on how to get the Multiplayer Gold Edition running with the original artificial intelligence, thereby getting the best of both worlds!

Once you’ve managed to acquire it, there’s a surprisingly easy way to run Civilization II on modern versions of Windows. You just need to install a little tool called WineVDM, and then the game should install and run transparently, right from the Windows desktop. It’s probably possible to get it running on Linux and MacOS using the standard Wine layer, but I haven’t tested this personally.)

In a feat of robust programming of which its makers deserve to be proud, Civilization II is capable of scaling to seemingly any size of screen. Here it is running on my Windows 10 desktop at a resolution of 3440 × 1440 — numbers that might as well have been a billion by a million back in 1996.


Normality

Sometimes these articles come from the strangest places. When I was writing a little while back about The Pandora Directive, the second of the Tex Murphy interactive movies, I lavished praise on its use of a free-roaming first-person 3D perspective, claiming in the process that “first-person 3D otherwise existed only in the form of action-oriented shooters and static, node-based, pre-rendered Myst clones.” Such a blanket statement is just begging to be contradicted, and you folks didn’t disappoint. Our tireless fact-checker Aula rightly noted that I’d forgotten a whole family of action-CRPGs which followed in the wake of Ultima Underworld (another game I’d earlier lavished with praise, as it happened). And, more pertinently for our subject of today, Sarah Walker informed me that “Gremlin’s Normality was an early 1996 point-and-clicker using a DOOM-style engine.”

I must confess that I’d never even heard of Normality at that point, but Sarah’s description of it made me very interested in checking it out. What I found upon doing so was an amiable little game that feels somehow less earthshaking than its innovative technical approach might lead one to expect, but that I nevertheless enjoyed very much. So, I decided to write about it today, as both an example of a road largely not taken in traditional adventure games and as one of those hidden gems that can still surprise even me, a man who dares to don the mantle of an expert in the niche field of interactive narratives of the past.



The story of Normality‘s creation is only a tiny part of the larger story of Gremlin Interactive, the British company responsible for it, which was founded under the name of Gremlin Graphics in 1984 by Ian Stewart and Kevin Norburn, the proprietors of a Sheffield software shop. Impressed by the coding talents of the teenagers who flocked around their store’s demo machines every afternoon and weekend, one-upping one another with ever more audacious feats of programming derring-do, Stewart and Norburn conceived Gremlin as a vehicle for bringing these lads’ inventions to the world. The company’s name became iconic among European owners of Sinclair Spectrums and Commodore 64s, thanks to colorfully cartoony and deviously clever platformers and other types of action games: the Monty Mole series, Thing on a Spring, Bounder, Switchblade, just to name a few. When the 1980s came to an end and the 8-bit machines gave way to the Commodore Amiga, MS-DOS, and the new 16-bit consoles, Gremlin navigated the transition reasonably well, keeping their old aesthetic alive through games like Zool whilst also branching out in new directions, such as a groundbreaking line of 3D sports simulations that began with Actua Soccer. Through it all, Gremlin was an institution unto itself in British game development, a rite of passage for countless artists, designers, and programmers, some of whom went on to found companies of their own. (The most famous of Gremlin’s spinoffs is Core Design, which struck international gold in 1996 with Tomb Raider.)

The more specific story of Normality begins with a fellow named Tony Crowther. While still a teenager in the 1980s, he was one of the elite upper echelon of early British game programmers, who were feted in the gaming magazines like rock stars. A Sheffield lad himself, Crowther was famous even before Gremlin was founded, and his path converged with the company’s on a number of occasions afterward. Unlike many of his rock-star peers, he was able to sustain his career if not his personal name recognition into the 1990s, when lone-wolf programmers were replaced by teams and project budgets and timelines increased exponentially. He remembers his first sight of id Software’s DOOM as a watershed moment in his professional life: “This was the first game I had seen with 3D graphics, and with what appeared to be a free-roaming camera in the world.” It was, in short, the game that would change everything. Crowther immediately started working on a DOOM-style 3D engine of his own.

He brought the engine, which he called True3D, with him to Gremlin Interactive when he accepted the title of Technical Consultant there in early 1994. “I proposed two game scenarios” for using it, he says. “Gremlin went with the devil theme; the other was a generic monster game.”

The “devil theme” would become Realms of the Haunting, a crazily ambitious and expensive project that would take well over two years to bring to fruition, that would wind up filling four CDs with DOOM-style carnage, adventure-style dialogs and puzzle solving, a complicated storyline involving a globe-spanning occult conspiracy of evil (yes, yet another one), and 90 minutes of video footage of human actors (this was the mid-1990s, after all). We’ll have a closer look at this shaggy beast in a later article.

Today’s more modest subject of inquiry was born in the head of one Adrian Carless, a long-serving designer, artist, writer, and general jack-of-all-trades at Gremlin. He simply “thought it would be cool to make an adventure game in a DOOM-style engine. Realms of the Haunting was already underway, so why not make two games with the same engine?” And so Normality, Realms of the Haunting‘s irreverent little brother, was born. A small team of about half a dozen made it their labor of love for some eighteen months, shepherding it to a European release in the spring of 1996. It saw a North American release, under the auspices of the publisher Interplay, several months later.



To the extent that it’s remembered at all, Normality is known first and foremost today for its free-roaming first-person 3D engine — an approach that had long since become ubiquitous in the realm of action games, where “DOOM clones” were a dime a dozen by 1996, but was known to adventure gamers only thanks to Access Software’s Tex Murphy games. Given this, it might be wise for us to review the general state of adventure-game visuals circa 1996.

By this point, graphical adventures had bifurcated into two distinct groups whose Venn diagram of fans overlapped somewhat, but perhaps not as much as one might expect. The older approach was the third-person point-and-click game, which had evolved out of the 1980s efforts of Sierra and LucasArts. Each location in one of these games was built from a background of hand-drawn pixel art, with the player character, non-player characters, and other interactive objects superimposed upon it as sprites. Because drawing each bespoke location was so intensive in terms of human labor, there tended to be relatively few of them to visit in any given game. But by way of compensation, these games usually offered fairly rich storylines and a fair degree of dynamism in terms of their worlds and the characters that inhabited them. Puzzles tended to be of the object-oriented sort — i.e., a matter of using this thing from your inventory on this other thing.

The alternative approach was pioneered and eternally defined by Myst, a game from the tiny studio Cyan Productions that first appeared on the Macintosh in late 1993 and went on to sell over 6 million copies across a range of platforms. Like DOOM and its ilk, Myst and its many imitators presented a virtual world to their players from a first-person perspective, and relied on 3D graphics rendered by a computer using mathematical algorithms rather than hand-drawn pixel art. In all other ways, however, they were DOOM‘s polar opposite. Rather than corridors teeming with monsters to shoot, they offered up deserted, often deliberately surreal — some would say “sterile” — worlds for their players to explore. And rather than letting players roam freely through said worlds, they presented them as a set of discrete nodes that they could hop between.

Why did they choose this slightly awkward approach? As happens so often in game development, the answer has everything to do with technological tradeoffs. Both DOOM and Myst were 3D-rendered; their differences came down to where and when that rendering took place. DOOM created its visuals on the fly, which meant that the player could go anywhere in the world but which limited the environment’s visual fidelity to what an ordinary consumer-grade computer of the time could render at a decent frame rate. Myst, on the other hand, was built from pre-rendered scenes: scenes that had been rendered beforehand on a high-end computer, then saved to disk as ordinary graphics files — effectively converted into pixel art. This workflow let studios turn out far more images far more quickly than even an army of human pixel-artists could have managed, but forced them to construct their worlds as a network of arbitrarily fixed nodes and views which many players — myself among them — can find confusing to navigate. Further, these views were not easy to alter in any sort of way after they had been rendered, which sharply limited the dynamism of Myst clones in comparison to traditional third-person adventure games. Thus the deserted quality that became for good or ill one of their trademarks, and their tendency to rely on set-piece puzzles such as slider and button combinations rather than more flexible styles of gameplay. (Myst itself didn’t have a player inventory of any sort — a far cry from the veritable pawn shop’s worth of seemingly random junk one could expect to be toting around by the middle stages of the typical Sierra or LucasArts game.)
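The difference between the two navigation models is easy to caricature in a few lines of Python. This is a toy illustration with invented place names, nothing like how either engine was actually structured internally:

    # Myst-style: the world is a fixed graph of pre-rendered views.
    nodes = {
        "dock":    {"forward": "stairs"},
        "stairs":  {"forward": "library", "back": "dock"},
        "library": {"back": "stairs"},
    }
    position = "dock"
    position = nodes[position]["forward"]  # hop to the next fixed viewpoint

    # DOOM-style: the world is continuous coordinates, rendered on the fly.
    x, y, heading = 12.0, 48.0, 90.0
    x += 0.5  # move anywhere at all; there are no fixed viewpoints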

By no means did Normality lift the set of technical constraints I’ve just described. Yet it did serve as a test bed for a different set of tradeoffs from the ones that adventure developers had been accepting before this point. It asked the question of whether you could make an otherwise completely conventional adventure game — unlike its big brother Realms of the Haunting, Normality has no action elements whatsoever — using a DOOM-style engine, accepting that the end result would not be as beautiful as Myst but hoping that the world would feel a lot more natural to move around in. And the answer turned out to be — in this critic’s opinion, at any rate — a pretty emphatic yes.

Tony Crowther may have chosen to call his engine True3D, but it is in reality no such thing. Like the DOOM engine which inspired it, it uses an array of tricks and shortcuts to minimize rendering times whilst creating a reasonably convincing subjective experience of inhabiting a 3D space. That said, it does boast some improvements over DOOM: most notably, it lets you look up and down, an essential capability for an old-school adventure game in which the player is expected to scour every inch of her environment for useful thingamabobs. It thus proved in the context of adventure games a thesis that DOOM had already proved for action games: that gains in interactivity can often more than offset losses in visual fidelity. Just being able to, say, look down from a trapdoor above a piece of furniture and see a crucial detail that had been hidden from floor level was something of a revelation for adventure gamers.

You move freely around Normality‘s world using the arrow keys, just as you do in DOOM. (The “WASD” key combination, much less mouse-look, hadn’t yet become commonplace in 1996.) You interact with the things you see on the screen by clicking on them with the mouse. It feels perfectly natural in no time — more natural, I must say, than any Myst clone has ever felt for me. And you won’t feel bored or lonely in Normality, as so many tend to do in that other style of game; its environment changes constantly and it has plenty of characters to talk to. In this respect as in many others, it’s more Sierra and LucasArts than Myst.

The main character of Normality is a fellow named Kent Knutson, who, some people who worked at Gremlin have strongly implied, was rather a chip off the old block of Adrian Carless himself. He’s an unrepentant slacker who just wants to rock out to his tunes, chow down on pizza, and, one has to suspect based on the rest of his persona, toke up until he’s baked to the perfection of a Toll House cookie. Unfortunately, he’s living in a dictatorial dystopia of the near future, in which conformity to the lowest common denominator — the titular Normality — has been elevated to the highest social value, to be ruthlessly enforced by any and all means necessary. When we first meet Kent, he’s just been released from a stint in jail, his punishment for walking down the street humming a non-sanctioned song. Now he’s to spend some more time in house arrest inside his grotty apartment, with a robot guard just outside the door making sure he keeps his television on 24 hours per day, thereby to properly absorb the propaganda of the Dear Leader, a thoroughly unpleasant fellow named Paul Mystalux. With your help, Kent will find a way to bust out of his confinement. Then he’ll meet the most ineffectual group of resistance fighters in history, prove himself worthy to join their dubious ranks, and finally find a way to bring back to his aptly named city of Neutropolis the freedom to let your freak flag fly.

Adrian Carless. It seems that the apple named Kent didn’t fall far from the tree named Adrian…

There’s a core of something serious here, as I know all too well; I’ve been researching and writing of late about Chairman Mao Zedong’s Cultural Revolution in China, whose own excesses in the name of groupthink were every bit as absurd in their way as the ones that take place in Neutropolis. In practice, though, the game is content to play its premise for laughs. As the creators of Normality put it, “It’s possible to draw parallels between Paul [Mystalux] and many of the truly evil dictators in history — Hitler, Mussolini, Stalin — but we won’t do that now because this is supposed to be light-hearted and fun.” It’s far from the worst way in the world to neutralize tyranny; few things are as deflating to the dictators and would-be dictators among us as being laughed at for the pathetic personal insecurities that make them want to commit such terrible crimes against humanity.

This game is the very definition of laddish humor, as unsubtle as a jab in the noggin, as rarefied as a molehill, as erudite as that sports fan who always seems to be sitting next to you at the bar of a Saturday night. And yet it never fails to be likeable. It always has its heart in the right place, always punches up rather than down. What can I say? I’m a simple man, and this game makes me laugh. My favorite line comes when, true adventure gamer that you are, you try to get Kent to sift through a public trashcan for valuable items: “I have enough trash in my apartment already!”

Normality‘s visual aesthetic is in keeping with its humor aesthetic (not to mention Kent’s taste in music): loud, a little crude, even a trifle obnoxious, but hard to hate for all that. The animations were created by motion-capturing real people, but budget and time constraints meant that it didn’t quite work out. “Feet would float and swim, hands wouldn’t meet, and overall things could look rather strange,” admits artist Ricki Martin. “For sure the end results would have been better if it had been hand-animated.” I must respectfully disagree. To my mind, the shambolic animation only adds to the delightfully low-rent feel of the whole — like an old 1980s Dinosaur Jr. record where the tape hiss and distortion are an essential part of the final impression. (In fact, the whole vibe of the game strikes me as more in line with 1980s underground music than the 1990s grunge that was promised in some of its advertising, much less the Britpop that was sweeping its home country at the time.)

But for all its tossed-off-seeming qualities, Normality has its head screwed on tight where it’s important: it proves to be a meticulously designed adventure game, something neither its overall vibe nor its creators’ lack of experience with the genre would lead one to expect. Thankfully, they learned from the best; all of the principals recall the heavy influence that LucasArts had on them — so much so that they even tried to duplicate the onscreen font found in classics like The Secret of Monkey Island, Day of the Tentacle, and Sam and Max Hit the Road. The puzzles are often bizarre — they do take place in a bizarre setting, after all — but they always have an identifiable cartoon logic to them, and there are absolutely no dead ends to ruin your day. As a piece of design, Normality thus acquits itself much better than many another game from more established adventure developers. You can solve this one on your own, folks; its worst design sin is an inordinate number of red herrings, which I’m not sure really constitutes a sin at all. It’s wonderful to discover an adventure game that defies the skepticism with which I always approach obscure titles in the genre from unseasoned studios.


The game begins in Kent’s hovel of a flat.

The game’s verb menu is capable of frightening small children — or my wife, who declared it the single ugliest thing I’ve ever subjected her to when I play these weird old games in our living room.

Sometimes Normality‘s humor is sly. These rooms with painted-on furniture are a riff on the tendency of some early 3D engines to appear, shall we say, less than full-bodied.

Other times the humor is just dumb — but it still makes me laugh.

The game ends in a noisy concert that’s absolutely off the hook, which is absolutely perfect.



Normality was released with considerable fanfare in Europe, including a fifteen-page promotional spread in the popular British magazine PC Zone, engineered to look like a creation of the magazine’s editorial staff rather than an advertisement. (Journalistic ethics? Schmethics!) Here and elsewhere, Gremlin plugged the game as a well-nigh revolutionary adventure, thanks to its 3D engine. But the public was less than impressed; the game never caught fire.

In the United States, Interplay tried to inject a bit of star power into the equation by hiring the former teen idol Corey Feldman to re-record all of Kent’s lines; mileage will vary here, but personally I prefer original actor Tom Hill’s more laconic approach to Feldman’s trademark amped-up surfer-dude diction. Regardless, the change in casting did nothing to help Normality‘s fortunes in the United States, where it sank without a trace — as is amply testified by the fact that this lifelong adventure fan never even knew it existed until recently. Few of the magazines bothered to review it at all, and those that did took strangely scant notice of its formal and technical innovations. Scorpia, Computer Gaming World‘s influential adventure columnist, utterly buried the lede, mentioning the 3D interface only in nonchalant passing halfway into her review. Her conclusion? “Normality isn’t bad.” Another reviewer pronounced it “mildly fun and entertaining.” With faint praise like that, who needs criticism?

Those who made Normality have since mused that Gremlin and Interplay’s marketing folks might have leaned a bit too heavily on the game’s innovative presentation at the expense of its humorous premise and characters, and there’s probably something to this. Then again, its idiosyncratic vibe resisted easy encapsulation, and was perhaps of only niche appeal anyway — a mistake, if mistake it be, that LucasArts generally didn’t make. Normality was “‘out there,’ making it hard to put a genre on it,” says Graeme Ing, another artist who worked on the game — “unlike Monkey Island being ‘pirates’ and [Day of the] Tentacle being ‘time travel.'” Yet he admits that “I loved the game for the same reasons. Totally unique, not just a copy of another hit.”

I concur. Despite its innovations, Normality is not a major game in any sense of the word, but sometimes being “major” is overrated. To paraphrase Neil Young, traveling in the middle of the road all the time can become a bore. Therefore this site will always have time for gaming’s ditches — more time than ever, I suspect, as we move deeper into the latter half of the 1990s, an era when gaming’s mainstream was becoming ever more homogenized. My thanks go to Sarah Walker for turning me onto this scruffy outsider, which I’m happy to induct into my own intensely idiosyncratic Hall of Fame.

(Sources: the book A Gremlin in the Works by Mark James Hardisty, which, with its digital supplement included, gives you some 800 pages on the history of Gremlin Interactive, thus nicely remedying this site’s complete silence on that subject prior to now. It comes highly recommended! Also Computer Gaming World of November 1996, Next Generation of November 1996, PC Zone of May 1996, PC World of September 1996, Retro Gamer 11, 61, and 75.

Normality is available for digital purchase at GOG.com, in a version with the original voice acting. Two tips: remember that you can look up and down using the Page Up and Page Down keys, and know that you can access the map view to move around the city at any time by pressing “M.” Don’t do what I did: spend more than an hour searching in vain for the exit to a trash silo I thought I was trapped inside — even if that does seem a very Kent thing to do…)
