From Squadron to Wingleader

Chris Roberts and Richard Garriott, 1988

At the Summer Consumer Electronics Show in June of 1989, Origin Systems and Brøderbund Software announced that they wouldn’t be renewing the distribution contract the former had signed with the latter two years before. It was about as amicable a divorce as has ever been seen in the history of business; in this respect, it could hardly have stood in greater contrast to the dust-up that had ended Origin’s relationship with Electronic Arts, their previous distributor, in 1987. Each company was full of rosy praise and warm wishes for the other at a special “graduation party” Brøderbund threw for Origin at the show. “Brøderbund has been one of the few affiliated-label programs that truly helps a small company grow to a size where it can stand on its own and enter the real world,” said Origin’s Robert Garriott, making oblique reference to the more predatory approach of Electronic Arts. In response, Brøderbund’s Gary Carlston toasted that “it’s been rewarding to have helped Origin pursue its growth, and it’s exciting to see the company take this step,” confirming yet one more time Brøderbund’s well-earned reputation as the nice guys of their industry who somehow kept managing to finish first. And so, with a last slap on the rump and a final chorus of “Kumbaya,” Brøderbund sent Origin off to face the scary “world of full-service software publishing” alone.

It was a bold step for Origin, especially given that they still hadn’t solved a serious problem that had dogged them since their founding in the Garriott brothers’ family garage six years earlier. The first two games released by the young company back in 1983 had been Ultima III, the latest installment in Richard Garriott’s genre-defining CRPG series, and Caverns of Callisto, an action game written by Richard’s high-school buddy Chuck Bueche. Setting the frustrating pattern for what was to come, Ultima III soared up the bestseller charts, while Caverns of Callisto disappeared without a trace. In the years that followed, Origin released some non-Ultima games that were moderately successful, but never came close to managing a full-on hit outside of their signature franchise. This failure left them entirely dependent for their survival on Richard Garriott coming up with a new and groundbreaking Ultima game every couple of years, and on that game then proceeding to sell over 200,000 copies. Robert Garriott, as shrewd a businessman as any in his industry, knew that staking his company’s entire future on a single game every two years was at best a risky way to run things. Yet, try as he might, he couldn’t seem to break the pattern.

Origin had a number of factors working against them in their efforts to diversify, but the first and most ironic among them must be the very outsize success of Ultima itself. The company had become so identified with Ultima that many gamers barely realized that they did anything else. As for other folks working in the industry, they had long jokingly referred to Origin Systems as “Ultima Systems.” Everyone knew that the creator of Ultima was also the co-founder of Origin, and the brother of the man who directed its day-to-day operations. In such a situation, there must be a real question of whether any other game project, even a potentially great one, could avoid being overshadowed by the signature franchise, could find enough oxygen to thrive. Added to these concerns, which would be applicable to any company in such a situation, must be the unique nature of the cast of characters at Origin. Richard Garriott’s habit of marching around trade-show floors in full Lord British regalia, his entourage in tow, didn’t always endear him to the rest of the industry. There were, it sometimes seemed, grounds to question whether Richard himself knew that he wasn’t actually a monarch, just a talented kid from suburban Houston with nary a drop of royal blood coursing through his veins. At times, Origin Systems could feel perilously close to a cult of personality. Throw in the company’s out-of-the-way location in Austin, Texas, and attracting really top-flight projects became quite a challenge for them.

So, when it came to games that weren’t Ultima, Origin had had to content themselves with projects one notch down from the top tier — projects which, whether because they weren’t flashy enough or were just too nichey, weren’t of huge interest to the bigger publishers. Those brought in enough revenue to justify their existence but not much more, and thus Robert Garriott continued to bet the company every two years on his brother’s latest Ultima. It was a nerve-wracking way to live.

And then, in 1990, all that changed practically overnight. This article and the one that follows will tell the story of how the house that Ultima built found itself with an even bigger franchise on its hands.


Chris Roberts

By the end of the 1980s, the North American and European computer-game industries, which had heretofore existed in almost total isolation from one another, were becoming slowly but steadily more interconnected. The major American publishers were setting up distribution arms in Europe, and the smaller ones were often distributing their wares through the British importer U.S. Gold. Likewise, the British Firebird and Rainbird labels had set up offices in the United States, and American publishers like Cinemaware were doing good business importing British games for American owners of the Commodore Amiga, a platform that was a bit neglected by domestic developers. But despite these changes, the industry as a whole remained a stubbornly bifurcated place. European developers remained European, American developers remained American, and the days of a truly globalized games industry remained far in the future. The exceptions to these rules stand out all the more thanks to their rarity. And one of these notable exceptions was Chris Roberts, the young man who would change Origin Systems forever.

With a British father and an American mother, Chris Roberts had been a trans-Atlantic sort of fellow right from the start. His father, a sociologist at the University of Manchester, went with his wife to Guatemala to do research shortly after marrying, and it was there that Chris was conceived in 1967. The mother-to-be elected to give birth near her family in Silicon Valley. (From the first, it seems, computers were in the baby’s blood.) After returning for a time to Guatemala, where Chris’s father was finishing his research, the little Roberts clan settled back in Manchester, England. A second son arrived to round out the family in 1970.

His first international adventure behind him, Chris Roberts grew up as a native son of Manchester, developing the distinct Mancunian intonation he retains to this day along with his love of Manchester United football. When first exposed to computers thanks to his father’s position at Manchester University, the boy was immediately smitten. In 1982, when Chris was 14, his father signed him up for his first class in BASIC programming and bought a BBC Micro for him to practice on at home. As it happened, the teacher of that first programming class became a founding editor of the new magazine BBC Micro User. Hungry for content, the magazine bought two of young Chris’s first simple BASIC games to publish as type-in listings. Just like that, he was a published game developer.

Britain at the time was going absolutely crazy for computers and computer games, and many of the new industry’s rising stars were as young or younger than Roberts. It thus wasn’t overly difficult for him to make the leap to designing and coding boxed games to be sold in stores. Imagine Software published his first such, a platformer called Wizadore, in 1985; Superior Software published a second, a side-scrolling shooter called Stryker’s Run, in 1986. But the commercial success these titles could hope to enjoy was limited by the fact that they ran on the BBC Micro, a platform which was virtually unknown outside of Britain and even inside of its home country was much less popular than the Sinclair Spectrum as a gaming machine. Being amply possessed of the contempt most BBC Micro owners felt toward the cheap and toy-like “Speccy,” Roberts decided to shift his attention instead to the Commodore 64, the most popular gaming platform in the world at the time. This decision, combined with another major decision made by his parents, set him on his unlikely collision course with Origin Systems in far-off Austin, Texas.

In early 1986, Roberts’s father got an offer he couldn’t refuse in the form of a tenured professorship at the University of Texas. After finishing the spring semester that year, he, his wife, and his younger son thus traded the gray skies of Manchester for the sunnier climes of Austin. Chris was just finishing his A-Levels at the time. Proud Mancunian that he was, he declared that he had no intention of leaving England — and certainly not for a hick town in the middle of Texas. But he had been planning all along to take a year off before starting at the University of Manchester, and his parents convinced him to at least join the rest of the family in Austin for the summer. He agreed, figuring that it would give him a chance to work free of distractions on a new action/adventure game he had planned as his first project for the Commodore 64. Yet what he actually found in Austin was lots of distractions — eye-opening distractions to warm any young man’s heart. Roberts:

The weather was a little nicer in Austin. The American girls seemed to like the English accent, which wasn’t bad, and there was definitely a lot… everything seemed like it was cheaper and there was more of it, especially back then. Now, the world’s become more homogenized so there’s not things you can only get in America that you don’t get in England as well. Back then it was like, the big American movies would come out in America and then they would come out in England a year later and stuff. So I came over and was like, “Ah, you know, this is pretty cool.”

There were also the American computers to consider; these tended to be much more advanced than their British counterparts, sporting disk drives as universal standard equipment at a time when most British games — including both of Roberts’s previous games — were still published on cassette tapes. In light of all these attractions, it seems doubtful whether Roberts would have kept his resolution to return to Manchester in any circumstances. But there soon came along the craziest of coincidences to seal the deal.

Roberts had decided that he really needed to find an artist to help him with his Commodore 64 game-in-progress. Entering an Austin tabletop-gaming shop one day, he saw a beautiful picture of a gladiator hanging on the wall. The owner of the shop told him the picture had been drawn by a local artist, and offered to call the artist for him right then and there if Roberts was really interested in working with him. Roberts said yes, please do. The artist in question was none other than Denis Loubet, whose professional association with Richard Garriott stretched back to well before Origin Systems had existed, to when he’d drawn the box art for the California Pacific release of Akalabeth in 1980.

Denis Loubet

After years of working as a contractor, Loubet was just about to be hired as Origin’s first regular in-house artist. Nevertheless, he liked Roberts and thought his game had potential, and agreed to do the art for it as a moonlighting venture. Loubet soon showed what he was working on to Richard Garriott and Dallas Snell, the latter of whom tended to serve as a sort of liaison between the business side of the company, in the person of Robert Garriott, and the creative side, in the person of Richard. All three parties were as impressed by the work-in-progress as Loubet had been, and they invited Chris to Origin’s offices to ask if he’d be interested in publishing it through them. Prior to this point, Roberts had never even heard of Origin Systems or the Ultima series; he’d grown up immersed in the British gaming scene, where neither had any presence whatsoever. But he liked the people at Origin, liked the atmosphere around the place, and perhaps wasn’t aware enough of what the company represented to be leery of it in the way of other developers who were peddling promising projects around the industry. “After my experiences in England, which is like swimming in a big pool of sharks,” he remembers, “I felt comfortable dealing with Origin.”

Times of Lore

All thoughts of returning to England had now disappeared. Working from Origin’s offices, albeit still as a contracted outside developer rather than an employee, Roberts finished his game, which came to be called Times of Lore. In the course of its development, the game grew considerably in scope and ambition, and, as seemed only appropriate given the company that was to publish it, took on some light CRPG elements as well. In much of this, Roberts was inspired by David Joiner’s 1987 action-CRPG The Faery Tale Adventure. American influences aside, though, Times of Lore still fit best of all into the grand British tradition of free-scrolling, free-roaming 8-bit action/adventures, a sub-genre that verged on completely unknown to American computer gamers. Roberts made sure the whole game could fit into the Commodore 64’s memory at once to facilitate a cassette-based version for the European market.

Unfortunately, his game got to enjoy only a middling level of sales success in return for all his efforts. As if determined to confirm the conventional wisdom that had caused so many developers to steer clear of them, Origin released Times of Lore almost simultaneously with the Commodore 64 port of Ultima V in 1988, leaving Roberts’s game overshadowed by Lord British’s latest. And in addition to all the baggage that came with the Origin logo in the United States, Times of Lore suffered all the disadvantages of being a pioneer of sorts in Europe, the first Origin title to be pushed aggressively there via a new European distribution contract with MicroProse. While that market would undoubtedly have understood the game much better had they given it a chance, no one there yet knew what to make of the company whose logo was on the box. Despite its strengths, Times of Lore thus failed to break the pattern that had held true for Origin for so long. It turned into yet another non-Ultima that was also a non-hit.

Times of Lore

But whatever the relative disappointments, Times of Lore at least wasn’t a flop, and Chris Roberts stayed around as a valued member of the little Origin family. Part of the reason the Origin people wanted to keep him around was simply because they liked him so much. He nursed the same passions for fantasy and science fiction as most of them, with just enough of a skew provided by his British upbringing to make him interesting. And he positively radiated energy and enthusiasm. He’s never hard to find in Origin group shots of the time. His face stands out like that of a nerdy cherub — he had never lost his facial baby fat, making him look pudgier in pictures than he was in real life — as he beams his thousand-kilowatt smile at all and sundry. Still, it was hardly his personality alone that made him such a valued colleague; the folks at Origin also came to have a healthy respect for his abilities. Indeed, and as we’ve already seen in an earlier article, the interface of Times of Lore had a huge influence on that of no less vital an Origin game than Ultima VI.

Alas, Roberts’s own next game for Origin would be far less influential. After flirting for a while with the idea of doing a straightforward sequel to Times of Lore, he decided to adapt the engine to an even more action-oriented post-apocalyptic scenario. Bad Blood, Roberts’s first game for MS-DOS, was created in desultory fits and starts, one of those projects that limps to completion more out of inertia than passion. Released at last in 1990, it was an ugly flop on both sides of the Atlantic. Roberts blames marketplace confusion at least partially for its failure: “People who liked arcade-style games didn’t buy it because they thought Bad Blood would be another fantasy-role-play-style game. It was the worst of both worlds, a combination of factors that contributed to its lack of success.” In reality, though, the most telling factor of said combination was just that Bad Blood wasn’t very good, evincing little of the care that so obviously went into Times of Lore. Reviewers roundly panned it, and buyers gave it a wide berth. Thankfully for Chris Roberts’s future in the industry, the game that would make his name was already well along at Origin by the time Bad Blood finally trickled out the door.

Bad Blood

Had it come to fruition in its original form, Roberts’s third game for Origin would have marked even more of a departure for him than the actual end result would wind up being. Perhaps trying to fit in better with Origin’s established image, he had the idea of doing, as he puts it, “a space-conquest game where you take over star systems, move battleships around, and invade planets. It was going to be more strategic than my earlier games.” But Roberts always craved a little more adrenaline in his designs than such a description would imply, and it didn’t take him long to start tinkering with the formula. The game moved gradually from strategic battles between slow-moving dreadnoughts in space to manic dogfights between fighter planes in space. In other words, to frame the shift the way the science-fiction-obsessed Roberts might well have chosen, his inspiration for his space battles changed from Star Trek to Star Wars. He decided “it would be more fun flying around in a fighter than moving battleships around the screen”; note the (unconscious?) shift in this statement from the player as a disembodied hand “moving” battleships around to the player as an embodied direct participant “flying around” herself in fighters. Roberts took to calling his work-in-progress Squadron.

To bring off his idea for an embodied space-combat experience, Roberts would have to abandon the overhead views used by all his games to date in favor of a first-person out-the-cockpit view, like that used by a game he and every other BBC Micro veteran knew well, Ian Bell and David Braben’s Elite. “It was the first space game in which I piloted a ship in combat,” says Roberts of Elite, “and it opened my eyes to the possibilities of where it could go.” On the plus side, Roberts knew that this and any other prospective future games he might make for Origin would be developed on an MS-DOS machine with many times the processing power of the little BBC Micro (or, for that matter, the Commodore 64). On the negative side, Roberts wasn’t a veritable mathematics genius like Ian Bell, the mastermind behind Elite‘s 3D graphics. Nor could he get away in the current marketplace with the wire-frame graphics of Elite. So, he decided to cheat a bit, both to simplify his life and to up the graphics ante. Inspired by the graphics of the Lucasfilm Games flight simulator Battlehawks 1942, he used pre-rendered bitmap images showing ships from several different sides and angles, which could then be scaled to suit the player’s out-the-cockpit view, rather than making a proper, mathematically rigorous 3D engine built out of polygons. As becomes clear all too quickly to anyone who plays the finished game, the results could be a little wonky, with views of the ships suddenly popping into place rather than smoothly rotating. Nevertheless, the ships themselves looked far better than anything Roberts could possibly have hoped to achieve on the technology of the time using a more honest 3D engine.
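The scaled-bitmap trick described above can be sketched in a few lines. This is a hypothetical illustration, not Origin’s actual code: the renderer keeps a fixed set of pre-rendered views of each ship, picks the view whose angle is nearest the ship’s current facing, and scales it by distance. The constants and function names here are invented for the example.

```python
# Sketch of the pre-rendered-bitmap approach: rather than rotating a
# polygon model mathematically, choose the closest pre-rendered view of
# the ship and scale it for distance. Names and numbers are illustrative.

NUM_VIEWS = 16        # e.g. one bitmap every 22.5 degrees of facing
BASE_SIZE = 64        # sprite height in pixels at unit distance

def nearest_view(angle_degrees: float) -> int:
    """Index of the pre-rendered bitmap closest to this facing angle."""
    step = 360.0 / NUM_VIEWS
    return round(angle_degrees / step) % NUM_VIEWS

def scaled_size(distance: float) -> int:
    """Apparent sprite size, shrinking inversely with distance."""
    return max(1, round(BASE_SIZE / max(distance, 0.1)))
```

The “popping” the finished game exhibits is exactly the discrete jump from one `nearest_view` index to the next as a ship turns past the halfway point between two pre-rendered angles.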

Denis Loubet, Roberts’s old partner in crime from the early days of Times of Lore, agreed to draw a cockpit as part of what had to be yet another moonlighting gig for both of them; Roberts was officially still supposed to be spending his days at Origin on Bad Blood, while Loubet was up to his eyebrows in Ultima VI. Even at this stage, they were incorporating little visceral touches into Squadron, like the pilot’s hand moving the joystick around in time with what the player was doing with her own joystick in front of the computer screen. As the player’s ship got shot up, the damage was depicted visually there in the cockpit. Like the sparks and smoke that used to burst from the bridge controls on the old Star Trek episodes, it might not have made much logical sense — haven’t any of these space-faring societies invented fuses? — but it served the purpose of creating an embodied, visceral experience. Roberts:

It really comes from wanting to put the player in the game. I don’t want you to think you’re playing a simulation, I want you to think you’re really in that cockpit. When I visualized what it would be like to sit in a cockpit, those are the things I thought of.

I took the approach that I didn’t want to sacrifice that reality due to the game dynamics. If you would see wires hanging down after an explosion, then I wanted to include it, even if it would make it harder to figure out how to include all the instruments and readouts. I want what’s taking place inside the cockpit to be as real as what I’m trying to show outside it, in space. I’d rather show you damage as if you were there than just display something like “damage = 20 percent.” That’s abstract. I want to see it.

Squadron, then, was already becoming an unusually cinematic space-combat “simulation.” Because every action-movie hero needs a sidekick, Roberts added a wingman to the game, another pilot who would fly and fight at the player’s side. The player could communicate with the wingman in the midst of battle, passing him orders, and the wingman in turn would communicate back, showing his own personality; he might even refuse to obey orders on occasion.

As a cinematic experience, Squadron felt very much in tune with the way things in general were trending at Origin, to such an extent that one might well ask who was influencing whom. Like so many publishers in this era in which CD-ROM and full-motion video hovered alluringly just out of view on the horizon, Origin had begun thinking of themselves more and more in the terms of Hollywood. The official “product development structure” that was put in place around this time by Dallas Snell demanded an executive producer, a producer, an assistant producer, a director, an assistant director, and a lead writer for every game; of all the positions on the upper rungs of the chart, only those of lead artist and lead programmer wouldn’t have been listed in the credits of a typical Hollywood film. Meanwhile, Origin’s recent hire Warren Spector, who came to them with a master’s degree in film studies, brought his own ideas about games as interactive dramas that were less literal than Snell’s, but that would if anything prove even more of an influence on his colleagues’ developing views of just what it was Origin Systems really ought to be about. Just the previous year, Origin had released a game called Space Rogue, another of that long line of non-Ultima middling sellers, that had preceded Squadron in attempting to do Elite one better. A free-form player-directed game of space combat and trading, Space Rogue was in some ways much more ambitious than the more railroaded experience Roberts was now proposing. Yet there was little question of which game fit better with the current zeitgeist at Origin.

All of which does much to explain the warm reception accorded to Squadron when Chris Roberts, with Bad Blood finally off his plate, pitched it to Origin’s management formally in very early 1990. Thanks to all those moonlighting hours — as well as, one suspects, more than a few regular working hours — Roberts already had a 3D space-combat game that looked and played pretty great. A year or two earlier, that likely would have been that; Origin would have simply polished it up a little and shipped it. But now Roberts had the vision of building a movie around the game. Between flying a series of scripted missions, you would get to know your fellow pilots and follow the progress of a larger war between humanity and the Kilrathi, a race of savage cats in space.

Having finally made the hard decision to abandon the 8-bit market at the beginning of 1989, Origin was now pushing aggressively in the opposite direction from their old technological conservatism, being determined to create games that showed what the very latest MS-DOS machines could really do. Like Sierra before them, they had decided that if the only way to advance the technological state of the art among ordinary consumers was to release games whose hardware requirements were ahead of the curve — a reversal of the usual approach among game publishers, who had heretofore almost universally gone where the largest existing user base already was — then that’s what they would do. Squadron could become the first full expression of this new philosophy, being unapologetically designed to run well only on a cutting-edge 80386-based machine. In what would be a first for the industry, Chris Roberts even proposed demanding expanded memory beyond the traditional 640 K for the full audiovisual experience. For Roberts, stepping up from a Commodore 64, it was a major philosophical shift indeed. “Sod this, trying to make it work for the lowest common denominator—I’m just going to try and push it,” he said, and Origin was happy to hear it.

Ultima VI had just been completed, freeing personnel for another major project. Suspecting that Squadron might be the marketplace game changer he had sought for so long for Origin, Robert Garriott ordered a full-court press in March of 1990. He wanted his people to help Chris Roberts build his movie around his game, and he wanted them to do it in less than three months. They should have a preview ready to go for the Summer Consumer Electronics Show at the beginning of June, with the final product to ship very shortly thereafter.

Jeff George

Responsibility for the movie’s script was handed to Jeff George, one of the first of a number of fellow alumni of the Austin tabletop-game publisher Steve Jackson Games who followed Warren Spector to Origin. George was the first Origin employee hired explicitly to fill the role of “writer.” This development, also attributable largely to the influence of Spector, would have a major impact on Origin’s future games.

Obviously inspired by the ethical quandaries the Ultima series had become so known for over its last few installments, Chris Roberts had imagined a similarly gray-shaded world for his game, with scenarios that would cause the player to question whether the human empire she was fighting for was really any better than that of the Kilrathi. But George, to once again frame the issue in terms Roberts would have appreciated, pushed the game’s fiction toward the clear-cut good guys and bad guys of Star Wars, away from the more complicated moral universe of Star Trek. All talk of a human “empire,” for one thing, would have to go; everyone at Origin knew what their players thought of first when they thought of empires in space. Jeff George:

In the context of a space opera, empire had a bad connotation that would make people think they were fighting for the bad guys. The biggest influence I had on the story was to make it a little more black and white, where Chris had envisioned something grittier, with more shades of gray. I didn’t want people to worry about moral dilemmas while they were flying missions. That’s part of why it worked so well. You knew what you were doing, and knew why you were doing it. The good guys were really good, the bad guys were really bad.

The decision to simplify the political situation and sand away the thorny moral dilemmas demonstrates, paradoxical though it may first seem, a more sophisticated approach to narrative rather than the opposite. Some interactive narratives, like some non-interactive ones, are suited to exploring moral ambiguity. In others, though, the player just wants to fight the bad guys. While one can certainly argue that gaming has historically had far too many of the latter type and far too few of the former, there nevertheless remains an art to deciding which games are best suited for which.

Glen Johnson

Five more programmers and four more artists would eventually join what had been Chris Roberts and Denis Loubet’s little two-man band. With the timetable so tight, the artists were left to improvise large chunks of the narrative along with the game’s visuals. By imagining and drawing the “talking head” portraits of the various other pilots with which the player would interact, artist Glen Johnson wound up playing almost as big a role as Jeff George in crafting the fictional context for the game’s dogfights in space. Johnson:

I worked on paper first, producing eleven black-and-white illustrations. In most games, I would work from a written description of the character’s likes, dislikes, and personality. In this case, I just came up with the characters out of thin air, although I realized they wanted a mixture of men and women pilots. I assigned a call sign to each portrait.

Despite the lack of time at their disposal, the artists were determined to fit the movements of the characters’ mouths to the words of dialog that appeared on the screen, using techniques dating back to classic Disney animation. Said techniques demanded that all dialog be translated into its phonetic equivalent, something that could only be done by hand. Soon seemingly half the company was doing these translations during snatches of free time. Given that many or most players never even noticed the synchronized speech in the finished game, whether it was all worth it is perhaps a valid question, but the determination to go that extra mile in this regard does say much about the project’s priorities.

The music wound up being farmed out to a tiny studio specializing in videogame audio, one of vanishingly few of its kind at the time, which was run by a garrulous fellow named George Sanger, better known as “The Fat Man.” (No, he wasn’t terribly corpulent; that was sort of the joke.) Ever true to his influences, Chris Roberts’s brief to Sanger was to deliver something “between Star Wars and Star Trek: The Motion Picture.” Sanger and his deputy Dave Govett delivered in spades. Hugely derivative of John Williams’s work though the soundtrack was — at times it threatens to segue right into Williams’s famous Star Wars theme — it contributed hugely to the cinematic feel of the game. Origin was particularly proud of the music that played in the background when the player was actually flying in space; the various themes ebbed and swelled dynamically in response to the events taking place on the computer screen. It wasn’t quite the first time anyone had done something like this in a game, but no one had ever managed to do it in quite this sophisticated a way.
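The event-driven scoring described above amounts to a small piece of selection logic: the engine watches the state of the battle and switches (or crossfades) to a matching musical cue. The following is a hypothetical sketch of that idea only; the cue names and priority rules are invented, not drawn from Origin’s or The Fat Man’s actual system.

```python
# Illustrative sketch of dynamic music selection: game state maps to a
# musical cue, with more urgent situations taking priority. All cue
# names and rules here are invented for the example.
def pick_cue(enemies_near: int, player_damaged: bool, missile_locked: bool) -> str:
    """Return the name of the cue that should be playing right now."""
    if missile_locked:          # most urgent: incoming missile theme
        return "danger"
    if player_damaged:          # limping home after taking hits
        return "desperate"
    if enemies_near > 0:        # ordinary dogfighting theme
        return "combat"
    return "cruise"             # quiet patrol music between engagements
```

A real implementation would also need to handle the transitions musically — waiting for a bar line or crossfading — which is where most of the sophistication Origin was proud of actually lived.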

The guiding theme of the project remained the determination to create an embodied experience for the player. Chris Roberts cites the interactive movies of Cinemaware, which could be seen as the prototypes for the sort of game he was now trying to perfect, as huge influences in this respect as in many others. Roberts:

I didn’t want anything that made you sort of… pulled you out of being in this world. I didn’t want that typical game UI, or “Here’s how many lives you’ve got, here’s what high score you’ve got.” I always felt that broke the immersion. If you wanted to save the game you’d go to the barracks and you’d click on the bunk. If you wanted to exit, you’d click on the airlock. It was all meant to be in that world and so that was what the drive was. I love story and narrative and I think you can use that story and narrative to tie your action together and that will give your action meaning and context in a game. That was my idea and that was what really drove what I was doing.

The approach extended to the game’s manual. Harking back to the beloved scene-setting packaging of Infocom, the manual, which was written by freelancer Aaron Allston, took the form of Claw Marks, “The Onboard Magazine of TCS Tiger’s Claw” — the Tiger’s Claw being the name of the spaceborne aircraft carrier from which the player would be flying all of the missions. Like the artists, Allston would wind up almost inadvertently creating vital pieces of the game as a byproduct of the compressed schedule. “I couldn’t really determine everything at that point in development,” he remembers, “so, in some cases, specifically for the tactics information, we made some of it up and then retrofitted it and adjusted the code in the game to make it work.”

Once again in the spirit of creating a cohesive, embodied experience for the player, Roberts wanted to get away from the save-and-restore dance that was so typical of ludic narratives of the era. Therefore, instead of structuring the game’s 40 missions as a win-or-go-home linear stream, he created a branching mission tree in which the player’s course through the narrative would be dictated by her own performance. There would, in other words, be no way to definitively lose other than by getting killed. Roberts would always beg players to play the game “honestly,” beg them not to reload and replay each mission until they flew it perfectly. Only in this way would they get the experience he had intended for them to have.

Warren Spector

As the man responsible for tying all of the elements together to create the final experience, Roberts bore the titles of director and producer under Origin’s new cinematic nomenclature. He worked under the watchful eye of Squadron‘s co-producer Warren Spector, who, being older and in certain respects wiser, was equipped to handle the day-to-day administrative tasks that Roberts wasn’t. Spector:

When I came on as producer, Chris was really focused on the direction he wanted to take with the game. He knew exactly where he was going, and it would have been hard to deflect him from that course. It would have been crazy to even want to, so Chris and I co-produced the game. Where his talent dropped out, mine started, and vice versa. We did a task breakdown, and I ended up updating, adjusting, and tracking scheduling and preparing all the documentation. He handled the creative and qualitative issues. We both juggled the resources.

In implying that his own talent “dropped out” when it came to creative issues, Spector is selling himself about a million dollars short. He was a whirling dervish of creative energy throughout the seven years he spent with Origin, if anything even more responsible than Richard Garriott for the work that came out of the company under the Ultima label during this, the franchise’s most prolific period. But another of the virtues which allowed him to leave such a mark on the company was an ability to back off, to defer to the creative visions of others when it was appropriate. Recognizing that no one knew Chris Roberts’s vision like Chris Roberts, he was content in the case of Squadron to act strictly as the facilitator of that vision. In other words, he wasn’t too proud to simply play the organizer.

Still, it became clear early on that no combination of good organization and long hours would allow Squadron to ship in June. The timetable slipped to an end-of-September ship date, perfect to capitalize on the Christmas rush.

Although Squadron wouldn’t ship in June, the Summer Consumer Electronics Show loomed with as much importance as ever as a chance to show off the game-to-be and to drum up excitement that might finally end the sniggering about “Ultima Systems.” Just before the big show, Origin’s lawyers delivered the sad news that calling the game Squadron would be a bad idea thanks to some existing trademarks on the name. After several meetings, Wingleader emerged as the consensus choice for a new name, narrowly beating out Wing Commander. It was thus under the former title that the world at large got its first glimpse of what would turn into one of computer gaming’s most iconic franchises. Martin Davies, Origin’s Vice President of Sales:

I kicked hard to have a demo completed for the show. It was just a gut reaction, but I knew I needed to flood retail and distribution channels with the demo. Before the release of the game, I wanted the excitement to grow so that the confidence level would be extremely high. If we could get consumers beating a path in and out of the door, asking whether the game was out, distribution would respond.

With Wingleader still just a bunch of art and sound assets not yet wired up to the core game they were meant to complement, an interactive demo was impossible. Instead Chris Roberts put together a demo on videotape, alternating clips of the battles in space with clips of whatever other audiovisual elements he could assemble from what the artists and composers had managed to complete. Origin brought a big screen and a booming sound system out to Chicago for the show; the latter prompted constant complaints from other exhibitors. The noise pollution was perfect for showing the world that there was now more to Origin Systems than intricate quests and ethical dilemmas — that they could do aesthetic maximalism as well as anyone in their industry, pushing all of the latest hardware to its absolute limit in the process. It was a remarkable transformation for a company that just eighteen months before had been doing all development on the humble little 8-bit Apple II and Commodore 64. Cobbled together though it was, the Wingleader demo created a sensation at CES.

Indeed, one can hardly imagine a better demonstration of how the computer-game industry as a whole was changing than the game that had once been known as Squadron, was now known as Wingleader, and would soon go on to fame as Wing Commander. In my next article, I’ll tell the story of how the game would come to be finished and sold, along with the even more important story of what it would mean for the future of digital entertainment.

(Sources: the books Wing Commander I and II: The Ultimate Strategy Guide by Mike Harrison and Game Design Theory and Practice by Richard Rouse III; Retro Gamer 59 and 123; Questbusters of July 1989, August 1990, and April 1991; Computer Gaming World of September 1989 and November 1992; Amiga Computing of December 1988. Online sources include documents hosted at the Wing Commander Combat Information Center, US Gamer‘s profile of Chris Roberts, The Escapist‘s history of Wing Commander, Paul Dean’s interview with Chris Roberts, and Matt Barton’s interview with George “The Fat Man” Sanger. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Wing Commander I and II can be purchased in a package together with all of their expansion packs from GOG.com.)

 
 


The 640 K Barrier

There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.

The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.

— with my apologies to The Right Stuff[1]

[1] Yes, that is quite possibly the nerdiest thing I’ve ever written.

The idea that the original IBM PC, the machine that made personal computing safe for corporate America, was a hastily slapped-together stopgap has been vastly overstated by popular technology pundits over the decades since its debut back in August of 1981. Whatever the realities of budgets and scheduling with which its makers had to contend, there was a coherent philosophy behind most of the choices they made that went well beyond “throw this thing together as quickly as possible and get it out there before all these smaller companies corner the market for themselves.” As a design, the IBM PC favored robustness, longevity, and expandability, all qualities IBM had learned the value of through their many years of experience providing businesses and governments with big-iron solutions to their most important data-processing needs. To appreciate the wisdom of IBM’s approach, we need only consider that today, long after the likes of the Commodore Amiga and the original Apple Macintosh architecture, whose owners so loved to mock IBM’s unimaginative beige boxes, have passed into history, most of our laptop and desktop computers — including modern Macs — can trace the origins of their hardware back to what that little team of unlikely business-suited visionaries accomplished in an IBM branch office in Boca Raton, Florida.

But of course no visionary has 20/20 vision. For all the strengths of the IBM PC, there was one area where all the jeering by owners of sexier machines felt particularly well-earned. Here lay a crippling weakness, born not so much of the hardware found in that first IBM PC as of the operating system the marketplace chose to run on it, one that would continue to vex programmers and ordinary users for two decades, not finally fading away until Microsoft’s release of Windows XP in 2001 put to bed the last legacies of MS-DOS in mainstream computing. MS-DOS, dubbed the “quick and dirty” operating system during the early days of its development, is likely the piece of software in computing history with the most lopsided contrast between the total number of hours put into its development and the total number of hours it spent in use, on millions and millions of computers all over the world. The 640 K barrier, the demon all those users spent so much time and energy battling for so many years, was just one of the more prominent consequences of corporate America’s adoption of such a blunt instrument as MS-DOS as its standard. Today we’ll unpack the problem that was memory management under MS-DOS, and we’ll also examine the problem’s multifarious solutions, all of them to one degree or another ugly and imperfect.


 

The original IBM PC was built around an Intel 8088 microprocessor, a cost-reduced and somewhat crippled version of an earlier chip called the 8086. (IBM’s decision to use the 8088 instead of the 8086 would have huge importance for the expansion buses of this and future machines, but the differences between the two chips aren’t important for our purposes today.) Despite functioning as a 16-bit chip in most ways, the 8088 had a 20-bit address space, meaning it could address a maximum of 1 MB of memory. Let’s consider why this limitation should exist.

Memory, whether in your brain or in your computer, is of no use to you if you can’t keep track of where you’ve put things so that you can retrieve them again later. A computer’s memory is therefore indexed by bytes, with every single byte having its own unique address. These addresses, numbered from 0 to the upper limit of the processor’s address space, allow the computer to keep track of what is stored where. Twenty bits can express the numbers from 0 to 1,048,575 — that is, 1,048,576 unique addresses, or 1 MB. This, then, is the maximum amount of memory which the 8088, with its 20-bit address bus, can handle. Such a limitation hardly felt like a deal breaker to the engineers who created the IBM PC. Indeed, it’s difficult to overemphasize what a huge figure 1 MB really was when they released the machine in 1981, in which year the top-of-the-line Apple II had just 48 K of memory and plenty of other competing machines shipped with no more than 16 K.
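The arithmetic behind these limits is easy to verify for oneself. The following trivial Python fragment — Python serving purely as a calculator here — restates the numbers, including the larger address spaces of the chips we’ll meet later in this article:

```python
# Number of unique addresses reachable over an n-bit address bus.
def address_space(bits):
    return 2 ** bits

# The 8088's 20 address lines give 1 MB.
assert address_space(20) == 1_048_576       # total bytes addressable: 1 MB
assert address_space(20) - 1 == 1_048_575   # the highest address itself

# Looking ahead: the 80286's 24 bits and the 80386's 32 bits.
assert address_space(24) == 16 * 1024 * 1024    # 16 MB
assert address_space(32) == 4 * 1024 ** 3       # 4 GB
```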

A processor needs to address other sorts of memory besides the pool of general-purpose RAM which is available for running applications. There’s also ROM memory — read-only memory, burned inviolably into chips — that contains essential low-level code needed for the computer to boot itself up, along with, in the case of the original IBM PC, an always-available implementation of the BASIC programming language. (The rarely used BASIC in ROM would be phased out of subsequent models.) And some areas of RAM as well are set aside from the general pool for special purposes, like the fully 128 K of addresses given to video cards to keep track of the onscreen display in the original IBM PC. All of these special types of memory must be accessed by the CPU, must be given their own unique addresses to facilitate that, and must thus be subtracted from the address space available to the general pool.

IBM’s engineers were quite generous in drawing the boundary between their general memory pool and the area of addresses allocated to special purposes. Focused on expandability and longevity as they were, they reserved big chunks of “special” memory for purposes that hadn’t even been imagined yet. In all, they reserved the upper three-eighths of the available addresses for specialized purposes actual or potential, leaving the lower five-eighths — 640 K — to the general pool. In time, this first 640 K of memory would become known as “conventional memory,” the remaining 384 K — some of which would be ROM rather than RAM — as “high memory.” The official memory map which IBM published upon the debut of the IBM PC looked like this:

It’s important to understand when looking at a memory map like this one that the existence of a logical address therein doesn’t necessarily mean that any physical memory is connected to that address in any given real machine. The first IBM PC, for instance, could be purchased with as little as 16 K of conventional memory installed, and even a top-of-the-line machine had just 256 K, leaving most of the conventional-memory space vacant. Similarly, early video cards used just 32 K or 64 K of the 128 K of address space offered to them in high memory. The 640 K barrier was thus only a theoretical limitation early on, one few early users or programmers ever even noticed.
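The raw numbers behind that five-eighths split are likewise easy to check. This little Python sketch is nothing more than the arithmetic of the preceding paragraphs:

```python
ONE_MB = 2 ** 20                 # 1,048,576 bytes: the 8088's full address space

conventional = ONE_MB * 5 // 8   # lower five-eighths: the general memory pool
high_memory  = ONE_MB * 3 // 8   # upper three-eighths: reserved for special purposes

assert conventional == 640 * 1024    # 640 K of conventional memory
assert high_memory == 384 * 1024     # 384 K of high memory
assert conventional == 0xA0000       # the barrier sits at hex address A0000
assert conventional + high_memory == ONE_MB
```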

That blissful state of affairs, however, wouldn’t last very long. As IBM’s creations — joined, soon enough, by lots of clones — became the standard for American business, more and more advanced applications appeared, craving more and more memory alongside more and more processing power. Already by 1984 the 640 K barrier had gone from a theoretical to a very real limitation, and customers were beginning to demand that IBM do something about it. In response, IBM that year released the PC/AT, built around Intel’s new 80286 microprocessor, which boasted a 24-bit address space good for 16 MB of memory. To unlock all that potential extra memory, IBM made the commonsense decision to extend the memory map above the specialized high-memory area that ended at 1 MB, making all addresses beyond 1 MB a single pool of “extended memory” available for general use.

Problem solved, right? Well, no, not really — else this would be a much shorter article. Due more to software than hardware, all of this potential extended memory proved not to be of much use for the vast majority of people who bought PC/ATs. To understand why this should be, we need to examine the deadly embrace between the new processor and the old operating system people were still running on it.

The 80286 was designed to be much more than just a faster version of the old 8086/8088. Developing the chip before IBM PCs running MS-DOS had come to dominate business computing, Intel hadn’t allowed the need to stay compatible with that configuration to keep them from designing a next-generation chip that would help to take computing to where they saw it as wanting to go. Intel believed that microcomputers were at the stage at which the big institutional machines had been a couple of decades earlier, just about ready to break free of what computer scientist Brian L. Stuart calls the “Triangle of Ones”: one user running one program at a time on one machine. At the very least, Intel believed, the second leg of the Triangle must soon fall; everyone recognized that multitasking — running several programs at a time and switching freely between them — was a much more efficient way to do complex work than laboriously shutting down and starting up application after application. But unfortunately for MS-DOS, the addition of multitasking complicates the life of an operating system to an absolutely staggering degree.

Operating systems are of course complex subjects worthy of years or a lifetime of study. We might, however, collapse their complexities down to a few fundamental functions: to provide an interface for the user to work with the computer and manage her programs and files; to manage the various tasks running on the computer and allocate resources among them; and to act as a buffer or interface between applications and the underlying hardware of the computer. That, anyway, is what we expect at a minimum of our operating systems today. But for a computer ensconced within the Triangle of Ones, the second and third functions were largely moot: with only one program allowed to run at a time, resource-management concerns were nonexistent, and, without the need for a program to be concerned about clashing with other programs running at the same time, bare-metal programming — manipulating the hardware directly, without passing requests through any intervening layer of operating-system calls — was often considered not only acceptable but the expected approach. In this spirit, MS-DOS provided just 27 function calls to programmers, the vast majority of them dealing only with disk and file management. (Compare that, my fellow programmers, with the modern Windows or OS X APIs!) For everything else, banging on the bare metal was fine.

We can’t even begin here to address all of the complications that are introduced when we add multitasking into the equation, asking the operating system in the process to fully embrace all three of the core functions listed above. Memory management alone, the one aspect we will look deeper into today, becomes complicated enough. A program which is sharing a machine with other programs can no longer have free run of the memory map, placing whatever it wants to wherever it wants to; to do so risks overwriting the code or data of another program running on the system. Instead the operating system must demand that individual programs formally request the memory they’d like to use, and then must come up with a way to keep a program, whether due to bugs or malice, from running roughshod over areas of memory that it hasn’t been granted.

Or perhaps not. The Commodore Amiga, the platform which pioneered multitasking on personal computers in 1985, didn’t so much solve the latter part of this problem as punt it away. An application program is expected to request from the Amiga’s operating system any memory that it requires. The operating system then returns a pointer to a block of memory of the requested size, and trusts the application not to write to memory outside of these bounds. Yet nothing besides the programmer’s skill and good nature actually prevents such unauthorized memory access from happening. Every application on the Amiga, in other words, can write to any address in the machine’s memory, whether that address be properly allocated to it or not. Screen memory, free memory, another program’s data, another program’s code — all are fair game to the errant program. Such unauthorized memory access will almost always eventually result in a total system crash. A non-malicious programmer who wants her program to be a good citizen would of course never intentionally write to memory she hasn’t properly requested, but bugs of this nature are notoriously easy to create and notoriously hard to track down, and on the Amiga a single instance of one can bring down not only the offending program but the entire operating system. With all due respect to the Amiga’s importance as the first multitasking personal computer, this is obviously not the ideal way to implement it.

A far more sustainable approach is to take the extra step of tracking and protecting the memory that has been allocated to each program. Memory protection is usually accomplished using what’s known as virtual memory: when a program requests memory, it’s returned not a true address within the system’s memory pool but rather a virtual address that’s translated back into the real address to which it corresponds every time the program accesses its data. Each program is thus effectively sandboxed from everything else, allowed to read from and write to only its own data. Only the lowest levels of the operating system have global access to the memory pool as a whole.
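A drastically simplified sketch of the idea follows, using a page-table lookup for clarity. (The real 80286 implemented protection through segment descriptors rather than pages — paging didn’t arrive until the 80386 — so treat this strictly as an illustration of the principle of virtual-to-real translation, not of the 80286’s actual mechanism.)

```python
PAGE_SIZE = 4096

class AddressSpace:
    """One program's private view of memory: virtual pages mapped to
    physical frames. Any access outside the mapping is a fault."""
    def __init__(self, page_table):
        self.page_table = page_table   # virtual page number -> physical frame

    def translate(self, virtual_addr):
        page, offset = divmod(virtual_addr, PAGE_SIZE)
        if page not in self.page_table:
            raise MemoryError("protection fault: address not mapped")
        return self.page_table[page] * PAGE_SIZE + offset

# Program A owns physical frames 5 and 9; it cannot reach anything else.
a = AddressSpace({0: 5, 1: 9})
assert a.translate(100) == 5 * 4096 + 100   # lands inside frame 5
try:
    a.translate(3 * 4096)                   # outside A's sandbox...
except MemoryError:
    pass                                    # ...so the access is refused
```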

Implementing such memory protection in software alone, however, would have been an untenable drain on the limited processing resources of 1980s hardware — a fact which goes a long way toward explaining its absence from the Amiga. Intel therefore decided to give software a leg up via hardware. They built into the 80286 a memory-management unit that could automatically translate from virtual to real memory addresses and vice versa, making this constantly ongoing process fairly transparent even to the operating system.

Nevertheless, the operating system must know about this capability, must in fact be written very differently if it’s to run on a CPU with memory protection built into its circuitry. Intel recognized that it would take time for such operating systems to be created for the new chip, and recognized that compatibility with the earlier 8086/8088 chips would be a very good thing to have in the meantime. They therefore built two possible operating modes into the 80286. In “protected mode” — the mode they hoped would eventually come to be used almost universally — the chip’s full potential would be realized, including memory protection and the ability to address up to 16 MB of memory. In “real mode,” the 80286 would function essentially like a turbocharged 8086/8088, with no memory-protection capabilities and with the old limitation on addressable memory of 1 MB still in place. Assuming that in the early days at least the new chip would need to run on operating systems with no knowledge of its full capabilities, Intel made the 80286 default to real mode on startup. An operating system which did know about the 80286 and wanted to bring out its full potential could switch it to protected mode at boot-up and be off to the races.

It’s at the intersection between the 80286 and the operating system that Intel’s grand plans for the future of their new chip went awry. An overwhelming percentage of the early 80286s were used in IBM PC/ATs and clones, and an overwhelming percentage of those machines were running MS-DOS. Microsoft’s erstwhile “quick and dirty” operating system knew nothing of the 80286’s full capabilities. Worse, trying to give it knowledge of those capabilities would have to entail a complete rewrite which would break compatibility with all existing MS-DOS software. Yet the whole reason MS-DOS was popular in the first place — it certainly wasn’t because of a generous feature set, a friendly interface, or any aesthetic appeal — was that very same huge base of business software. Getting users to make the leap to some hypothetical new operating system in the absence of software to run on it would be as difficult as getting developers to write programs for an operating system with no users. It was a chicken-or-the-egg situation, and neither chicken nor egg was about to stick its neck out anytime soon.

IBM was soon shipping thousands upon thousands of PC/ATs every month, and the clone makers were soon shipping even more 80286-based machines of their own. Yet at least 95 percent of those machines were idling along at only a fraction of their potential, thanks to the already creakily archaic MS-DOS. For all these users, the old 640 K barrier remained as high as ever. They could stuff their machines full of extended memory if they liked, but they still couldn’t access it. And of course the multitasking that the 80286 was supposed to have enabled remained as foreign a concept to MS-DOS as a GPS unit to a Model T. The only solution IBM offered those who complained about the situation was to run another operating system. And indeed, there were a number of alternatives to MS-DOS available for the PC/AT and other 80286-based machines, including several variants of the old institutional-computing favorite Unix — one of them even from Microsoft — and new creations like Digital Research’s Concurrent DOS, which struggled with mixed results to wedge in some degree of MS-DOS compatibility. Still, the only surefire way to take full advantage of MS-DOS’s huge software base was to run the real — in more ways than one now! — MS-DOS, and this is what the vast majority of people with 80286-equipped machines wound up doing.

Meanwhile the very people making the software which kept MS-DOS the only viable choice for most users were feeling the pinch of being confined to 640 K more painfully almost by the month. Finally Lotus Corporation — makers of the Lotus 1-2-3 spreadsheet package that ruled corporate America, the greatest single business-software success story of their era — decided to use their clout to do something about it. They convinced Intel to join them in devising a scheme for breaking the 640 K barrier without abandoning MS-DOS. What they came up with was one mother of an ugly kludge — a description the scheme has in common with virtually all efforts to break through the 640 K barrier.

Looking through the sparsely populated high-memory area which the designers of the original IBM PC had so generously carved out, Lotus and Intel realized it should be possible on almost any extant machine to identify a contiguous 64 K chunk of those addresses which wasn’t being used for anything. This chunk, they decided, would be the gateway to potentially many more megabytes installed elsewhere in the machine. Using a combination of software and hardware, they implemented what’s known as a bank-switching scheme. The 64 K chunk of high-memory addresses was divided into four segments of 16 K, each of which could serve as a lens focused on a 16 K segment of additional memory above and beyond 1 MB. When the processor accessed the addresses in high memory, it would actually be accessing the data at whatever sections of the additional memory the lenses were currently pointing to. The four lenses could be moved around at will, giving access, albeit in a roundabout way, to however much extra memory the user had installed. The additional memory unlocked by the scheme was dubbed “expanded memory.” The name’s unfortunate similarity to “extended memory” would cause much confusion over the years to come; from here on, we’ll call it by its common acronym of “EMS.”

All those gobs of extra memory wouldn’t quite come for free: applications would have to be altered to check for the existence of EMS memory and make use of it, and there would remain a distinct difference between conventional memory and EMS memory with which programmers would always have to reckon. Likewise, the overhead of constantly moving those little lenses around made EMS memory considerably slower to access than conventional memory. On the brighter side, though, EMS worked under MS-DOS with only the addition of a single device driver during startup. And, since the hardware mechanism for moving the lenses around was completely external to the CPU, it would even work on machines that weren’t equipped with the new 80286.
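The mechanics of the scheme can be sketched in a few lines of Python. The 16 K page size and the four-lens page frame are as described above, and 0xD0000 is a plausible spot for the frame in high memory; the class and its names are otherwise invented purely for illustration:

```python
PAGE = 16 * 1024         # each lens covers 16 K
FRAME_BASE = 0xD0000     # an assumed unused 64 K chunk of high memory

class EMSBoard:
    def __init__(self, pages_installed):
        # The expansion board's memory, modeled as a list of 16 K pages.
        self.pages = [bytearray(PAGE) for _ in range(pages_installed)]
        self.mapping = [0, 1, 2, 3]    # which page each of the four lenses shows

    def map_page(self, lens, page):
        """Point one of the four 16 K lenses at a different page."""
        self.mapping[lens] = page

    def read(self, addr):
        """A CPU read inside the page frame lands on whichever page is mapped."""
        offset = addr - FRAME_BASE
        lens, within = divmod(offset, PAGE)
        return self.pages[self.mapping[lens]][within]

board = EMSBoard(pages_installed=128)   # 128 x 16 K = 2 MB of expanded memory
board.pages[77][0] = 42
board.map_page(0, 77)                   # swing lens 0 over to page 77...
assert board.read(0xD0000) == 42        # ...and the same address now shows new data
```

The roundabout nature of the access is plain here: to reach a byte beyond the frame’s current view, a program must first re-aim a lens, which is exactly the overhead the next paragraph describes.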

This diagram shows the different types of memory available on PCs of the mid-1980s. In blue, we see the original 1 MB memory map of the IBM PC. In green, we see a machine equipped with additional extended memory. And in orange we see a machine equipped with additional expanded memory.

Shortly before the scheme made its official debut at a COMDEX trade show in May of 1985, Lotus and Intel convinced a crucial third partner to come aboard: Microsoft. “It’s garbage! It’s a kludge!” said Bill Gates. “But we’re going to do it.” With the combined weight of Lotus, Intel, and Microsoft behind it, EMS took hold as the most practical way of breaking the 640 K barrier. Imperfect and kludgy though it was, software developers hurried to add support for EMS memory to whatever programs of theirs could practically make use of it, while hardware manufacturers rushed EMS memory boards onto the market. EMS may have been ugly, but it was here today and it worked.

At the same time that EMS was taking off, however, extended memory wasn’t going away. Some hardware makers — most notably IBM themselves — didn’t want any part of EMS’s ugliness. Software makers therefore continued to probe at the limits of machines equipped with extended memory, still looking for a way to get at it from within the confines of MS-DOS. What if they momentarily switched the 80286 into protected mode, just for as long as they needed to manipulate data in extended memory, then went back into real mode? It seemed like a reasonable idea — except that Intel, never anticipating that anyone would want to switch modes on the fly like this, had neglected to provide a way to switch an 80286 in protected mode back into real mode. So, proponents of extended memory had to come up with a kludge even uglier than the one that allowed EMS memory to function. They could force the 80286 back into real mode, they realized, by resetting it entirely, just as if the user had rebooted her computer. The 80286 would go through its self-check again — a process that admittedly absorbed precious milliseconds — and then pick back up where it left off. It was, as Microsoft’s Gordon Letwin memorably put it, like “turning off the car to change gears.” It was staggeringly kludgy, it was horribly inefficient, but it worked in its fashion. Given the inefficiencies involved, the scheme was mostly used to implement virtual disks stored in extended memory, which aren’t accessed nearly so constantly as an application’s main data space.

In 1986, the 32-bit 80386, Intel’s latest and greatest chip, made its public bow at the heart of the Compaq Deskpro 386 rather than an IBM machine, a landmark moment signaling the slow but steady shift of business computing’s power center from IBM to Microsoft and the clone makers using their operating system. While working on the new chip, Intel had had time to see how the 80286 was actually being used in the wild, and had faced the reality that MS-DOS was likely destined to be cobbled onto for years to come rather than replaced in its entirety with something better. They therefore made a simple but vitally important change to the 80386 amidst its more obvious improvements. In addition to being able to address an inconceivable total of 4 GB of memory in protected mode thanks to its 32-bit address space, the 80386 could be switched between protected mode and real mode on the fly if one desired, without needing to be constantly reset.

In freeing programmers from that massive inefficiency, the 80386 cracked open the door that much further to making practical use of extended memory in MS-DOS. In 1988, the old EMS consortium of Lotus, Intel, and Microsoft came together once again, this time with the addition to their ranks of the clone manufacturer AST; the absence of IBM is, once again, telling. Together they codified a standard approach to extended memory on 80386 and later processors, which corresponded essentially to the scheme I’ve already described in the context of the 80286, but with a simple command to the 80386 to switch back to real mode replacing the resets. They called it the eXtended Memory Specification; memory accessed in this way soon became known universally as “XMS” memory. Under XMS as under EMS, a new device driver would be loaded into MS-DOS. Ordinary real-mode programs could then call this driver to access extended memory; the driver would do the needful switching to protected mode, copy blocks of data from extended memory into conventional memory or vice versa, then switch the processor back to real mode when it was time to return control to the program. It was still inelegant, still a little inefficient, and still didn’t use the capabilities of Intel’s latest processors in anything like the way Intel’s engineers had intended them to be used; true multitasking still remained a pipe dream somewhere off in a shadowy future. Owners of sexier machines like the Macintosh and Amiga, in other words, still had plenty of reason to mock and scoff. In most circumstances, working with XMS memory was actually slower than working with EMS memory. The primary advantage of XMS was that it let programs work with much bigger chunks of non-conventional memory at one time than the four 16 K chunks that EMS allowed. Whether any given program chose EMS or XMS came to depend on which set of advantages and disadvantages best suited its purpose.
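The shape of an XMS transfer — switch to protected mode, copy a block between extended and conventional memory, switch back — can be modeled in a few lines of Python. The class and method names here are invented for illustration and correspond to no real driver’s API; the mode switches, which on real hardware do the actual heavy lifting, appear only as comments:

```python
class XMSDriver:
    """Toy model of an XMS device driver: real-mode programs can't see
    extended memory directly, so they ask the driver to copy blocks."""
    def __init__(self, extended_kb):
        self.extended = bytearray(extended_kb * 1024)  # memory above 1 MB

    def copy_to_extended(self, conventional_buf, ext_offset):
        # switch_to_protected_mode()  -- hypothetical; done in hardware
        self.extended[ext_offset:ext_offset + len(conventional_buf)] = conventional_buf
        # switch_to_real_mode()

    def copy_from_extended(self, ext_offset, length):
        # switch_to_protected_mode()
        block = bytes(self.extended[ext_offset:ext_offset + length])
        # switch_to_real_mode()
        return block

driver = XMSDriver(extended_kb=1024)           # 1 MB of extended memory
driver.copy_to_extended(b"spreadsheet row", 0)  # stash data above 1 MB
assert driver.copy_from_extended(0, 15) == b"spreadsheet row"
```

Note what the model makes obvious: the program never works on extended memory in place, only on copies shuttled through conventional memory — which is why XMS could be slower than EMS even while handling much bigger blocks at a time.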

The arrival of XMS along with the ongoing use of EMS memory meant that MS-DOS now had two competing memory-management solutions. Buyers now had to figure out not only whether they had enough extra memory to run a program but whether they had the right kind of extra memory. Ever accommodating, hardware manufacturers began shipping memory boards that could be configured as either EMS or XMS memory — whatever the application you were running at the moment happened to require.

The next stage in the slow crawl toward parity with other computing platforms in the realm of memory management would be the development of so-called “DOS extenders,” software to allow applications themselves to run in protected mode, thus giving them direct access to extended memory without having to pass their requests through an inefficient device driver. An application built using a DOS extender would only need to switch the processor to real mode when it needed to communicate with the operating system. The development of DOS extenders was driven by Microsoft’s efforts to turn Windows, which like seemingly everything else in business computing ran on top of MS-DOS, into a viable alternative to the command line and a viable challenger to the Macintosh. That story is thus best reserved for a future article, when we look more closely at Windows itself. As it is, the story that I’ve told so far today moves us nicely into the era of computer-gaming history we’ve reached on the blog in general.

In said era, the MS-DOS machines that had heretofore been reserved for business applications were coming into homes, where they were often used to play a new generation of games taking advantage of the VGA graphics, sound cards, and mice sported by the latest systems. Less positively, all of the people wanting to play these new games had to deal with the ramifications of a 640 K barrier that could still be skirted only imperfectly. As we’ve seen, both EMS and XMS imposed to one degree or another a performance penalty when accessing non-conventional memory. And because games were among the most performance-sensitive applications of all, that first 640 K of lightning-fast conventional memory was especially precious to them.

In the first couple of years of MS-DOS’s gaming dominance, developers dealt with all of the issues that came attached to using memory beyond 640 K by the simple expedient of not using any memory beyond 640 K. But that solution was compatible neither with developers’ growing ambitions for their games nor with the gaming public’s growing expectations of them.

The first harbinger of what was to come was Origin Systems’s September 1990 release Wing Commander, which in its day was renowned — and more than a little feared — for pushing the contemporary state of the art in hardware to its limits. Even Wing Commander didn’t go so far as to absolutely require memory beyond 640 K, but it did use it to make the player’s audiovisual experience snazzier if it was present. Setting a precedent future games would largely follow, it was quite inflexible in its approach, demanding EMS — as opposed to XMS — memory. In the future, gamers would have to become all too familiar with the differences between the two standards, and how to configure their machines to use one or the other. Setting another precedent, Wing Commander’s “installation guide” included a section on “memory usage” that was required reading in order to get things working properly. In the future, such sections would only grow in length and complexity, and would need to be pored over by long-suffering gamers with far more concentrated attention than anything in the manual having to do with how to actually play the games they purchased.

In Accolade’s embarrassing Leisure Suit Larry knockoff Les Manley in: Lost in LA, the title character explains EMS and XMS memory to some nubile companions. The ironic thing was that anyone who wished to play the latest games on an MS-DOS machine really did need to know this stuff, or at least have a friend who did.

Thus began the period of almost a decade, remembered with chagrin but also often with an odd sort of nostalgia by old-timers today, in which gamers spent hours monkeying about with MS-DOS’s “config.sys” and “autoexec.bat” files and swapping in and out various third-party utilities in the hope of squeezing out the last few kilobytes of conventional memory that Game X needed to run. The techniques they came to employ were legion.

In the process of developing Windows, Microsoft had discovered that the kernel of MS-DOS itself, a fairly tiny program thanks to its sheer age, could be stashed into the first 64 K of memory beyond 1 MB and still accessed like conventional memory on an 80286 or later processor in real mode, thanks to what was essentially an undocumented technical glitch in the design of those processors. Gamers thus learned to include the line “DOS=HIGH” in their configuration files, freeing up a precious block of conventional memory. Likewise, there was enough unused space scattered around in the 384 K of upper memory on most machines to stash many or all of MS-DOS’s device drivers there instead of in conventional memory. Thus “DOS=HIGH” soon became “DOS=HIGH,UMB,” the second parameter telling the computer to make use of these so-called “upper-memory blocks” and thereby save that many kilobytes more.
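On MS-DOS 5 and later, these incantations lived in CONFIG.SYS. A minimal illustrative sketch might look like the following — the driver paths and the mouse driver here are hypothetical examples, and the exact files and switches varied by DOS version and hardware:

```
REM HIMEM.SYS supplies XMS memory and the high memory area above 1 MB
DEVICE=C:\DOS\HIMEM.SYS
REM EMM386 (80386+ only) with the RAM switch emulates EMS and opens up
REM the unused upper-memory blocks between 640 K and 1 MB
DEVICE=C:\DOS\EMM386.EXE RAM
REM Load the DOS kernel into the high memory area and enable UMBs
DOS=HIGH,UMB
REM DEVICEHIGH stashes drivers in upper memory instead of conventional memory
DEVICEHIGH=C:\MOUSE\MOUSE.SYS
```

Running the MEM command afterward reported how much conventional memory had actually been reclaimed; squeezing out a few kilobytes more usually meant reordering lines like these or juggling EMM386’s parameters, over and over, until the game in question finally deigned to load.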

These were the most basic techniques, the starting points. Suffice to say that things got a lot more complicated from there, turning into a baffling tangle of tweaks, some saving mere bytes rather than kilobytes of conventional memory, but all of them important if one was to hope to run games that by 1993 would be demanding 604 K of 640 K for their own use. That owners of machines which by that point typically contained memories in the multi-megabytes should have to squabble with the operating system over mere handfuls of bytes was made no less vexing by being so comically absurd. And every new game seemed to up the ante, seemed to demand that much more conventional memory. Those with a sunnier disposition or a more technical bent of mind treated the struggle to get each successive purchase running as the game before the game, as it were. Everyone else gnashed their teeth and wondered for the umpteenth time if they might not have been better off buying a console where games Just Worked. The only thing that made it all worthwhile was the mixture of relief, pride, and satisfaction that ensued when you finally got it all put together just right and the title screen came up and the intro music sprang to life — if, that is, you’d managed to configure your sound card properly in the midst of all your other travails. Such was the life of the MS-DOS gamer.

Before leaving the issue of the 640 K barrier behind in exactly the way that all those afflicted by it for so many years were so conspicuously unable to do, we have to address Bill Gates’s famous claim, allegedly made at a trade show in 1981, that “640 K ought to be enough for anybody.” The quote has been bandied about for years as computer-industry legend, seeming to confirm as it does the stereotype of Bill Gates as the unimaginative dirty trickster of his industry, as opposed to Steve Jobs the guileless visionary (the truth is, needless to say, far more complicated). Sadly for the stereotypers, however, the story of the quote is similar to all too many legends in the sense that it almost certainly never happened. Gates himself, for one, vehemently denies ever having said any such thing. Fred Shapiro, for another, editor of The Yale Book of Quotations, conducted an exhaustive search for a reputable source for the quote in 2008, going so far as to issue a public plea in The New York Times for anyone possessing knowledge of such a source to contact him. More than a hundred people did so, but none of them could offer up the smoking gun Shapiro sought, and he was left more certain than ever that the comment was “apocryphal.” So, there you have it. Blame Bill Gates all you want for the creaky operating system that was the real root cause of all of the difficulties I’ve spent this article detailing, but don’t ever imagine he was stupid enough to say that. “No one involved in computers would ever say that a certain amount of memory is enough for all time,” said Gates in 2008. Anyone doubting the wisdom of that assertion need only glance at the history of the IBM PC.

(Sources: the books Upgrading and Repairing PCs, 3rd edition by Scott Mueller and Principles of Operating Systems by Brian L. Stuart; Computer Gaming World of June 1993; Byte of January 1982, November 1984, and March 1992; Byte’s IBM PC special issues of Fall 1985 and Fall 1986; PC Magazine of May 14 1985, January 14 1986, May 30 1989, June 13 1989, and June 27 1989; the episode of the Computer Chronicles television show entitled “High Memory Management”; the online article “The ‘640K’ quote won’t go away — but did Gates really say it?” on Computerworld.)

Footnotes

1 Yes, that is quite possibly the nerdiest thing I’ve ever written.

Ultima VI

After Richard Garriott and his colleagues at Origin Systems finished each Ultima game — after the manic final crunch of polishing and testing, after the release party, after the triumphant show appearances and interviews in full Lord British regalia — there inevitably arose the daunting question of what to do next. Garriott had set a higher standard for the series than that of any of its competitors almost from the very beginning, when he’d publicly declared that no Ultima would ever reuse the engine of its predecessor, that each new entry in the series would represent a significant technological leap over what had come before. And just to add to that pressure, starting with Ultima IV he’d begun challenging himself to make each new Ultima a major thematic statement that also built on what had come before. Both of these bars became harder and harder to meet as the series advanced.

As if that didn’t present enough of a burden, each individual entry in the series came with its own unique psychological hurdles for Garriott to overcome. For example, by the time he started thinking about what Ultima V should be he’d reached the limits of what a single talented young man like himself could design, program, write, and draw all by himself on his trusty Apple II. It had taken him almost a year — a rather uncomfortable year for his brother Robert and the rest of Origin’s management — to accept that reality and to begin to work in earnest on Ultima V with a team of others.

The challenge Garriott faced after finishing and releasing that game in March of 1988 was in its way even more emotionally fraught: the challenge of accepting that, just as he’d reached the limits of what he could do alone on the Apple II a couple of years before, he’d now reached the limits of what any number of people could do on Steve Wozniak’s humble little 8-bit creation. Ultima V still stands today as one of the most ambitious things anyone has ever done on an Apple II; it was hard at the time and remains hard today to imagine how Origin could possibly push the machine much further. Yet that wasn’t even the biggest problem associated with sticking with the platform; the biggest problem could be seen on each monthly sales report, which showed the Apple II’s numbers falling off even faster than those of the Commodore 64, the only other viable 8-bit computer remaining in the American market.

After serving as the main programmer on Ultima V, John Miles’s only major contribution to Ultima VI was the opening sequence. The creepy poster of a pole-dancing centaur hanging on the Avatar’s wall back on Earth has provoked much comment over the years…

Garriott was hardly alone at Origin in feeling hugely loyal to the Apple II, the only microcomputer he’d ever programmed. While most game developers in those days ported their titles to many platforms, almost all had one which they favored. Just as Epyx knew the Commodore 64 better than anyone else, Sierra had placed their bets on MS-DOS, and Cinemaware was all about the Commodore Amiga, Origin was an Apple II shop through and through. Of the eleven games they’d released from their founding in 1983 through to the end of 1988, all but one had been born and raised on an Apple II.

Reports vary on how long and hard Origin tried to make Ultima VI work on the Apple II. Richard Garriott, who does enjoy a dramatic story even more than most of us, has claimed that Origin wound up scrapping nine or even twelve full months of work; John Miles, who had done the bulk of the programming for Ultima V and was originally slated to fill the same role for the sequel, estimated to me that “we probably spent a few months on editors and other utilities before we came to our senses.” At any rate, by March of 1989, the one-year anniversary of Ultima V’s release, the painful decision had been made to switch not only Ultima VI but all of Origin’s ongoing and future projects to MS-DOS, the platform that was shaping up as the irresistible force in American computer gaming. A slightly petulant but nevertheless resigned Richard Garriott slapped an Apple sticker over the logo of the anonymous PC clone now sitting on his desk and got with the program.

Richard Garriott with an orrery, one of the many toys he kept at the recently purchased Austin house he called Britannia Manor.

Origin was in a very awkward spot. Having frittered away a full year recovering from the strain of making the previous Ultima, trying to decide what the next Ultima should be, and traveling down the technological cul de sac that was now the Apple II, they simply had to have Ultima VI finished — meaning designed and coded from nothing on an entirely new platform — within one more year if the company was to survive. Origin had never had more than a modestly successful game that wasn’t an Ultima; the only way their business model worked was if Richard Garriott every couple of years delivered a groundbreaking new entry in their one and only popular franchise and it sold 200,000 copies or more.

John Miles, lacking a strong background in MS-DOS programming and the C language in which all future Ultimas would be coded, was transferred off the team to get himself up to speed and, soon enough, to work on middleware libraries and tools for the company’s other programmers. Replacing him on the project in Origin’s new offices in Austin, Texas, were Herman Miller and Cheryl Chen, a pair of refugees from the old offices in New Hampshire, which had finally been shuttered completely in January of 1989. It was a big step for both of them to go from coding what until quite recently had been afterthought MS-DOS versions of Origin’s games to taking a place at the center of the most critical project in the company. Fortunately, both would prove more than up to the task.

Just as Garriott had quickly learned to like the efficiency of not being personally responsible for implementing every single aspect of Ultima V, he soon found plenty to like about the switch to MS-DOS. The new platform had four times the memory of the Apple II machines Origin had been targeting before, along with (comparatively) blazing-fast processors, hard drives, 256-color VGA graphics, sound cards, and mice. A series that had been threatening to burst the seams of the Apple II now had room to roam again. For the first time with Ultima VI, time rather than technology was the primary constraint on Garriott’s ambitions.

But arguably the real savior of Ultima VI was not a new computing platform but a new Origin employee: one Warren Spector, who would go on to join Garriott and Chris Roberts — much more on him in a future article — as one of the three world-famous game designers to come out of the little collective known as Origin Systems. Born in 1955 in New York City, Spector had originally imagined for himself a life in academia as a film scholar. After earning his Master’s from the University of Texas in 1980, he’d spent the next few years working toward his PhD and teaching undergraduate classes. But he had also discovered tabletop gaming at university, from Avalon Hill war games to Dungeons & Dragons. When a job as a research archivist which he’d thought would be his ticket to the academic big leagues unexpectedly ended after just a few months, he wound up as an editor and eventually a full-fledged game designer at Steve Jackson Games, maker of card games, board games, and RPGs, and a mainstay of Austin gaming circles. It was through Steve Jackson, like Richard Garriott a dedicated member of Austin’s local branch of the Society for Creative Anachronism, that Spector first became friendly with the gang at Origin; he also discovered Ultima IV, a game that had a profound effect on him. He left Austin in March of 1987 for a sojourn in Wisconsin with TSR, the makers of Dungeons & Dragons, but, jonesing for the warm weather and good barbecue of the city that had become his adopted hometown, he applied for a job with Origin two years later. Whatever role his acquaintance with Richard Garriott and some of the other folks there played in getting him an interview, it certainly didn’t get him a job all by itself; Spector claims that Dallas Snell, Robert Garriott’s right-hand man running the business side of the operation, grilled him for an incredible nine hours before judging him worthy of employment. (“May you never have to live through something like this just to get a job,” he wishes for all and sundry.) Starting work at Origin on April 12, 1989, he was given the role of producer on Ultima VI, the high man on the project totem pole excepting only Richard Garriott himself.

Age 33 and married, Spector was one of the oldest people employed by this very young company; he realized to his shock shortly after his arrival that he had magazine subscriptions older than Origin’s up-and-coming star Chris Roberts. A certain wisdom born of his age, along with a certain cultural literacy born of all those years spent in university circles, would serve Origin well in the seven years he would remain there. Coming into a company full of young men who had grand dreams of, as their company’s tagline would have it, “creating worlds,” but whose cultural reference points didn’t usually reach much beyond Lord of the Rings and Star Wars, Spector was able to articulate Origin’s ambitions for interactive storytelling in a way that most of the others could not, and in time would use his growing influence to convince management of the need for a real, professional writing team to realize those ambitions. In the shorter term — i.e., in the term of the Ultima VI project — he served as some badly needed adult supervision, systematizing the process of development by providing everyone on his team with clear responsibilities and by providing the project as a whole with the when and what of clear milestone goals. The project was so far behind that everyone involved could look forward to almost a year of solid crunch time as it was; Spector figured there was no point in making things even harder by letting chaos reign.

On the Ultima V project, it had been Dallas Snell who had filled the role of producer, but Snell, while an adept organizer and administrator, wasn’t a game designer or a creative force by disposition. Spector, though, proved himself capable of tackling the Ultima VI project from both sides, hammering out concrete design documents from the sometimes abstracted musings of Richard Garriott, then coming up with clear plans to bring them to fruition. In the end, the role he would play in the creation of Ultima VI was as important as that of Garriott himself. Having learned to share the technical burden with Ultima V — or by now to pass it off entirely; he never learned C and would never write a single line of code for any commercial game ever again — Garriott was now learning to share the creative burden as well, another necessary trade-off if his ever greater ambitions for his games were to be realized.

If you choose not to import an Ultima V character into Ultima VI, you go through the old Ultima IV personality test, complete with gypsy soothsayer, to come up with your personal version of the Avatar. By this time, however, with the series getting increasingly plot-heavy and the Avatar’s personality ever more fleshed-out within the games, the personality test was starting to feel a little pointless. Blogger Chet Bolingbroke, the “CRPG Addict,” cogently captured the problems inherent in insisting that all of these disparate Ultima games had the same hero:
 
Then there’s the Avatar. Not only is it unnecessary to make him the hero of the first three games, as if the Sosarians and Britannians are so inept they always need outside help to solve their problems, but I honestly think the series should have abandoned the concept after Ultima IV. In that game, it worked perfectly. The creators were making a meta-commentary on the very nature of playing role-playing games. The Avatar was clearly meant to be the player himself or herself, warped into the land through the “moongate” of his or her computer screen, represented as a literal avatar in the game window. Ultima IV was a game that invited the player to act in a way that was more courageous, more virtuous, more adventurous than in the real world. At the end of the game, when you’re manifestly returned to your real life, you’re invited to “live as an example to thine own people”–to apply the lesson of the seven virtues to the real world. It was brilliant. They should have left it alone.
 
Already in Ultima V, though, they were weakening the concept. In that game, the Avatar is clearly not you, but some guy who lives alone in his single-family house of a precise layout. But fine, you rationalize, all that is just a metaphor for where you actually do live. By Ultima VI, you have some weird picture of a pole-dancing centaur girl on your wall, you’re inescapably a white male with long brown hair.

Following what had always been Richard Garriott’s standard approach to making an Ultima, the Ultima VI team concentrated on building their technology and then building a world around it before adding a plot or otherwise trying to turn it all into a real game with a distinct goal. Garriott and others at Origin would always name Times of Lore, a Commodore 64 action/CRPG hybrid written by Chris Roberts and published by Origin in 1988, as the main influence on the new Ultima VI interface, the most radically overhauled version of same ever to appear in an Ultima title. That said, it should be noted that Times of Lore itself lifted many or most of its own innovations from The Faery Tale Adventure, David Joiner’s deeply flawed but beautiful and oddly compelling Commodore Amiga action/CRPG of 1987. By way of completing the chain, much of Times of Lore’s interface was imported wholesale into Ultima VI; even many of the onscreen icons looked exactly the same. The entire game could now be controlled, if the player liked, with a mouse, with all of the keyed commands duplicated as onscreen buttons; this forced Origin to reduce the “alphabet soup” of previous Ultima interfaces, which by Ultima V had used every letter in the alphabet plus some additional key combinations, to ten buttons, with the generic “use” as the workhorse taking the place of a multitude of specifics.

Another influence, one which Origin was for obvious reasons less eager to publicly acknowledge than that of Times of Lore, was FTL’s landmark 1987 CRPG Dungeon Master, a game whose influence on its industry can hardly be overstated. John Miles remembers lots of people at Origin scrambling for time on the company’s single Atari ST in order to play it soon after its release. Garriott himself has acknowledged being “ecstatic” for his first few hours playing it at all the “neat new things I could do.” Origin co-opted Dungeon Master’s graphical approach to inventory management, including the soon-to-be ubiquitous “paper doll” method of showing what characters were wearing and carrying.

Taking a cue from theories about good interface design dating back to Xerox PARC and Apple’s Macintosh design team, The Faery Tale Adventure, Times of Lore, and Dungeon Master had all abandoned “modes”: different interfaces — in a sense entirely different programs — which take over as the player navigates through the game. The Ultima series, like most 1980s CRPGs, had heretofore been full of these modes. There was one mode for wilderness travel; another for exploring cities, towns, and castles; another, switching from a third-person overhead view to a first-person view like Wizardry (or, for that matter, Dungeon Master), for dungeon delving. And when a fight began in any of these modes, the game switched to yet another mode for resolving the combat.

Ultima VI collapsed all of these modes down into a single unified experience. Wilderness, cities, and dungeons now all appeared on a single contiguous map on which combat also occurred, alongside everything else possible in the game; Ultima’s traditionally first-person dungeons were now displayed using an overhead view like the rest of the game. From the standpoint of realism, this was a huge step back; speaking in strictly realistic terms, either the previously immense continent of Britannia must now be about the size of a small suburb or the Avatar and everyone else there must now be giants, building houses that sprawled over dozens of square miles. But, as we’ve had plenty of occasion to discuss in previous articles, the most realistic game design doesn’t always make the best game design. From the standpoint of creating an immersive, consistent experience for the player, the new interface was a huge step forward.

As the world of Britannia had grown more complex, the need to give the player a unified window into it had grown to match, in ways that were perhaps more obvious to the designers than they might have been to the players. The differences between the first-person view used for dungeon delving and the third-person view used for everything else had become a particular pain. Richard Garriott had this to say about the problems that were already dogging him when creating Ultima V, and the changes he thus chose to make in Ultima VI:

Everything that you can pick up and use [in Ultima V] has to be able to function in 3D [i.e., first person] and also in 2D [third person]. That meant I had to either restrict the set of things players can use to ones that I know I can make work in 3D or 2D, or make them sometimes work in 2D but not always work in 3D or vice versa, or they will do different things in one versus the other. None of those are consistent, and since I’m trying to create an holistic world, I got rid of the 3D dungeons.

Ultima V had introduced the concept of a “living world” full of interactive everyday objects, along with characters who went about their business during the course of the day, living lives of their own. Ultima VI would build on that template. The world was still constructed, jigsaw-like, from piles of tile graphics, an approach dating all the way back to Ultima I. Whereas that game had offered 16 tiles, however, Ultima VI offered 2048, all or almost all of them drawn by Origin’s most stalwart artist, Denis Loubet, whose association with Richard Garriott stretched all the way back to drawing the box art for the California Pacific release of Akalabeth. Included among these building blocks were animated tiles of several frames — so that, for instance, a water wheel could actually spin inside a mill and flames in a fireplace could flicker. Dynamic, directional lighting of the whole scene was made possible by the 256 colors of VGA. While Ultima V had already had a day-to-night cycle, in Ultima VI the sun actually rose in the east and set in the west, and torches and other light sources cast a realistic glow onto their surroundings.

256 of the 2048 tiles from which the world of Ultima VI was built.

In a clear signal of where the series’s priorities now lay, other traditional aspects of CRPGs were scaled back, moving the series further from its roots in tabletop Dungeons & Dragons. Combat, having gotten as complicated and tactical as it ever would with Ultima V, was simplified, with a new “auto-combat” mode included for those who didn’t want to muck with it at all; the last vestiges of distinct character races and classes were removed; ability scores were boiled down to just three numbers for Strength, Dexterity, and Intelligence. The need to mix reagents in order to cast spells, one of the most mind-numbingly boring aspects of a series that had always made you do far too many boring things, was finally dispensed with; I can’t help but imagine legions of veteran Ultima players breathing a sigh of relief when they read in the manual that “the preparation of a spell’s reagents is performed at the moment of spellcasting.” The dodgy parser-based conversation system of the last couple of games, which had required you to try typing in every noun mentioned by your interlocutor on the off chance that it would elicit vital further information, was made vastly less painful by the simple expedient of highlighting in the text those subjects into which you could inquire further.

Inevitably, these changes didn’t always sit well with purists, then or now. Given the decreasing interest in statistics and combat evinced by the Ultima series as time went on, as well as the increasing emphasis on what we might call solving the puzzles of its ever more intricate worlds, some have accused later installments of the series of being gussied-up adventure games in CRPG clothing; “the last real Ultima was Ultima V” isn’t a hard sentiment to find from a vocal minority on the modern Internet. What gives the lie to that assertion is the depth of the world modeling, which makes these later Ultimas flexible in ways that adventure games aren’t. Everything found in the world has, at a minimum, a size, a weight, and a strength. Say, then, that you’re stymied by a locked door. There might be a set-piece solution for the problem in the form of a key you can find, steal, or trade for, but it’s probably also possible to beat the door down with a sufficiently big stick and a sufficiently strong character, or if all else fails to blast it open with a barrel of dynamite. Thus your problems can almost never become insurmountable, even if you screw up somewhere else. Very few other games from Ultima VI’s day made any serious attempt to venture down this path. Infocom’s Beyond Zork tried, somewhat halfheartedly, and largely failed at it; Sierra’s Hero’s Quest was much more successful at it, but on nothing like the scale of an Ultima. Tellingly, almost all of the “alternate solutions” to Ultima VI’s puzzles emerge organically from the simulation, with no designer input whatsoever. Richard Garriott:

I start by building a world which you can interact with as naturally as possible. As long as I have the world acting naturally, if I build a world that is prolific enough, that has as many different kinds of natural ways to act and react as possible, like the real world does, then I can design a scenario for which I know the end goal of the story. But exactly whether I have to use a key to unlock the door, or whether it’s an axe I pick up to chop down the door, is largely irrelevant.

The complexity of the world model was such that Ultima VI became the first installment that would let the player get a job to earn money in lieu of the standard CRPG approach of killing monsters and taking their loot. You can buy a sack of grain from a local farmer, take the grain to a mill and grind it into flour, then sell the flour to a baker — or sneak into his bakery at night to bake your own bread using his oven. Even by the standards of today, the living world inside Ultima VI is a remarkable achievement — not to mention a godsend to those of us bored with killing monsters; you can be very successful in Ultima VI whilst doing very little killing at all.

A rare glimpse of Origin’s in-house Ultima VI world editor, which looks surprisingly similar to the game itself.

Plot spoilers begin!

It wasn’t until October of 1989, just five months before the game absolutely, positively had to ship, that Richard Garriott turned his attention to the Avatar’s reason for being in Britannia this time around. The core idea behind the plot came to him during a night out on Austin’s Sixth Street: he decided he wanted to pitch the Avatar into a holy war against enemies who, in classically subversive Ultima fashion, turn out not to be evil at all. In two or three weeks spent locked together alone in a room, subsisting on takeout Chinese food, Richard Garriott and Warren Spector created the “game” part of Ultima VI from this seed, with Spector writing it all down in a soy-sauce-bespattered notebook. Here Spector proved himself more invaluable than ever. He could corral Garriott’s sometimes unruly thoughts into a coherent plan on the page, whilst offering plenty of contributions of his own. And he, almost uniquely among his peers at Origin, commanded enough of Garriott’s respect — was enough of a creative force in his own right — that he could rein in the bad and/or overambitious ideas that in previous Ultimas would have had to be attempted and proved impractical to their originator. Given the compressed development cycle, this contribution too was vital. Spector:

An insanely complicated process, plotting an Ultima. I’ve written a novel, I’ve written [tabletop] role-playing games, I’ve written board games, and I’ve never seen a process this complicated. The interactions among all the characters — there are hundreds of people in Britannia now, hundreds of them. Not only that, but there are hundreds of places and people that players expect to see because they appeared in five earlier Ultimas.

Everybody in the realm ended up being a crucial link in a chain that adds up to this immense, huge, wonderful, colossal world. It was a remarkably complicated process, and that notebook was the key to keeping it all under control.

The chain of information you follow in Ultima VI is, it must be said, far clearer than in any of the previous games. Solving this one must still be a matter of methodically talking to everyone and assembling a notebook full of clues — i.e., of essentially recreating Garriott and Spector’s design notebook — but there are no outrageous intuitive leaps required this time out, nor any vital clues hidden in outrageously out-of-the-way locations. For the first time since Ultima I, a reasonable person can reasonably be expected to solve this Ultima without turning it into a major life commitment. The difference is apparent literally from your first moments in the game: whereas Ultima V dumps you into a hut in the middle of the wilderness — you don’t even know where in the wilderness — with no direction whatsoever, Ultima VI starts you in Lord British’s castle, and your first conversation with him immediately provides you with your first leads to run down. From that point forward, you’ll never be at a total loss for what to do next as long as you do your due diligence in the form of careful note-taking. Again, I have to attribute much of this welcome new spirit of accessibility and solubility to the influence of Warren Spector.

Ultima VI pushes the “Gargoyles are evil!” angle hard early on, going so far as to have the seemingly demonic beasts nearly sacrifice you to whatever dark gods they worship. This of course only makes the big plot twist, when it arrives, all the more shocking.

At the beginning of Ultima VI, the Avatar — i.e., you — is called back to Britannia from his homeworld of Earth yet again by the remarkably inept monarch Lord British to deal with yet another crisis which threatens his land. Hordes of terrifyingly demonic-looking Gargoyles are pouring out of fissures which have opened up in the ground everywhere and making savage war upon the land. They’ve seized and desecrated the eight Shrines of Virtue, and are trying to get their hands on the Codex of Ultimate Wisdom, the greatest symbol of your achievements in Ultima IV.

But, in keeping with the shades of gray the series had begun to layer over the Virtues with Ultima V, nothing is quite as it seems. In the course of the game, you discover that the Gargoyles have good reason to hate and fear humans in general and you the Avatar in particular, even if those reasons are more reflective of carelessness and ignorance on the part of you and Lord British’s peoples than they are of malice. To make matters worse, the Gargoyles are acting upon a religious prophecy — conventional religion tends to take a beating in Ultima games — and have come to see the Avatar as nothing less than the Antichrist in their own version of the Book of Revelation. As your understanding of their plight grows, your goal shifts from that of ridding the land of the Gargoyle scourge by violent means to that of walking them back from attributing everything to a foreordained prophecy and coming to a peaceful accommodation with them.

Ultima VI’s subtitle, chosen very late in the development process, is as subtly subversive as the rest of the plot. Not until very near the end of the game do you realize that The False Prophet is in fact you, the Avatar. As the old cliché says, there are two sides to every story. Sadly, the big plot twist was already spoiled by Richard Garriott in interviews before Ultima VI was even released, so vanishingly few players have ever gotten to experience its impact cold.

When discussing the story of Ultima VI, we shouldn’t ignore the real-world events that were showing up on the nightly news while Garriott and Spector were writing it. Mikhail Gorbachev had just made the impossibly brave decision to voluntarily dissolve the Soviet empire and let its vassal states go their own way, and just like that the Cold War had ended, not in the nuclear apocalypse so many had anticipated as its only possible end game but rather in the most blessed of all anticlimaxes in human history. For the first time in a generation, East was truly meeting West again, and each side was discovering that the other wasn’t nearly as demonic as they had been raised to believe. On November 10, 1989, just as Garriott and Spector were finishing their design notebook, an irresistible tide of mostly young people burst through Berlin’s forbidding Checkpoint Charlie to greet their counterparts on the other side, as befuddled guards, the last remnants of the old order, looked on and wondered what to do. It was a time of extraordinary change and hope, and the message of Ultima VI resonated with the strains of history.

Plot spoilers end.

When Garriott and Spector emerged from their self-imposed quarantine, the first person to whom they gave their notebook was an eccentric character with strong furry tendencies who had been born as David Shapiro, but who was known to one and all at Origin as Dr. Cat. Dr. Cat had been friends with Richard Garriott for almost as long as Denis Loubet, having first worked at Origin for a while when it was still being run out of Richard’s parents’ garage in suburban Houston. A programmer by trade — he had done the Commodore 64 port of Ultima V — Dr. Cat was given the de facto role of head writer for Ultima VI, apparently because he wasn’t terribly busy with anything else at the time. Over the next several months, he wrote most of the dialog for most of the many characters the Avatar would need to speak with in order to finish the game, parceling the remainder of the work out among a grab bag of other programmers and artists, whoever had a few hours or days to spare.

Origin Systems was still populating the games with jokey cameos drawn from Richard Garriott’s friends, colleagues, and family as late as Ultima VI. Thankfully, this along with other aspects of the “programmer text” syndrome would finally end with the next installment in the series, for which a real professional writing team would come aboard. More positively, do note the keyword highlighting in the screenshot above, which spared players untold hours of aggravating noun-guessing.

Everyone at Origin felt the pressure by now, but no one carried a greater weight on his slim shoulders than Richard Garriott. If Ultima VI flopped, or even just wasn’t a major hit, that was that for Origin Systems. For all that he loved to play His Unflappable Majesty Lord British in public, Garriott was hardly immune to the pressure of having dozens of livelihoods dependent on what was at the end of the day, no matter how much help he got from Warren Spector or anyone else, his game. His stress tended to go straight to his stomach. He remembers being in “constant pain”; sometimes he’d just “curl up in the corner.” Having stopped shaving or bathing regularly, strung out on caffeine and junk food, he looked more like a homeless man than a star game designer — much less a regal monarch — by the time Ultima VI hit the homestretch. On the evening of February 9, 1990, with the project now in the final frenzy of testing, bug-swatting, and final-touch-adding, he left Origin’s offices to talk to some colleagues having a smoke just outside. When he opened the security door to return, a piece of the door’s apparatus — in fact, an eight-pound chunk of steel — fell off and smacked him in the head, opening up an ugly gash and knocking him out cold. His panicked colleagues, who at first thought he might be dead, rushed him to the emergency room. Once he had had his head stitched up, he set back to work. What else was there to do?

Ultima VI shipped on time in March of 1990, two years almost to the day after Ultima V, and Richard Garriott’s fears (and stomach cramps) were soon put to rest; it became yet another 200,000-plus-selling hit. Reviews were uniformly favorable if not always ecstatic; it would take Ultima fans, traditionalists that so many of them were, a while to come to terms with the radically overhauled interface that made this Ultima look so different from the Ultimas of yore. Not helping things was the welter of bugs, some of them of the potentially showstopping variety, that the game shipped with (in years to come Origin would become almost as famous for their bugs as for their ambitious virtual world-building). In time, most if not all old-school Ultima fans were comforted as they settled in and realized that at bottom you tackled this one pretty much like all the others, trekking around Britannia talking to people and writing down the clues they revealed until you put together all the pieces of the puzzle. Meanwhile Origin gradually fixed the worst of the bugs through a series of patch disks which they shipped to retailers to pass on to their customers, or to said customers directly if they asked for them. Still, both processes did take some time, and the reaction to this latest Ultima was undeniably a bit muted — a bit conflicted, one might even say — in comparison to the last few games. It perhaps wasn’t quite clear yet where or if the Ultima series fit on these newer computers in this new decade.

Both the muted critical reaction and that sense of uncertainty surrounding the game have to some extent persisted to this day. Firmly ensconced though it apparently is in the middle of the classic run of Ultimas, from Ultima IV through Ultima VII, that form the bedrock of the series’s legacy, Ultima VI is the least cherished of that cherished group today, the least likely to be named as the favorite of any random fan. It lacks the pithy justification for its existence that all of the others can boast. Ultima IV was the great leap forward, the game that dared to posit that a CRPG could be about more than leveling up and collecting loot. Ultima V was the necessary response to its predecessor’s unfettered idealism; the two games together can be seen to form a dialog on ethics in the public and private spheres. And, later, Ultima VII would be the pinnacle of the series in terms not only of technology but also, and even more importantly, in terms of narrative and thematic sophistication. But where does Ultima VI stand in this group? Its plea for understanding rather than extermination is as important and well-taken today as it’s ever been, yet its theme doesn’t follow as naturally from Ultima V as that game’s had from Ultima IV, nor is it executed with the same sophistication we would see in Ultima VII. Where Ultima VI stands, then, would seem to be on a somewhat uncertain no man’s land.

Indeed, it’s hard not to see Ultima VI first and foremost as a transitional work. On the surface, that’s a distinction without a difference; every Ultima, being part of a series that was perhaps more than any other in the history of gaming always in the process of becoming, is a bridge between what had come before and what would come next. Yet in the case of Ultima VI the tautology feels somehow uniquely true. The graphical interface, huge leap though it is over the old alphabet soup, isn’t quite there yet in terms of usability. It still lacks a drag-and-drop capability, for instance, to make inventory management and many other tasks truly intuitive, while the cluttered onscreen display combines vestiges of the old, such as a scrolling textual “command console,” with this still imperfect implementation of the new. The prettier, more detailed window on the world is welcome, but winds up giving such a zoomed-in view in the half of a screen allocated to it that it’s hard to orient yourself. The highlighted keywords in the conversation engine are also welcome, but are constantly scrolling off the screen, forcing you to either lawnmower through the same conversations again and again to be sure not to miss any of them or to jot them down on paper as they appear. There’s vastly more text in Ultima VI than in any of its predecessors, but perhaps the kindest thing to be said about Dr. Cat as a writer is that he’s a pretty good programmer. All of these things would be fixed in Ultima VII, a game — or rather games; there were actually two of them, for reasons we’ll get to when the time comes — that succeeded in becoming everything Ultima VI had wanted to be. To use the old playground insult, everything Ultima VI can do Ultima VII can do better. One thing I can say, however, is that the place the series was going would prove so extraordinary that it feels more than acceptable to me to have used Ultima VI as a way station en route.

But in the even more immediate future for Origin Systems was another rather extraordinary development. This company that the rest of the industry jokingly referred to as Ultima Systems would release the same year as Ultima VI a game that would blow up even bigger than this latest entry in the series that had always been their raison d’être. I’ll tell that improbable story soon, after a little detour into some nuts and bolts of computer technology that were becoming very important — and nowhere more so than at Origin — as the 1990s began.

(Sources: the books Dungeons and Dreamers: The Rise of Computer Game Culture from Geek to Chic by Brad King and John Borland, The Official Book of Ultima, Second Edition by Shay Addams, and Ultima: The Avatar Adventures by Rusel DeMaria and Caroline Spector; ACE of April 1990; Questbusters of November 1989, January 1990, March 1990, and April 1990; Dragon of July 1987; Computer Gaming World of March 1990 and June 1990; Origin’s in-house newsletter Point of Origin of August 7 1991. Online sources include Matt Barton’s interviews with Dr. Cat and Warren Spector’s farewell letter from the Wing Commander Combat Information Center’s document archive. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Ultima VI is available for purchase from GOG.com in a package that also includes Ultima IV and Ultima V.)

 


Opening the Gold Box, Part 5: All That Glitters is Not Gold

SSI entered 1989 a transformed company. What had been a niche maker of war games for grognards had now become one of the computer-game industry’s major players thanks to the first fruits of the coveted TSR Dungeons & Dragons license. Pool of Radiance, the first full-fledged Dungeons & Dragons CRPG and the first in a so-called “Gold Box” line of same, was comfortably outselling the likes of Ultima V and The Bard’s Tale III, and was well on its way to becoming SSI’s best-selling game ever by a factor of four. To accommodate their growing employee rolls, SSI moved in 1989 from their old offices in Mountain View, California, which had gotten so crowded that some people were forced to work in the warehouse using piles of boxed games for desks, to much larger, fancier digs in nearby Sunnyvale. Otherwise it seemed that all they had to do was keep on keeping on, keep on riding Dungeons & Dragons for all it was worth — and, yes, maybe release a war game here and there as well, just for old times’ sake.

One thing that did become more clear than ever over the course of the year, however, was that not all Dungeons & Dragons products were created equal. Dungeon Masters Assistant Volume II: Characters & Treasures sold just 13,516 copies, leading to the quiet ending of the line of computerized aids for the tabletop game that had been one of the three major pillars of SSI’s original plans for Dungeons & Dragons. A deviation from that old master plan called War of the Lance, an attempt to apply SSI’s experience with war games to TSR’s Dragonlance campaign setting, did almost as poorly, selling 15,255 copies. Meanwhile the “Silver Box” line of action-oriented games, the second of the pillars, continued to perform well: its second installment, Dragons of Flame, sold 55,711 copies. Despite that success, though, 1989 would also mark the end of the line for the Silver Box, due to a breakdown in relations with the British developers behind those games. Going into the 1990s, then, Dungeons & Dragons on the computer would be all about the Gold Box line of turn-based traditional CRPGs, the only one of SSI’s three pillars still standing.

Thankfully, what Pool of Radiance had demonstrated in 1988 the events of 1989 would only confirm. What players seemed to hunger for most of all in the context of Dungeons & Dragons on the computer was literally Dungeons & Dragons on the computer: big CRPGs that implemented as many of the gnarly details of the rules as possible. Even Hillsfar, a superfluous and rather pointless sort of training ground for characters created in Pool of Radiance, sold 78,418 copies when SSI released it in March as a stopgap to give the hardcore something to do while they waited for the real Pool sequel.

Every female warrior knows that cleavage is more important than protection, right?

They didn’t have too long to wait. The big sequel dropped in June in the form of Curse of the Azure Bonds, and it mostly maintained the high design standard set by Pool of Radiance. Contrarians could and did complain that the free-roaming wilderness map of its predecessor had been replaced by a simple menu of locations to visit, but for this player anyway Pool’s overland map always felt more confusing than necessary. A more notable loss in my view is the lack of any equivalent in Curse to the satisfying experience of slowly reclaiming the village of Phlan block by block from the forces of evil in Pool, but that brilliant design stroke was perhaps always doomed to be a one-off. Ditto Pool’s unique system of quests to fulfill, some of them having little or nothing to do with the main plot.

What players did get in Curse of the Azure Bonds was the chance to explore a much wider area around Phlan with the same characters they had used last time, fighting a selection of more powerful and interesting monsters appropriate to their party’s burgeoning skills. At the beginning of the game, the party wakes up with a set of tattoos on their bodies —  the “azure bonds” of the title — and no memory of how they got there. (I would venture to guess that many of us have experienced something similar at one time or another…) It turns out that the bonds can be used to force the characters to act against their own will. Thus the quest is on to get them removed; each of the bonds has a different source, corresponding to a different area you will need to visit and hack and slash your way through in order to have it removed. By the end of Curse, your old Pool characters — or the new ones you created just for this game, who start at level 5 — will likely be in the neighborhood of levels 10 to 12, just about the point in Dungeons & Dragons where leveling up begins to lose much of its interest.

TSR was once again heavily involved in the making of Curse of the Azure Bonds, if not quite to the same extent as Pool of Radiance. As they had for Pool, they provided for Curse an official tie-in novel and tabletop adventure module. I can’t claim to have understood all of the nuances of the plot, such as they are, when I played the game; a paragraph book is once again used, but much of what I was told to read consisted of people that I couldn’t remember or never knew who they were babbling on about stuff I couldn’t remember or never knew what it was. But then, I know nothing about the Forgotten Realms setting other than what I learned in Pool of Radiance and never read the novel, so I’m obviously not the ideal audience. (Believe me, readers, I’ve done some painful things for this blog, but reading a Dungeons & Dragons novel was just a bridge too far…) Still, my cluelessness never interfered with my pleasure in mapping out each area and bashing things with my steadily improving characters; the standard of design in Curse remains as high as the writing remains breathlessly, entertainingly overwrought. Curse of the Azure Bonds did almost as well as its predecessor for SSI, selling 179,795 copies and mostly garnering the good reviews it deserved.

It was only with the third game of the Pool of Radiance series, 1990’s Secret of the Silver Blades, that some of the luster began to rub off of the Gold Box in terms of design, if not quite yet in that ultimate metric of sales. The reasons that Secret is regarded as such a disappointment by so many players — it remains to this day perhaps the least liked of the entire Gold Box line — are worth dwelling on for a moment.

One of the third game’s problems is bound up inextricably with the Dungeons & Dragons rules themselves. Secret of the Silver Blades allows you to take your old party from Pool of Radiance and/or Curse of the Azure Bonds up to level 15, but by this stage gaining a level is vastly less interesting than it was back in the day. Mostly you just get a couple of hit points, some behind-the-scenes improvements in to-hit scores, and perhaps another spell slot or two somewhere. Suffice to say that there’s no equivalent to, say, that glorious moment when you first gain access to the Fireball spell in Pool of Radiance.

The tabletop rules suggest that characters who reach such high levels should cease to concern themselves with dungeon delving in lieu of building castles and becoming generals or political leaders. Scorpia, Computer Gaming World’s adventure and CRPG columnist, was already echoing these sentiments in the context of the Pool of Radiance series at the conclusion of her article on Curse of the Azure Bonds: “Characters have reached (by game’s end) fairly high levels, where huge amounts of experience are necessary to advance. If character transfer is to remain a part of the series (which I certainly hope it does), then emphasis needs to be placed on role-playing, rather than a lot of fighting. The true heart of AD&D is not rolling the dice, but the relationship between the characters and their world.” But this sort of thing, of course, the Gold Box engine was utterly unequipped to handle. In light of this, SSI probably should have left well enough alone, making Curse the end of the line for the Pool characters, but players were strongly attached to the parties they’d built up and SSI for obvious reasons wanted to keep them happy. In fact, they would keep them happy to the tune of releasing not just one but two more games which allowed players to use their original Pool of Radiance parties. By the time these characters finally did reach the end of the line, SSI would have to set them against the gods themselves in order to provide any semblance of challenge.

But by no means can all of the problems with Secret of the Silver Blades be blamed on high-level characters. The game’s other issues provide an interesting example of the unanticipated effects which technical affordances can have on game design, as well as a snapshot of changing cultures within both SSI and TSR.

A Gold Box map is built on a grid of exactly 16 by 16 squares, some of which can be “special” squares. When the player’s party enters one of the latter, a script runs to make something unusual happen — from something as simple as some flavor text appearing on the screen to something as complicated as an encounter with a major non-player character. The amount of special content allowed on any given map is restricted, however, by a limitation, stemming from the tiny memories of 8-bit machines like the Commodore 64 and Apple II, on the total size of all of the scripts associated with any given map.
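In rough structural terms, the constraint looks like the sketch below. The 16-by-16 grid is documented above; the budget figure and the data layout are my own assumptions for illustration, not SSI’s actual file format. The key point is that every special square’s script draws from one shared, memory-limited pool per map.

```python
# Rough model of the Gold Box map constraint described above. The 16x16
# grid is real; the budget figure and layout are my assumptions, not
# SSI's actual format.
MAP_SIZE = 16
SCRIPT_BUDGET = 2048  # hypothetical per-map limit on total script bytes

class GoldBoxMap:
    def __init__(self):
        # Terrain for each of the 256 squares (walls, doors, floor, ...).
        self.terrain = [[0] * MAP_SIZE for _ in range(MAP_SIZE)]
        self.scripts = {}     # (x, y) -> script bytes for "special" squares
        self.bytes_used = 0

    def add_special(self, x, y, script: bytes):
        """Attach a script to a square, charging it against the shared budget."""
        if not (0 <= x < MAP_SIZE and 0 <= y < MAP_SIZE):
            raise ValueError("square is off the 16 by 16 grid")
        if self.bytes_used + len(script) > SCRIPT_BUDGET:
            raise MemoryError("map's script budget exhausted")
        self.scripts[(x, y)] = script
        self.bytes_used += len(script)

m = GoldBoxMap()
m.add_special(3, 7, b"SHOW_TEXT 'A cold wind blows through the ruined hall.'")
```

Under a scheme like this, every byte spent on one special encounter is a byte unavailable to every other square on the same map, which is exactly the trade-off that comes back to bite Secret of the Silver Blades below.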

One of the neat 16 by 16 maps found in Pool of Radiance and Curse of the Azure Bonds.

The need for each map to be no larger than 16 by 16 squares couldn’t help but have a major effect on the designs that were implemented with the Gold Box engine. In Pool of Radiance, for example, the division of the city of Phlan into a set of neat sections, to be cleared out and reclaimed one by one, had its origins as much in these technical restrictions as it did in design methodology. In that case it had worked out fantastically well, but by the time development began on Secret of the Silver Blades all those predictably uniform square maps had begun to grate on Dave Shelley, that game’s lead designer. Shelley and his programmers thus came up with a clever way to escape the system of 16 by 16 dungeons.

One of the things a script could do was to silently teleport the player’s party to another square on the map. Shelley and company realized that by making clever use of this capability they could create dungeon levels that gave the illusion of sprawling out wildly and asymmetrically, like real underground caverns would. Players who came into Secret of the Silver Blades expecting the same old 16 by 16 grids would be surprised and challenged. They would have to assume that the Gold Box engine had gotten a major upgrade. From the point of view of SSI, this was the best kind of technology refresh: one that cost them nothing at all. Shelley sketched out a couple of enormous underground complexes for the player to explore, each larger almost by an order of magnitude than anything that had been seen in a Gold Box game before.
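The trick can be modeled like so. This is my own simplified reconstruction of the idea, not SSI’s implementation: silent teleport scripts stitch the edges of several ordinary 16-by-16 maps together, so the player appears to walk across one huge, irregular cavern.

```python
# Simplified model of the silent-teleport trick -- my reconstruction of
# the idea, not SSI's code. Each entry in `seams` is itself a
# special-square script, which is why the trick devoured the per-map
# scripting budget.
seams = {
    # (map_id, x, y) -> (map_id, x, y): stepping here silently relocates
    # the party, stitching separate 16x16 maps into one "huge" cavern.
    ("cavern_A", 15, 8): ("cavern_B", 0, 8),
    ("cavern_B", 15, 3): ("cavern_C", 0, 3),
}

def step(position, dx, dy):
    """Move one square, following any silent teleport at the destination.
    The player sees nothing; an auto-map tracking real coordinates breaks."""
    map_id, x, y = position
    x, y = x + dx, y + dy
    return seams.get((map_id, x, y), (map_id, x, y))

party = ("cavern_A", 14, 8)
party = step(party, 1, 0)  # walk east onto the seam square...
print(party)               # ...and arrive, invisibly, in cavern_B
```

Note how the cost structure falls out of the model: every seam consumes script space that would otherwise go to encounters and flavor text, which is precisely the consequence described below.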

A far less neat map from Secret of the Silver Blades. It may be more realistic in its way, but which would you rather try to draw on graph paper? It may help you to understand the scale of this map to know that the large empty squares at the bottom and right side of this map each represent a conventional 16 by 16 area like the one shown above.

But as soon as the team began to implement the scheme, the unintended consequences began to ripple outward. Because the huge maps were now represented internally as a labyrinth of teleports, the hugely useful auto-map had to be disabled for these sections. And never had the auto-map been needed more, for the player who dutifully mapped the dungeons on graph paper could no longer count on them being a certain size; they were constantly spilling off the page, forcing her to either start over or go to work on a fresh page stuck onto the old with a piece of tape. Worst of all, placing all of those teleports everywhere used just about all of the scripting space that would normally be devoted to providing other sorts of special squares. So, what players ended up with was an enormous but mind-numbingly boring set of homogeneous caverns filled with the same handful of dull random-monster encounters, coming up over and over and over. This was not, needless to say, an improvement on what had come before. In fact, it was downright excruciating.

At the same time that this clever technical trick was pushing the game toward a terminal dullness, other factors were trending in the same direction. Shelley himself has noted that certain voices within SSI were questioning whether all of those little extras found in Pool of Radiance and Curse of the Azure Bonds, like the paragraph books and the many scripted special encounters, were really necessary at all — or, at the least, perhaps it wasn’t necessary to do them with quite so much loving care. SSI was onto a good thing with these Gold Box games, said these voices — found mainly in the marketing department — and they ought to strike while the iron was hot, cranking them out as quickly as possible. While neither side would entirely have their way on the issue, the pressure to just make the games good enough rather than great in order to get them out there faster can be sensed in every Gold Box game after the first two. More and more graphics were recycled; fewer and fewer of those extra, special touches showed up. SSI never fully matched Pool of Radiance, much less improved on it, over the course of the ten Gold Box games that followed it. That SSI’s founder and president Joel Billings, as hardcore a gamer as any gaming executive ever, allowed this stagnation to take root is unfortunate, but isn’t difficult to explain. His passion was for the war games he’d originally founded SSI to make; all this Dungeons & Dragons stuff, while a cash cow to die for, was largely just product to him.

A similar complaint could be levied — and has been levied, loudly and repeatedly, by legions of hardcore Dungeons & Dragons fans over the course of decades — against Lorraine Williams, the wealthy heiress who had staged a coup against Gary Gygax in 1985 to take over TSR. The idea that TSR’s long, slow decline and eventual downfall is due solely to Williams is more than a little dubious, given that Gygax and his cronies had already done so much to mismanage the company down that path before she ever showed up. Still, her list of wise strategic choices, at least after her very wise early decision to finally put Dungeons & Dragons on computers, is not a long one.

At the time they were signing the contract with SSI, TSR had just embarked on the most daunting project in the history of the company: a project to reorganize the Advanced Dungeons & Dragons rules, which had sprawled into eight confusing and sometimes contradictory hardcover books by that point, into a trio of books of relatively streamlined and logically organized information, all of it completely rewritten in straightforward modern English (as opposed to the musty diction of Gary Gygax, which read a bit like a cross of Samuel Johnson with H.P. Lovecraft). The fruits of the project appeared in 1989 in the form of a second-edition Player’s Handbook, Dungeon Master’s Guide, and Monstrous Compendium.

And then, right after expending so much effort to clean things up, TSR proceeded to muddy the second-edition waters even more indiscriminately than they had those of the first edition. Every single character class got its own book, and players with a hankering to play Dungeons & Dragons as a Viking or one of Charlemagne’s paladins were catered to. Indeed, TSR went crazy with campaign settings. By 1993, boxed sets were available to let you play in the Forgotten Realms, in the World of Greyhawk, or in Dragonlance‘s world of Krynn, or to play the game as a Jules Verne-esque science-fiction/fantasy hybrid called Spelljammer. You could also play Dungeons & Dragons as Gothic horror if you bought the Ravenloft set, as vaguely post-apocalyptic dark fantasy if you bought Dark Sun, as a set of tales from the Arabian Nights if you bought Al-Qadim, or as an exercise in surreal Expressionism worthy of Alfred Kubin if you bought Planescape.

Whatever the artistic merits behind all these disparate approaches — and some of them did, it should be said, have much to recommend them over the generic cookie-cutter fantasy that was vanilla Dungeons & Dragons — the commercial pressures that led Lorraine Williams to approve this glut of product aren’t hard to discern. The base of tabletop Dungeons & Dragons players hadn’t grown appreciably for many years. Just the opposite, in fact: it’s doubtful whether even half as many people were actively playing Dungeons & Dragons in 1990 as at the height of the brief-lived fad for the game circa 1982. After the existing player base had dutifully rushed out to buy the new second-edition core books, in other words, very few new players were discovering the game and thus continuing to drive their sales. Unless and until they could find a way to change that situation, the only way for TSR to survive was to keep generating gobs of new product to sell to their existing players. Luckily for them, hardcore Dungeons & Dragons players were tremendously loyal and tremendously dedicated to their hobby. Many would buy virtually everything TSR put out, even things that were highly unlikely ever to make it to their gaming tables, just out of curiosity and to keep up with the state of the art, as it were. It would take two or three years for players to start to evince some fatigue with the sheer volume of product pouring out of TSR’s Lake Geneva offices, much of it sorely lacking in play-testing and basic quality control, and to start giving large swathes of it a miss — and that, in turn, would spell major danger for TSR’s bottom line.

Lorraine Williams wasn’t unaware of the trap TSR’s static customer base represented; on the contrary, she recognized as plainly as anyone that TSR needed to expand into new markets if it was to have a bright long-term future. She made various efforts in that direction even as her company sustained itself by flooding the hardcore Dungeons & Dragons market. In fact, the SSI computer games might be described as one of these efforts — but even those, successful as they were on their own terms, were still playing at least partially to that same old captive market. In 1989, Williams opened a new TSR office on the West Coast in an attempt to break the company out of its nerdy ghetto. Run by Flint Dille, Williams’s brother, one of TSR West’s primary goals was to get Dungeons & Dragons onto television screens or, better yet, onto movie screens. Williams was ironically pursuing the same chimera that her predecessor Gary Gygax — now her sworn, lifetime arch-enemy — had so zealously chased. She was even less successful at it than he had been. Whereas Gygax had managed to get a Saturday morning cartoon on the air for a few seasons, Flint Dille’s operation managed bupkis in three long years of trying.

Another possible ticket to the mainstream, to be pursued every bit as seriously in Hollywood as a Dungeons & Dragons deal, was Buck Rogers, the source of the shared fortune of Lorraine Williams and Flint Dille. Their grandfather had been John F. Dille, owner of a newspaper syndicator known as the National Newspaper Service. In this capacity, the elder Dille had discovered the character that would become Buck Rogers — at the time, he was known as Anthony Rogers — in Armageddon 2419 A.D., a pulp novella written by Philip Francis Nowlan and published in Amazing Stories in 1928. Dille himself had come up with the nickname of “Buck” for the lead character, and convinced Nowlan to turn his adventures in outer space into a comic strip for his syndicator. It ended up running from 1929 until 1967 — only the first ten of those years under the stewardship of Nowlan — and was also turned into very popular radio and movie serials during the 1930s, the height of the character’s popularity. Having managed to secure all of the rights to Buck from a perhaps rather naive Nowlan, John Dille and his family profited hugely.

In marked contrast to her attitude toward TSR’s other intellectual properties, Lorraine Williams’s determination to return Buck Rogers to the forefront of pop culture was apparently born as much from a genuine passion for her family’s greatest legacy as it was from the dispassionate calculus of business. In addition to asking TSR West to lobby — once again fruitlessly, as it would transpire — for a Buck Rogers revival on television or film, she pushed a new RPG through the pipeline, entitled Buck Rogers XXVc and published in 1990. TSR supported the game fairly lavishly for several years in an attempt to get it to take off, releasing source books, adventure modules, and tie-in novels to little avail. With all due deference to Buck Rogers’s role as a formative influence on Star Wars among other beloved contemporary properties, in the minds of the Dungeons & Dragons generation it was pure cheese, associated mainly with the Dille family’s last attempt to revive the character, the hilariously campy 1979 television series Buck Rogers in the 25th Century. The game might have had a chance with some players had Williams been willing to recognize the cheese factor and let her designers play it up, but taken with a straight face? No way.

SSI as well was convinced — or coerced — to adapt the Gold Box engine from fantasy to science fiction for a pair of Buck Rogers computer games, 1990’s Countdown to Doomsday and 1992’s Matrix Cubed. SSI’s designers must have breathed a sigh of relief when they saw that the rules for the Buck Rogers tabletop RPG, much more so than any of TSR’s previous non-Dungeons & Dragons RPGs, had been based heavily on those of the company’s flagship game; thus the process of adaptation wasn’t quite so onerous as it might otherwise have been. That said, most agree that the end results are markedly less interesting than the other Gold Box games when it comes to combat, the very thing at which the engine normally excels; a combat system designed to include magic becomes far less compelling in its absence. Benefiting doubtless from its association with the Dungeons & Dragons Gold Box line, for which enthusiasm remained fairly high, the first Buck Rogers game sold a relatively healthy 51,528 copies; the second managed a somewhat less healthy 38,086 copies.

All of these competing interests do much to explain why TSR, after involving themselves so closely in the development of Pools of Radiance and Curse of the Azure Bonds, withdrew from the process almost entirely after those games and just left SSI to it. And that fact in turn is yet one more important reason why the Gold Box games not only failed to evolve but actually devolved in many ways. TSR’s design staff might not have had a great understanding of computer technology, but they did understand their settings and rules, and had pushed SSI to try to inject at least a little bit of what made for a great tabletop-role-playing experience into the computer games. Absent that pressure, SSI was free to fall back on what they did best — which meant, true to their war-game roots, lots and lots of combat. In both Pool and Curse, random encounters cease on most maps after you’ve had a certain number of them — ideally, just before they get boring. Tellingly, in Secret of the Silver Blades and most of the other later Gold Box games that scheme is absent. The monsters just keep on coming, ad infinitum.

Despite lukewarm reviews that were now starting to voice some real irritation with the Gold Box line’s failure to advance, Secret of the Silver Blades was another huge hit, selling 167,214 copies. But, in an indication that some of those who purchased it were perhaps disappointed enough by the experience not to continue buying Gold Box games, it would be the last of the line to break the 100,000-copy barrier. The final game in the Pool of Radiance series, Pools of Darkness, sold just 52,793 copies upon its release in 1991.

In addition to the four-game Pool series, SSI also released an alternate trilogy of Dungeons & Dragons Gold Box games set in Krynn, the world of the Dragonlance setting. Champions of Krynn was actually released before Secret of the Silver Blades, in January of 1990, and sold 116,693 copies; Death Knights of Krynn was released in 1991 and sold 61,958 copies; and The Dark Queen of Krynn, the very last Gold Box game, was released in 1992 and sold 40,640 copies. Another modest series of two games was developed out-of-house by Beyond Software (later to be renamed Stormfront Studios): Gateway to the Savage Frontier (1991, 62,581 copies sold) and Treasures of the Savage Frontier (1992, 31,995 copies sold). In all, then, counting the two Buck Rogers games but not counting the oddball Hillsfar, SSI released eleven Gold Box games over a period of four years.

While Secret of the Silver Blades still stands as arguably the line’s absolute nadir in design terms, the sheer pace at which SSI pumped out Gold Box games during the latter two years of this period in particular couldn’t help but give all of them a certain generic, interchangeable quality. It all began to feel a bit rote — a bit cheap, in stark contrast to the rarefied atmosphere of a Big Event that had surrounded Pool of Radiance, a game which had been designed and marketed to be a landmark premium product and had in turn been widely perceived as exactly that. Not helping the line’s image was the ludicrous knockoff-Boris Vallejo cover art sported by so many of the boxes, complete with lots of tawny female skin and heaving bosoms. Susan Manley has described the odd and somewhat uncomfortable experience of being a female artist asked to draw this sort of stuff.

They pretty much wanted everybody [female] to be the chainmail-bikini babes, as we called them. I said, “Look, not everybody wants to be a chainmail-bikini babe.” They said, “All the guys want that, and we don’t have very many female players.” I said, “You’re never going to have female players if you continue like this. Functional armor that would actually protect people would play a little bit better.”

Tom [Wahl, SSI’s lead artist] and I actually argued over whether my chest size was average or not, which was an embarrassing conversation to have. He absolutely thought that everybody needed to look like they were stepping out of a Victoria’s Secret catalog if they were female. I said, “Gee, how come all the guys don’t have to be super-attractive?” They don’t look like they’re off of romance-novel covers, let’s put it that way. They get to be rugged, they get to be individual, they get to all have different costumes. They get to all have different hairstyles, but the women all had to have long, flowing locks and lots of cleavage.

By 1991, the Gold Box engine was beginning to seem rather like a relic from technology’s distant past. In a sense, the impression was literally correct. When SSI had begun to build the Gold Box engine back in 1987, the Commodore 64 had still ruled the roost of computer gaming, prompting SSI to make the fateful decision not only to make sure the Gold Box games could run on that sharply limited platform, but also to build most of their development tools on it. Pool of Radiance then appeared about five minutes before the Commodore 64’s popularity imploded in the face of Nintendo. The Gold Box engine did of course run on other platforms, but it remained throughout its life subject to limitations born of its 8-bit origins — things like the aforementioned maps of exactly 16 by 16 squares and the strict bounds on the amount of custom scripting that could be included on a single one of those maps. Even as the rest of the industry left the 8-bit machines behind in 1989 and 1990, SSI was reluctant to do so because the Commodore 64 still made up a major chunk of Gold Box sales: Curse of the Azure Bonds sold 68,622 copies on the Commodore 64, representing more than a third of its total sales, while Secret of the Silver Blades still managed a relatively healthy 40,425 Commodore 64 versions sold. Such numbers likely came courtesy of diehard Commodore 64 owners who had very few other games to buy in an industry that was moving more and more to MS-DOS as its standard platform. SSI was thus trapped for some time in something of a Catch-22, wanting to continue to reap the rewards of being just about the last major American publisher to support the Commodore 64 but having to compromise the experience of users with more powerful machines in order to do so.

SSI had managed to improve the Gold Box graphics considerably by the time of The Dark Queen of Krynn, the last game in the line.

When SSI finally decided to abandon the Commodore 64 in 1991, they did what they could to enhance the Gold Box engine to take advantage of the capabilities of the newer machines, introducing more decorative displays and pictures drawn in 256-color VGA along with some mouse support. Yet the most fundamental limitations changed not at all; the engine was now aged enough that SSI wasn’t enthused about investing in a more comprehensive overhaul. And thus the Gold Box games seemed more anachronistic than ever. As SSI’s competitors worked on a new generation of CRPGs that took advantage of 32-bit processors and multi-megabyte memories, the Gold Box games remained the last surviving relics of the old days of 8 bits and 64 K. Looking at The Dark Queen of Krynn and the technical tour de force that was Origin’s Ultima VII side by side, it’s difficult to believe that the two games were released in the same year, much less that they were, theoretically at least, direct competitors.

It’s of course easy for us to look back today and say what SSI should have done. Instead of flooding the market with so many generic Gold Box games, they should have released just one game every year or eighteen months, each release reflecting a much more serious investment in writing and design as well as real, immediately noticeable technical improvements. They should, in other words, have strained to make every new Gold Box game an event like Pool of Radiance had been in its day. But this had never been SSI’s business model; they had always released lots of games, very few of which sold terribly well by the standard of the industry at large, but whose sales in the aggregate were enough to sustain them. When, beginning with Pool of Radiance, they suddenly were making hits by anybody’s standards, they had trouble adjusting their thinking to their post-Pool situation, had trouble recognizing that they could sell more units and make more money by making fewer but better games. Such is human nature; making such a paradigm shift would doubtless challenge any of us.

Luckily, just as the Gold Box sales began to tail off SSI found an alternative approach to Dungeons & Dragons on the computer from an unlikely source. Westwood Associates was a small Las Vegas-based development company, active since 1985, who had initially made their name doing ports of 8-bit titles to more advanced machines like the Commodore Amiga and Atari ST (among these projects had been ports of Epyx’s Winter Games, World Games, and California Games). What made Westwood unique and highly sought after among porters was their talent for improving their 8-bit source material enough, in terms of both audiovisuals and game play, that the end results would be accepted almost as native sons by the notoriously snobbish owners of machines like the Amiga. Their ambition was such that many publishers came to see the biggest liability of employing them as a tendency to go too far, to such an extent that their ports could verge on becoming new games entirely; for example, their conversion of Epyx’s Temple of Apshai on the Macintosh from turn-based to real-time play was rejected as being far too much of a departure.

Westwood first came to the attention of Gold Box fans when they were given the job of implementing Hillsfar, the stopgap “character training grounds” which SSI released between Pool of Radiance and Curse of the Azure Bonds. Far more auspicious were Westwood’s stellar ports of the mainline Gold Box games to the Amiga, which added mouse support and improved the graphics well before SSI’s own MS-DOS versions made the leap to VGA. But Brett Sperry and Louis Castle, Westwood’s founders, had always seen ports merely as a way of getting their foot in the door of the industry. Already by the time they began working with SSI, they were starting to do completely original games of their own for Electronic Arts and Mediagenic/Activision. (Their two games for the latter, both based on a board-game line called BattleTech, were released under the Infocom imprint, although the “real” Cambridge-based Infocom had nothing to do with them.) Westwood soon convinced SSI as well to let them make an original title alongside the implementation assignments: what must be the strangest of all the SSI Dungeons & Dragons computer games, a dragon flight simulator (!) called Dragon Strike. Released in 1990, it wasn’t quite an abject flop but neither was it a hit, selling 34,296 copies. With their next original game for SSI, however, Westwood would hit pay dirt.

Eye of the Beholder was conceived as Dungeons & Dragons meets Dungeon Master, bringing the real-time first-person game play of FTL’s seminal 1987 dungeon crawl to SSI’s product line. In a measure of just how ahead-of-its-time Dungeon Master had been in terms not only of technology but also of fundamental design, nothing had yet really managed to equal it over the three years since its release. Eye of the Beholder arguably didn’t fully manage that feat either, but it did at the very least come closer than most other efforts — and of course it had the huge advantage of the Dungeons & Dragons license. When a somewhat skeptical SSI sent an initial shipment of 20,000 copies into the distribution pipeline in February of 1991, “they all disappeared” in the words of Joel Billings: “We put them out and boom!, they were gone.” Eye of the Beholder went on to sell 129,234 copies, nicely removing some of the sting from the slow commercial decline of the Gold Box line and, indeed, finally giving SSI a major Dungeons & Dragons hit that wasn’t a Gold Box game. The inevitable sequel, released already in December of 1991, sold a more modest but still substantial 73,109 copies, and a third Eye of the Beholder, developed in-house this time at SSI, sold 50,664 copies in 1993. The end of the line for this branch of the computerized Dungeons & Dragons family came with the pointless Dungeon Hack, a game that, as its name implies, presented its player with an infinite number of generic randomly generated dungeons to hack her way through; it sold 27,110 copies following its release at the end of 1993.

This chart from the April 1991 Software Publishers Association newsletter shows just how quickly Eye of the Beholder took off. Unfortunately, this would mark the last time an SSI Dungeons & Dragons game would be in this position.

Despite their popularity in their heyday, the Eye of the Beholder games in my view have aged less gracefully than their great progenitor Dungeon Master, or for that matter even the early Gold Box games. If what you wished for more than anything when playing Dungeon Master was lots more — okay, any — story and lore to go along with the mapping, the combat, and the puzzles, these may be just the games for you. For the rest of us, though, the Dungeons & Dragons rules make for an awkward fit to real-time play, especially in contrast to Dungeon Master’s designed-from-scratch-for-real-time systems of combat, magic, and character development. The dungeon designs and even the graphics similarly underwhelm; Eye of the Beholder looks a bit garish today in contrast to the clean minimalism of Dungeon Master. The world would have to wait more than another year, until the release of Ultima Underworld, to see a game that truly and comprehensively improved on the model of Dungeon Master. In the meantime, though, the Eye of the Beholder games would do as runners-up for folks who had played Dungeon Master and its sequel and still wanted more, or for those heavily invested in the Dungeons & Dragons rules and/or the Forgotten Realms setting.

For SSI, the sales of the Eye of the Beholder games in comparison to those of the latest Gold Box titles provided all too clear a picture of where the industry was trending. Players were growing tired of the Gold Box games; they hungered after faster-paced CRPGs that were prettier to look at and easier to control. While Eye of the Beholder was still high on the charts, TSR and SSI agreed to extend their original five-year contract, which was due to expire on January 1, 1993, by eighteen months to mid-1994. The short length of the extension may be indicative of growing doubts on the part of TSR about SSI’s ability to keep up with the competition in the CRPG market; one might see it as a way of putting them on notice that the TSR/SSI partnership was by no means set in stone for all time. At any rate, a key provision of the extension was that SSI must move beyond the fading Gold Box engine, must develop new technology to suit the changing times and to try to recapture those halcyon early days when Pool of Radiance ruled the charts and the world of gaming was abuzz with talk of Dungeons & Dragons on the computer. Accordingly, SSI put a bow on the Gold Box era in March of 1993 with the release of Unlimited Adventures, a re-packaging of their in-house development tools that would let diehard Gold Box fans make their own games to replace the ones SSI would no longer be releasing. It sold just 32,362 copies, but would go on to spawn a loyal community of adventure-makers that to some extent still persists to this day. As for what would come next for computerized Dungeons & Dragons… well, that’s a story for another day.

By way of wrapping up today’s story, I should note that my take on the Gold Box games, while I believe it dovetails relatively well with the consensus of the marketplace at the time, is by no means the only one in existence. A small but committed group of fans still loves these games — yes, all of them — for their approach to tactical combat, which must surely mark the most faithful implementation of the tabletop game’s rules for same ever to make it to the computer. “It’s hard to imagine a truly bad game being made with it,” says blogger Chester Bolingbroke — better known as the CRPG Addict — of the Gold Box engine. (Personally, I’d happily nominate Secret of the Silver Blades for that designation.)

Still, even the Gold Box line’s biggest fans will generally acknowledge that the catalog is very front-loaded in terms of innovation and design ambition. For those of you like me who aren’t CRPG addicts, I highly recommend Pool of Radiance and Curse of the Azure Bonds, which together let you advance the same party of characters just about as far as remains fun under the Dungeons & Dragons rules, showing off the engine at its best in the process. If the Gold Box games that came afterward wind up a bit of an anticlimactic muddle, we can at least still treasure those two genuine classics. And if you really do want more Gold Box after playing those two, Lord knows there’s plenty of it out there, enough to last most sane people a lifetime. Just don’t expect any of it to quite rise to the heights of the first games and you’ll be fine.

(Sources: This article is largely drawn from the collection of documents that Joel Billings donated to the Strong Museum of Play, which includes lots of internal SSI documents and some press clippings. Also, the book Designers & Dragons Volume 1 by Shannon Appelcline; Computer Gaming World of September 1989; Retro Gamer 52 and 89; Matt Barton’s video interviews with Joel Billings, Susan Manley, and Dave Shelley and Laura Bowen.

Many of the Gold Box games and the Eye of the Beholder trilogy are available for purchase from GOG.com. You may also wish to investigate The Gold Box Companion, which adds many modern conveniences to the original games.)

 
 


What’s the Matter with Covert Action?

Covert Action’s cover is representative of the thankfully brief era when game publishers thought featuring real models on their boxes would drive sales. The results almost always ended up looking like bad romance-novel covers; this is actually one of the least embarrassing examples. (For some truly cringeworthy examples of artfully tousled machismo, see the Pirates! reissue or Space Rogue.)

In the lore of gaming there’s a subset of spectacular failures that have become more famous than the vast majority of successful games. From E.T.: The Extra-Terrestrial to Daikatana to Godus, this little rogues’ gallery inhabits its own curious corner of gaming history. The stories behind these games, carrying with them the strong scent of excess and scandal, can’t help but draw us in.

But there are also other, less scandalous cases of notable failure to which some of us continually return for reasons other than schadenfreude. One such case is that of Covert Action, Sid Meier and Bruce Shelley’s 1990 game of espionage. Covert Action, while not a great or even a terribly good game, wasn’t an awful game either. And, while it wasn’t a big hit, nor was it a major commercial disaster. By all rights it should have passed into history unremarked, like thousands of similarly middling titles before and after it. The fact that it has remained a staple of discussion among game designers for some twenty years now in the context of how not to make a game is due largely to Sid Meier himself, a very un-middling designer who has never quite been able to get Covert Action, one of his few disappointing games, out of his system. Indeed, he dwells on it to such an extent that the game and its real or perceived problems still tend to rear their heads every time he delivers a lecture on the art of game design. The question of just what’s the matter with Covert Action — the question of why it’s not more fun — continues to be asked and answered over and over, in the form of Meier’s own design lectures, extrapolations on Meier’s thesis by others, and even the occasional contrarian apology telling us that, no, actually, nothing’s wrong with Covert Action.

What with piling onto the topic having become such a tradition in design circles, I couldn’t bear to let Covert Action’s historical moment go by without adding the weight of this article to the pile. But first, the basics for those of you who wouldn’t know Covert Action if it walked up and invited you to dinner.

As I began to detail in my previous article, Covert Action’s development at MicroProse, the company at which Sid Meier and Bruce Shelley worked during the period in question, was long by the standards of its time, troubled by the standards of any time, and more than a little confusing to track in our own time. Begun in early 1988 as a Commodore 64 game by Lawrence Schick, another MicroProse designer, it was conceived from the beginning as essentially an espionage version of Sid Meier’s earlier hit Pirates! — as a set of mini-games the player engaged in to affect the course of an overarching strategic game. But Schick found that he just couldn’t get the game to work, and moved on to something else. And that would have been that — except that Sid Meier had become intrigued by the idea, and picked it up for his own next project, moving it in the process from the Commodore 64 to MS-DOS, where it would have a lot more breathing room.

In time, though, the enthusiasm of Meier and his assistant designer Bruce Shelley also began to evaporate; they started spending more and more time dwelling on an alternative design. By August of 1989, they were steaming ahead with Railroad Tycoon, and all work on Covert Action for the nonce had ceased.

After Railroad Tycoon was completed and released in April of 1990, Meier and Shelley returned to Covert Action only under some duress from MicroProse’s head Bill Stealey. With the idea that would become Civilization already taking shape in Meier’s head, his enthusiasm for Covert Action was lower than ever, but needs must. As Shelley tells the story, Meier’s priorities were clear in light of the idea he had waiting in the wings. “We’re just getting this game done,” Meier said of Covert Action when Shelley tried to suggest ways of improving the still somehow unsatisfying design. “I’ve got to get this game finished.” It’s hard to avoid the impression that in the end Meier simply gave up on Covert Action. Yet, given the frequency with which he references it to this day, it seems equally clear that that capitulation has never sat well with him.

Covert Action casts you as the master spy Max Remington — or, in a nice nod to gender equality that was still unusual in a game of this era, as Maxine Remington. Max is the guy the CIA calls when they need someone to crack the really tough cases. The game presents you with a series of said tough cases, each involving a plot by some combination of criminal and/or terrorist groups to do something very bad somewhere in the world. Your objective is to figure out what group or groups are involved, figure out precisely what they’re up to, and foil their plot before they bring it to fruition. As usual for a Sid Meier game, you can play on any of four difficulty levels to ensure that everyone, from the rank beginner to the most experienced super-sleuth, can be challenged without being overwhelmed. If you do your job well, you will arrest the person at the top of the plot’s org chart, one of the game’s 26 evil masterminds. Once no more masterminds are left to arrest, Max can walk off into the sunset and enjoy a pleasant retirement, confident that he has made the world a safer place. (If only counter-terrorism were that easy in real life, right?)

The game lets Max/Maxine score with progressively hotter members of the opposite sex as he/she cracks more cases.

The strategic decisions you make in directing the course of your investigation will lead to naught if you don’t succeed at the various mini-games. These include rewiring a junction box to tap a suspect’s phone (Covert Action presents us with a weirdly low-tech version of espionage, even for its own day); cracking letter-substitution codes to decipher a suspect’s message traffic; tailing or chasing a suspect’s car; and, in the most elaborate of the mini-games, breaking into a group’s hideaway to either collect intelligence or make an arrest.

Covert Action seems to have all the makings of a good game — perhaps even another classic like its inspiration, Pirates!. But, as Sid Meier and most of the people who have played it agree, it doesn’t ever quite come together to become a holistically satisfying experience.

It’s not immediately obvious just why that should be the case; thus all of the discussion the game has prompted over the years. Meier does have his theory, to which he’s returned enough that he’s come to codify it into a universal design dictum he calls “The Covert Action rule.” For my part… well, I have a very different theory. So, first I’ll tell you about Meier’s theory, and then I’ll tell you about my own.

Meier’s theory hinges on the nature of the mini-games. He doesn’t believe that any of them are outright bad by any means, but does feel that they don’t blend well with the overarching strategic game, resulting in a lumpy stew of an experience that the player has trouble digesting. He’s particularly critical of the breaking-and-entering mini-game — a “mini-game” complicated enough that one could easily imagine it being released as a standalone game for the previous generation of computers (or, for that matter, for Covert Action‘s contemporaneous generation of consoles). Before you begin the breaking-and-entering game, you must choose what Max will carry with him: depending on your goals for this mission, you can give him some combination of a pistol, a sub-machine gun, a camera, several types of grenades, bugs, a Kevlar vest, a gas mask, a safe-cracking kit, and a motion detector. The underground hideaways and safe houses you then proceed to explore are often quite large, and full of guards, traps, and alarms to avoid or foil as you snoop for evidence or try to spirit away a suspect. You can charge in with guns blazing if you like, but, especially at the higher difficulty levels, that’s not generally a recipe for success. This is rather a game of stealth, of lurking in the shadows as you identify the guards’ patrol patterns, the better to avoid or quietly neutralize them. A perfectly executed mission in many circumstances will see you get in and out of the building without having to fire a single shot.

The aspect of this mini-game which Meier pinpoints as its problem is, somewhat ironically, the very ambition and complexity which makes it so impressive when considered alone. A spot of breaking and entering can easily absorb a very tense and intense half an hour of your time. By the time you make it out of the building, Meier theorizes, you’ve lost track of why you went in in the first place — lost track, in other words, of what was going on in the strategic game. Meier codified his theory in what has for almost twenty years been known in design circles as “the Covert Action rule.” In a nutshell, the rule states that “one good game is better than two great ones” in the context of a single game design. Meier believes that the mini-games of Covert Action, and the breaking-and-entering game in particular, can become so engaging and such a drain on the player’s time and energies that they clash with the strategic game; we end up with two “great games” that never make a cohesive whole. This dissonance never allows the player to settle into that elusive sense of total immersion which some call “flow.” Meier believes that Pirates! works where Covert Action doesn’t because the former’s mini-games are much shorter and much less complicated — getting the player back to the big picture, as it were, quickly enough that she doesn’t lose the plot of what the current situation is and what she’s trying to accomplish.

It’s an explanation that makes a certain sense on its face, yet I must say that it’s not one that really rings true to my own experiences with either games in general or Covert Action in particular. Certainly one can find any number of games which any number of players have hugely enjoyed that seemingly violate the Covert Action rule comprehensively. We could, for instance, look to the many modern CRPGs which include “sub-quests” that can absorb many hours of the player’s time, with no detriment to the experience as a whole, at least if players’ own reports are to be believed. If that’s roaming too far afield from the type of game which Covert Action is, consider the case of the strategy classic X-Com, one of the most frequently cited of the seeming Covert Action rule violators that paradoxically succeed as fun designs. It merges an overarching strategic game with a game of tactical combat that’s far more time-consuming and complicated than even the breaking-and-entering part of Covert Action. And yet it must place high in any ranking of the most beloved strategy games of all time. As we continue to look at specific counterexamples like X-Com or, for that matter, Pirates!, we can only continue to believe in the Covert Action rule by applying lots of increasingly tortured justifications for why this or that seemingly blatant violator nevertheless works as a game. So, X-Com, Meier tells us, works because the strategic game is relatively less complicated than the tactical game, leaving enough of the focus on the tactical game that the two don’t start to pull against one another. And Pirates!, of course, is just the opposite.

I can only say that when the caveats and exceptions to any given rule start to pile up, one is compelled to look back to the substance of the rule itself. As nice as it might be for the designers of Covert Action to believe the game’s biggest problem is that its individual parts were just each too darn ambitious, too darn good, I don’t think that’s the real reason the game doesn’t work.

So, we come back to the original question: just what is the matter with Covert Action? I don’t believe that Covert Action‘s core malady can be found in the mini-games, nor for that matter in the strategic game per se. I rather believe the problem is with the mission design and with the game’s fiction — which, as in so many games, are largely one and the same in this one. The cases you must crack in Covert Action are procedurally generated by the computer, using a set of templates into which are plugged different combinations of organizations, masterminds, and plots to create what is theoretically a virtually infinite number of potential cases to solve. My thesis is that it’s at this level — the level of the game’s fiction — where Covert Action breaks down; I believe that things have already gone awry as soon as the game generates the case it will ask you to solve, well before you make your first move. The, for lack of a better word, artificiality of the cases is never hard to detect. Even before you start to learn which of the limited number of templates are which, the stories just feel all wrong.
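To make the artificiality concrete, here is a deliberately simplified sketch of how such a template system might work. Everything in it — the function names, the templates, the structure — is my own invention for illustration, not anything from MicroProse’s actual code; the point is only that when organizations are interchangeable tokens slotted into fixed templates, the resulting plots can’t help but feel like “Group A” and “Group B.”

```python
import random

# Hypothetical illustration of template-driven case generation.
# Each organization is an interchangeable token, which is exactly
# why the generated plots read as "Group A" and "Group B".
ORGANIZATIONS = ["Colombian Cartel", "Mafia",
                 "Palestine Freedom Organization", "Stasi"]
CITIES = ["Rome", "Cairo", "Istanbul"]
PLOT_TEMPLATES = [
    "{a} hires {b} to smuggle arms through {city}.",
    "{a} and {b} plot to kidnap a diplomat in {city}.",
]

def generate_case(rng: random.Random) -> str:
    # Any two groups will do -- the generator has no notion of
    # which organizations could plausibly work together.
    a, b = rng.sample(ORGANIZATIONS, 2)
    template = rng.choice(PLOT_TEMPLATES)
    return template.format(a=a, b=b, city=rng.choice(CITIES))

rng = random.Random()
print(generate_case(rng))
```

Run it a few times and the seams show immediately: mathematically distinct cases, perceptually the same bowl of oatmeal.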

Literary critics have a special word, “mimesis,” which they tend to deploy when a piece of fiction conspicuously passes or fails the smell test of immersive believability. Dating back to classical philosophy, “mimesis” technically means the art of “showing” a story — as opposed to “diegesis,” the art of telling. It’s been adopted by theorists of textual interactive fiction as a stand-in for all those qualities of a game’s fiction that help to immerse the player in the story, that help to draw her in. “Crimes against Mimesis” — the name of an influential Usenet post written in 1996 by Roger Giner-Sorolla — are all those things, from problems with the interface to obvious flaws in the story’s logic to things that just don’t ring true somehow, that cast the player jarringly out of the game’s fiction — that reveal, in other words, the mechanical gears grinding underneath the game’s fictional veneer. Covert Action is full of these crimes against mimesis, full of these gears poking above the story’s surface. Groups that should hate each other ally with one another: the Colombian Cartel, the Mafia, the Palestine Freedom Organization (some names have been changed to protect the innocent or not-so-innocent), and the Stasi might all concoct a plot together. Why not? In the game’s eyes, they’re just interchangeable parts with differing labels on the front; they might as well have been called “Group A,” “Group B,” etc. When they send messages to one another, the diction almost always rings horribly, jarringly wrong in the ears of those of us who know what the groups represent. Imagine, for example, the Mafia talking like jihadists.

If Covert Action had believable, mimetic, tantalizing — or at least interesting — plots to foil, I submit that it could have been a tremendously compelling game, without changing anything else about it. Instead, though, it’s got this painfully artificial box of whirling gears. Writing in the context of the problems of procedural generation in general, Kate Compton has called this the “10,000 Bowls of Oatmeal Problem.”

I can easily generate 10,000 bowls of plain oatmeal, with each oat being in a different position and different orientation, and mathematically speaking they will all be completely unique. But the user will likely just see a lot of oatmeal. Perceptual uniqueness is the real metric, and it’s darn tough. It is the difference between an actor being a face in a crowd scene and a character that is memorable.

Assuming we can agree, at least for now, that we’ve hit upon Covert Action‘s core problem, it’s not hard to divine how to fix it. I’m imagining a version of the game that replaces the infinite number of procedurally-generated cases with 25 or 30 hand-crafted plots, each with its own personality and its own unique flavor of intrigue. Such an approach would fix another complaint that’s occasionally levied against Covert Action: that it never becomes necessary to master or even really engage with all of its disparate parts because it’s very easy to rely just on those mini-games you happen to be best at to ferret out all of the relevant information. In particular, you can discover just about everything you need in the files you uncover during the breaking-and-entering game, without ever having to do much of anything in the realm of wire-tapping suspects, tailing them, or cracking their codes. This too feels like a byproduct of the generic templates used to construct the cases, which tend to err on the safe side to ensure that the cases are actually soluble, preferring — justifiably, in light of the circumstances — too many clues to too few. But this complaint could easily be fixed using hand-crafted cases. Different cases could be consciously designed to emphasize different aspects of the game: one case could be full of action, another more cerebral and puzzle-like, etc. This would do yet more to give each case its own personality and to keep the game feeling fresh throughout its length.

The most obvious argument against hand-crafted cases, other than the one, valid only from the developers’ standpoint, of the extra resources it would take to create them, is that it would exchange a game that is theoretically infinitely replayable for one with a finite span. Yet, given that Covert Action isn’t a hugely compelling game in its historical form, one has to suspect that my proposed finite version of it would likely yield more actual hours of enjoyment for the average player than the infinite version. Is a great game that lasts 30 hours and then is over better than a mediocre one that can potentially be played forever? The answer must depend on individual circumstances as well as individual predilections, but I know where I stand, at least as long as this world continues to be full of more cheap and accessible games than I can possibly play.

But then there is one more practical objection to my proposed variation of Covert Action, or rather one ironclad reason why it could never have seen the light of day: this simply isn’t how Sid Meier designs his games. Meier, you see, stands firmly on the other side of a longstanding divide that has given rise to no small dissension over the years in the fields of game design and academic game studies alike.

In academia, the argument has raged for twenty years between the so-called ludologists, who see games primarily as dynamic systems, and the narratologists, who see them primarily as narratives. Yet at its core the debate is actually far older even than that. In the December 1987 issue of his Journal of Computer Game Design, Chris Crawford fired what we might regard as the first salvo in this never-ending war via an article entitled “Process Intensity.” The titular phrase meant, he explained, “the degree to which a program emphasizes processes instead of data.” While all games must have some amount of data — i.e., fixed content, including fixed story content — a more process-intensive game — one that tips the balance further in favor of dynamic code as opposed to static data — is almost always a better game in Crawford’s view. That all games aren’t extremely process intensive, he baldly states, is largely down to the laziness of their developers.

The most powerful resistance to process intensity, though, is unstated. It is a mental laziness that afflicts all of us. Process intensity is so very hard to implement. Data intensity is easy to put into a program. Just get that artwork into a file and read it onto the screen; store that sound effect on the disk and pump it out to the speaker. There’s instant gratification in these data-intensive approaches. It looks and sounds great immediately. Process intensity requires all those hours mucking around with equations. Because it’s so indirect, you’re never certain how it will behave. The results always look so primitive next to the data-intensive stuff. So we follow the path of least resistance right down to data intensity.

Crawford, in other words, is a ludologist all the way. There’s always been a strongly prescriptive quality to the ludologists’ side of the ludology-versus-narratology debate, an ideology of how games ought to be made. Because processing is, to use Crawford’s words again, “the very essence of what a computer does,” the capability that in turn enables the interactivity that makes computer games unique as a medium, games that heavily emphasize processing are purer than those that rely more heavily on fixed data.

It’s a view that strikes me as short-sighted in a number of ways. It betrays, first of all, a certain programmer and systems designer’s bias against the artists and writers who craft all that fixed data; I would submit that the latter skills are every bit as worthy of admiration and every bit as valuable on most development teams as the former. Although even Crawford acknowledges that “data endows a game with useful color and texture,” he fails to account for the appeal of games where that very color and texture — we might instead say the fictional context — is the most important part of the experience. He and many of his ludologist colleagues are like most ideologues in failing to admit the possibility that different people may simply want different things, in games as in any other realm. Given the role that fixed stories have come to play in even many of the most casual modern games, too much ludologist rhetoric verges on telling players that they’re wrong for liking the games they happen to like. This is not to apologize for railroaded experiences that give the player no real role to play whatsoever and thereby fail to involve her in their fictions. It’s rather to say that drawing the line between process and data can be more complicated than saying “process good, data bad” and proceeding to act accordingly. Different games are at their best with different combinations of pre-crafted and generative content. Covert Action fails as a game because it draws that line in the wrong place. It’s thanks to the same fallacy, I would argue, that Chris Crawford has been failing for the last quarter century to create the truly open-ended interactive-story system he calls Storytron.

Sid Meier is an endlessly gracious gentleman, and thus isn’t so strident in his advocacy as many other ludologists. But despite his graciousness, there’s no doubt on which side of the divide he stands. Meier’s games never, ever include rigid pre-crafted scenarios or fixed storylines of any stripe. In most cases, this has been fine because his designs have been well-suited to the more open-ended, generative styles of play he favors. Covert Action, however, is the glaring exception, revealing one of the few blind spots of this generally brilliant game designer. Ironically, Meier had largely been drawn to Covert Action by what he calls the “intriguing” problem of its dynamic case generator. The idea of being able to use the computer to do the hard work of generating stories, and thereby to be able to churn out infinite numbers of the things at no expense, has always enticed him. He continues to muse today about a Sherlock Holmes game built using computer-generated cases, working backward from the solution of a crime to create a trail of clues for the player to follow.

Meier is hardly alone in the annals of computer science and game design in finding the problem of automated story-making intriguing. Like his Sherlock Holmes idea, many experiments with procedurally-generated narratives have worked with mystery stories, that most overtly game-like of all literary genres; Covert Action‘s cases as well can be considered variations on the mystery theme.  As early as 1971, Sheldon Klein, a professor at the University of Wisconsin, created something he called an “automatic novel writer” for auto-generating “2100-word murder-mystery stories.” In 1983, Electronic Arts released Jon Freeman and Paul Reiche III’s Murder on the Zinderneuf as one of their first titles; it allowed the player to solve an infinite number of randomly generated mysteries occurring aboard its titular Zeppelin airship. That game’s flaws feel oddly similar to those of Covert Action. As in Covert Action, in Murder on the Zinderneuf the randomized cases never have the resonance of a good hand-crafted mystery story. That, combined with their occasional incongruities and the patterns that start to surface as soon as you’ve played a few times, means that you can never forget their procedural origins. These tales of intrigue never manage to truly intrigue.

Suffice it to say that generating believable fictions, whether in the sharply delimited realm of a murder mystery taking place aboard a Zeppelin or the slightly less delimited realm of a contemporary spy thriller, is a tough nut to crack. Even one of the most earnest and concentrated of the academic attempts at tackling the problem, a system called Tale-Spin created by James Meehan at Yale University, continued to generate more unmimetic than mimetic stories after many years of work — and this system was meant only to generate standalone static stories, not interactive mysteries to be solved. And as for Chris Crawford’s Storytron… well, as of this writing it is, as its website says, in a “medically induced coma” for the latest of many massive re-toolings.

In choosing to pick up Covert Action primarily because of the intriguing problem of its case generator and then failing to consider whether said case generator really served the game, Sid Meier may have run afoul of another of his rules for game design, one that I find much more universally applicable than what Meier calls the Covert Action rule. A designer should always ask, Meier tells us, who is really having the fun in a game — the designer/programmer/computer or the player? The procedurally generated cases may have been an intriguing problem for Sid Meier the designer, but they don’t serve the player anywhere near as well as hand-crafted cases might have done.

The model that comes to mind when I think of my ideal version of Covert Action is Killed Until Dead, an unjustly obscure gem from Accolade which, like Murder on the Zinderneuf, I wrote about in an earlier article. Killed Until Dead is very similar to Murder on the Zinderneuf in that it presents the player with a series of mysteries to solve, all of which employ the same cast of characters, the same props, and the same setting. Unlike Murder on the Zinderneuf, however, the mysteries in Killed Until Dead have all been lovingly hand-crafted. They not only hang together better as a result, but they’re full of wit and warmth and the right sort of intrigue — they intrigue the player. If you ask me, a version of Covert Action built along similar lines, full of exciting plotlines with a ripped-from-the-headlines feel, could have been fantastic — assuming, of course, that MicroProse could have found writers and scenario designers with the chops to bring the spycraft to life.

It’s of course possible that my reaction to Covert Action is hopelessly subjective, inextricably tied to what I personally value in games. As my longtime readers are doubtless aware by now, I’m an experiential player to the core, more interested in lived experiences than in twiddling the knobs of a complicated system just exactly perfectly. In addition to guaranteeing that I’ll never win any e-sports competitions — well, that and my aging reflexes that were never all that great to begin with — this fact colors the way I see a game like Covert Action. The jarring qualities of Covert Action‘s fiction may not bother some of you one bit. And thus the debate about what really is wrong with Covert Action, that strange note of discordance sandwiched between the monumental Sid Meier masterpieces Railroad Tycoon and Civilization, can never be definitively settled. Ditto the more abstract and even more longstanding negotiation between ludology and narratology. Ah, well… if nothing else, it ensures that readers and writers of blogs like this one will always have something to talk about. So, let the debate rage on.

(Sources: the books Expressive Processing by Noah Wardrip-Fruin and On Interactive Storytelling by Chris Crawford; Game Developer of February 2013. Links to online sources are scattered through the article.

If you’d like to enter the Covert Action debate for yourself, you can buy it from GOG.com.)
