

From Squadron to Wingleader

Chris Roberts and Richard Garriott, 1988

At the Summer Consumer Electronics Show in June of 1989, Origin Systems and Brøderbund Software announced that they wouldn’t be renewing the distribution contract the former had signed with the latter two years before. It was about as amicable a divorce as has ever been seen in the history of business; in this respect, it could hardly have stood in greater contrast to the dust-up that had ended Origin’s relationship with Electronic Arts, their previous distributor, in 1987. Each company was full of rosy praise and warm wishes for the other at a special “graduation party” Brøderbund threw for Origin at the show. “Brøderbund has been one of the few affiliated-label programs that truly helps a small company grow to a size where it can stand on its own and enter the real world,” said Origin’s Robert Garriott, making oblique reference to the more predatory approach of Electronic Arts. In response, Brøderbund’s Gary Carlston toasted that “it’s been rewarding to have helped Origin pursue its growth, and it’s exciting to see the company take this step,” confirming yet one more time Brøderbund’s well-earned reputation as the nice guys of their industry who somehow kept managing to finish first. And so, with a last slap on the rump and a final chorus of “Kumbaya,” Brøderbund sent Origin off to face the scary “world of full-service software publishing” alone.

It was a bold step for Origin, especially given that they still hadn’t solved a serious problem that had dogged them since their founding in the Garriott brothers’ family garage six years earlier. The first two games released by the young company back in 1983 had been Ultima III, the latest installment in Richard Garriott’s genre-defining CRPG series, and Caverns of Callisto, an action game written by Richard’s high-school buddy Chuck Bueche. Setting the frustrating pattern for what was to come, Ultima III soared up the bestseller charts, while Caverns of Callisto disappeared without a trace. In the years that followed, Origin released some non-Ultima games that were moderately successful, but never came close to managing a full-on hit outside of their signature franchise. This failure left them entirely dependent for their survival on Richard Garriott coming up with a new and groundbreaking Ultima game every couple of years, and on that game then proceeding to sell over 200,000 copies. Robert Garriott, as shrewd a businessman as any in his industry, knew that staking his company’s entire future on a single game every two years was at best a risky way to run things. Yet, try as he might, he couldn’t seem to break the pattern.

Origin had a number of factors working against them in their efforts to diversify, but the first and most ironic among them must be the very outsize success of Ultima itself. The company had become so identified with Ultima that many gamers barely realized that they did anything else. As for other folks working in the industry, they had long jokingly referred to Origin Systems as “Ultima Systems.” Everyone knew that the creator of Ultima was also the co-founder of Origin, and the brother of the man who directed its day-to-day operations. In such a situation, there must be a real question of whether any other game project, even a potentially great one, could avoid being overshadowed by the signature franchise, could find enough oxygen to thrive. Added to these concerns, which would be applicable to any company in such a situation, must be the unique nature of the cast of characters at Origin. Richard Garriott’s habit of marching around trade-show floors in full Lord British regalia, his entourage in tow, didn’t always endear him to the rest of the industry. There were, it sometimes seemed, grounds to question whether Richard himself knew that he wasn’t actually a monarch, just a talented kid from suburban Houston with nary a drop of royal blood coursing through his veins. At times, Origin Systems could feel perilously close to a cult of personality. Throw in the company’s out-of-the-way location in Austin, Texas, and attracting really top-flight projects became quite a challenge for them.

So, when it came to games that weren’t Ultima, Origin had had to content themselves with projects one notch down from the top tier — projects which, whether because they weren’t flashy enough or were just too nichey, weren’t of huge interest to the bigger publishers. Those brought in enough revenue to justify their existence but not much more, and thus Robert Garriott continued to bet the company every two years on his brother’s latest Ultima. It was a nerve-wracking way to live.

And then, in 1990, all that changed practically overnight. This article and the one that follows will tell the story of how the house that Ultima built found itself with an even bigger franchise on its hands.


Chris Roberts

By the end of the 1980s, the North American and European computer-game industries, which had heretofore existed in almost total isolation from one another, were becoming slowly but steadily more interconnected. The major American publishers were setting up distribution arms in Europe, and the smaller ones were often distributing their wares through the British importer U.S. Gold. Likewise, the British Firebird and Rainbird labels had set up offices in the United States, and American publishers like Cinemaware were doing good business importing British games for American owners of the Commodore Amiga, a platform that was a bit neglected by domestic developers. But despite these changes, the industry as a whole remained a stubbornly bifurcated place. European developers remained European, American developers remained American, and the days of a truly globalized games industry remained far in the future. The exceptions to these rules stand out all the more thanks to their rarity. And one of these notable exceptions was Chris Roberts, the young man who would change Origin Systems forever.

With a British father and an American mother, Chris Roberts had been a trans-Atlantic sort of fellow right from the start. His father, a sociologist at the University of Manchester, went with his wife to Guatemala to do research shortly after marrying, and it was there that Chris was conceived in 1967. The mother-to-be elected to give birth near her family in Silicon Valley. (From the first, it seems, computers were in the baby’s blood.) After returning for a time to Guatemala, where Chris’s father was finishing his research, the little Roberts clan settled back in Manchester, England. A second son arrived to round out the family in 1970.

His first international adventure behind him, Chris Roberts grew up as a native son of Manchester, developing the distinct Mancunian intonation he retains to this day along with his love of Manchester United football. When first exposed to computers thanks to his father’s position at Manchester University, the boy was immediately smitten. In 1982, when Chris was 14, his father signed him up for his first class in BASIC programming and bought a BBC Micro for him to practice on at home. As it happened, the teacher of that first programming class became a founding editor of the new magazine BBC Micro User. Hungry for content, the magazine bought two of young Chris’s first simple BASIC games to publish as type-in listings. Just like that, he was a published game developer.

Britain at the time was going absolutely crazy for computers and computer games, and many of the new industry’s rising stars were as young or younger than Roberts. It thus wasn’t overly difficult for him to make the leap to designing and coding boxed games to be sold in stores. Imagine Software published his first such, a platformer called Wizadore, in 1985; Superior Software published a second, a side-scrolling shooter called Stryker’s Run, in 1986. But the commercial success these titles could hope to enjoy was limited by the fact that they ran on the BBC Micro, a platform which was virtually unknown outside of Britain and even inside of its home country was much less popular than the Sinclair Spectrum as a gaming machine. Being amply possessed of the contempt most BBC Micro owners felt toward the cheap and toy-like “Speccy,” Roberts decided to shift his attention instead to the Commodore 64, the most popular gaming platform in the world at the time. This decision, combined with another major decision made by his parents, set him on his unlikely collision course with Origin Systems in far-off Austin, Texas.

In early 1986, Roberts’s father got an offer he couldn’t refuse in the form of a tenured professorship at the University of Texas. After finishing the spring semester that year, he, his wife, and his younger son thus traded the gray skies of Manchester for the sunnier climes of Austin. Chris was just finishing his A-Levels at the time. Proud Mancunian that he was, he declared that he had no intention of leaving England — and certainly not for a hick town in the middle of Texas. But he had been planning all along to take a year off before starting at the University of Manchester, and his parents convinced him to at least join the rest of the family in Austin for the summer. He agreed, figuring that it would give him a chance to work free of distractions on a new action/adventure game he had planned as his first project for the Commodore 64. Yet what he actually found in Austin was lots of distractions — eye-opening distractions to warm any young man’s heart. Roberts:

The weather was a little nicer in Austin. The American girls seemed to like the English accent, which wasn’t bad, and there was definitely a lot… everything seemed like it was cheaper and there was more of it, especially back then. Now, the world’s become more homogenized so there’s not things you can only get in America that you don’t get in England as well. Back then it was like, the big American movies would come out in America and then they would come out in England a year later and stuff. So I came over and was like, “Ah, you know, this is pretty cool.”

There were also the American computers to consider; these tended to be much more advanced than their British counterparts, sporting disk drives as universal standard equipment at a time when most British games — including both of Roberts’s previous games — were still published on cassette tapes. In light of all these attractions, it seems doubtful whether Roberts would have kept his resolution to return to Manchester in any circumstances. But there soon came along the craziest of coincidences to seal the deal.

Roberts had decided that he really needed to find an artist to help him with his Commodore 64 game-in-progress. Entering an Austin tabletop-gaming shop one day, he saw a beautiful picture of a gladiator hanging on the wall. The owner of the shop told him the picture had been drawn by a local artist, and offered to call the artist for him right then and there if Roberts was really interested in working with him. Roberts said yes, please do. The artist in question was none other than Denis Loubet, whose professional association with Richard Garriott stretched back to well before Origin Systems had existed, to when he’d drawn the box art for the California Pacific release of Akalabeth in 1980.

Denis Loubet

After years of working as a contractor, Loubet was just about to be hired as Origin’s first regular in-house artist. Nevertheless, he liked Roberts and thought his game had potential, and agreed to do the art for it as a moonlighting venture. Loubet soon showed what he was working on to Richard Garriott and Dallas Snell, the latter of whom tended to serve as a sort of liaison between the business side of the company, in the person of Robert Garriott, and the creative side, in the person of Richard. All three parties were as impressed by the work-in-progress as Loubet had been, and they invited Chris to Origin’s offices to ask if he’d be interested in publishing it through them. Prior to this point, Roberts had never even heard of Origin Systems or the Ultima series; he’d grown up immersed in the British gaming scene, where neither had any presence whatsoever. But he liked the people at Origin, liked the atmosphere around the place, and perhaps wasn’t aware enough of what the company represented to be leery of it in the way of other developers who were peddling promising projects around the industry. “After my experiences in England, which is like swimming in a big pool of sharks,” he remembers, “I felt comfortable dealing with Origin.”

Times of Lore

All thoughts of returning to England had now disappeared. Working from Origin’s offices, albeit still as a contracted outside developer rather than an employee, Roberts finished his game, which came to be called Times of Lore. In the course of its development, the game grew considerably in scope and ambition, and, as seemed only appropriate given the company that was to publish it, took on some light CRPG elements as well. In much of this, Roberts was inspired by David Joiner’s 1987 action/CRPG The Faery Tale Adventure. American influences aside, though, Times of Lore still fit best of all into the grand British tradition of free-scrolling, free-roaming 8-bit action/adventures, a sub-genre that verged on completely unknown to American computer gamers. Roberts made sure the whole game could fit into the Commodore 64’s memory at once to facilitate a cassette-based version for the European market.

Unfortunately, his game got to enjoy only a middling level of sales success in return for all his efforts. As if determined to confirm the conventional wisdom that had caused so many developers to steer clear of them, Origin released Times of Lore almost simultaneously with the Commodore 64 port of Ultima V in 1988, leaving Roberts’s game overshadowed by Lord British’s latest. And in addition to all the baggage that came with the Origin logo in the United States, Times of Lore suffered all the disadvantages of being a pioneer of sorts in Europe, the first Origin title to be pushed aggressively there via a new European distribution contract with MicroProse. While that market would undoubtedly have understood the game much better had they given it a chance, no one there yet knew what to make of the company whose logo was on the box. Despite its strengths, Times of Lore thus failed to break the pattern that had held true for Origin for so long. It turned into yet another non-Ultima that was also a non-hit.

Times of Lore

But whatever the relative disappointments, Times of Lore at least wasn’t a flop, and Chris Roberts stayed around as a valued member of the little Origin family. Part of the reason the Origin people wanted to keep him around was simply because they liked him so much. He nursed the same passions for fantasy and science fiction as most of them, with just enough of a skew provided by his British upbringing to make him interesting. And he positively radiated energy and enthusiasm. He’s never hard to find in Origin group shots of the time. His face stands out like that of a nerdy cherub — he had never lost his facial baby fat, making him look pudgier in pictures than he was in real life — as he beams his thousand-kilowatt smile at all and sundry. Still, it was hardly his personality alone that made him such a valued colleague; the folks at Origin also came to have a healthy respect for his abilities. Indeed, and as we’ve already seen in an earlier article, the interface of Times of Lore had a huge influence on that of no less vital an Origin game than Ultima VI.

Alas, Roberts’s own next game for Origin would be far less influential. After flirting for a while with the idea of doing a straightforward sequel to Times of Lore, he decided to adapt the engine to an even more action-oriented post-apocalyptic scenario. Roberts’s first game for MS-DOS, Bad Blood, was created in desultory fits and starts, one of those projects that limps to completion more out of inertia than passion. Released at last in 1990, it was an ugly flop on both sides of the Atlantic. Roberts blames marketplace confusion at least partially for its failure: “People who liked arcade-style games didn’t buy it because they thought Bad Blood would be another fantasy-role-play-style game. It was the worst of both worlds, a combination of factors that contributed to its lack of success.” In reality, though, the most telling factor of said combination was just that Bad Blood wasn’t very good, evincing little of the care that so obviously went into Times of Lore. Reviewers roundly panned it, and buyers gave it a wide berth. Thankfully for Chris Roberts’s future in the industry, the game that would make his name was already well along at Origin by the time Bad Blood finally trickled out the door.

Bad Blood

Had it come to fruition in its original form, Roberts’s third game for Origin would have marked even more of a departure for him than the actual end result would wind up being. Perhaps trying to fit in better with Origin’s established image, he had the idea of doing, as he puts it, “a space-conquest game where you take over star systems, move battleships around, and invade planets. It was going to be more strategic than my earlier games.” But Roberts always craved a little more adrenaline in his designs than such a description would imply, and it didn’t take him long to start tinkering with the formula. The game moved gradually from strategic battles between slow-moving dreadnoughts in space to manic dogfights between fighter planes in space. In other words, to frame the shift the way the science-fiction-obsessed Roberts might well have chosen, his inspiration for his space battles changed from Star Trek to Star Wars. He decided “it would be more fun flying around in a fighter than moving battleships around the screen”; note the (unconscious?) shift in this statement from the player as a disembodied hand “moving” battleships around to the player as an embodied direct participant “flying around” herself in fighters. Roberts took to calling his work-in-progress Squadron.

To bring off his idea for an embodied space-combat experience, Roberts would have to abandon the overhead views used by all his games to date in favor of a first-person out-the-cockpit view, like that used by a game he and every other BBC Micro veteran knew well, Ian Bell and David Braben’s Elite. “It was the first space game in which I piloted a ship in combat,” says Roberts of Elite, “and it opened my eyes to the possibilities of where it could go.” On the plus side, Roberts knew that this and any other prospective future games he might make for Origin would be developed on an MS-DOS machine with many times the processing power of the little BBC Micro (or, for that matter, the Commodore 64). On the negative side, Roberts wasn’t a veritable mathematics genius like Ian Bell, the mastermind behind Elite‘s 3D graphics. Nor could he get away in the current marketplace with the wire-frame graphics of Elite. So, he decided to cheat a bit, both to simplify his life and to up the graphics ante. Inspired by the graphics of the Lucasfilm Games flight simulator Battlehawks 1942, he used pre-rendered bitmap images showing ships from several different sides and angles, which could then be scaled to suit the player’s out-the-cockpit view, rather than making a proper, mathematically rigorous 3D engine built out of polygons. As becomes clear all too quickly to anyone who plays the finished game, the results could be a little wonky, with views of the ships suddenly popping into place rather than smoothly rotating. Nevertheless, the ships themselves looked far better than anything Roberts could possibly have hoped to achieve on the technology of the time using a more honest 3D engine.

Denis Loubet, Roberts’s old partner in crime from the early days of Times of Lore, agreed to draw a cockpit as part of what became yet another moonlighting gig for both of them; Roberts was officially still supposed to be spending his days at Origin on Bad Blood, while Loubet was up to his eyebrows in Ultima VI. Even at this stage, they were incorporating little visceral touches into Squadron, like the pilot’s hand moving the joystick around in time with what the player was doing with her own joystick in front of the computer screen. As the player’s ship got shot up, the damage was depicted visually there in the cockpit. Like the sparks and smoke that used to burst from the bridge controls on the old Star Trek episodes, it might not have made much logical sense — haven’t any of these space-faring societies invented fuses? — but it served the purpose of creating an embodied, visceral experience. Roberts:

It really comes from wanting to put the player in the game. I don’t want you to think you’re playing a simulation, I want you to think you’re really in that cockpit. When I visualized what it would be like to sit in a cockpit, those are the things I thought of.

I took the approach that I didn’t want to sacrifice that reality due to the game dynamics. If you would see wires hanging down after an explosion, then I wanted to include it, even if it would make it harder to figure out how to include all the instruments and readouts. I want what’s taking place inside the cockpit to be as real as what I’m trying to show outside it, in space. I’d rather show you damage as if you were there than just display something like “damage = 20 percent.” That’s abstract. I want to see it.

Squadron, then, was already becoming an unusually cinematic space-combat “simulation.” Because every action-movie hero needs a sidekick, Roberts added a wingman to the game, another pilot who would fly and fight at the player’s side. The player could communicate with the wingman in the midst of battle, passing him orders, and the wingman in turn would communicate back, showing his own personality; he might even refuse to obey orders on occasion.

As a cinematic experience, Squadron felt very much in tune with the way things in general were trending at Origin, to such an extent that one might well ask who was influencing whom. Like so many publishers in this era in which CD-ROM and full-motion video hovered alluringly just out of view on the horizon, Origin had begun thinking of themselves more and more in the terms of Hollywood. The official “product development structure” that was put in place around this time by Dallas Snell demanded an executive producer, a producer, an assistant producer, a director, an assistant director, and a lead writer for every game; of all the positions on the upper rungs of the chart, only those of lead artist and lead programmer wouldn’t have been listed in the credits of a typical Hollywood film. Meanwhile, Origin’s recent hire Warren Spector, who came to them with a master’s degree in film studies, brought his own ideas about games as interactive dramas that were less literal than Snell’s, but that would if anything prove even more of an influence on his colleagues’ developing views of just what it was Origin Systems really ought to be about. Just the previous year, Origin had released a game called Space Rogue, another of that long line of non-Ultima middling sellers, that had preceded Squadron in attempting to do Elite one better. A free-form player-directed game of space combat and trading, Space Rogue was in some ways much more ambitious than the more railroaded experience Roberts was now proposing. Yet there was little question of which game fit better with the current zeitgeist at Origin.

All of which does much to explain the warm reception accorded to Squadron when Chris Roberts, with Bad Blood finally off his plate, pitched it to Origin’s management formally in very early 1990. Thanks to all those moonlighting hours — as well as, one suspects, more than a few regular working hours — Roberts already had a 3D space-combat game that looked and played pretty great. A year or two earlier, that likely would have been that; Origin would have simply polished it up a little and shipped it. But now Roberts had the vision of building a movie around the game. Between flying a series of scripted missions, you would get to know your fellow pilots and follow the progress of a larger war between humanity and the Kilrathi, a race of savage cats in space.

Having finally made the hard decision to abandon the 8-bit market at the beginning of 1989, Origin was now pushing aggressively in the opposite direction from their old technological conservatism, determined to create games that showed what the very latest MS-DOS machines could really do. Like Sierra before them, they had decided that if the only way to advance the technological state of the art among ordinary consumers was to release games whose hardware requirements were ahead of the curve — a reversal of the usual approach among game publishers, who had heretofore almost universally gone where the largest existing user base already was — then that’s what they would do. Squadron could become the first full expression of this new philosophy, being unapologetically designed to run well only on a cutting-edge 80386-based machine. In what would be a first for the industry, Chris Roberts even proposed demanding expanded memory beyond the traditional 640 K for the full audiovisual experience. For Roberts, stepping up from a Commodore 64, it was a major philosophical shift indeed. “Sod this, trying to make it work for the lowest common denominator—I’m just going to try and push it,” he said, and Origin was happy to hear it.

Ultima VI had just been completed, freeing personnel for another major project. Suspecting that Squadron might be the marketplace game changer he had sought for so long for Origin, Robert Garriott ordered a full-court press in March of 1990. He wanted his people to help Chris Roberts build his movie around his game, and he wanted them to do it in less than three months. They should have a preview ready to go for the Summer Consumer Electronics Show at the beginning of June, with the final product to ship very shortly thereafter.

Jeff George

Responsibility for the movie’s script was handed to Jeff George, one of the first of a number of fellow alumni of the Austin tabletop-game publisher Steve Jackson Games who followed Warren Spector to Origin. George was the first Origin employee hired explicitly to fill the role of “writer.” This development, also attributable largely to the influence of Spector, would have a major impact on Origin’s future games.

Obviously inspired by the ethical quandaries the Ultima series had become so known for over its last few installments, Chris Roberts had imagined a similarly gray-shaded world for his game, with scenarios that would cause the player to question whether the human empire she was fighting for was really any better than that of the Kilrathi. But George, to once again frame the issue in terms Roberts would have appreciated, pushed the game’s fiction toward the clear-cut good guys and bad guys of Star Wars, away from the more complicated moral universe of Star Trek. All talk of a human “empire,” for one thing, would have to go; everyone at Origin knew what their players thought of first when they thought of empires in space. Jeff George:

In the context of a space opera, empire had a bad connotation that would make people think they were fighting for the bad guys. The biggest influence I had on the story was to make it a little more black and white, where Chris had envisioned something grittier, with more shades of gray. I didn’t want people to worry about moral dilemmas while they were flying missions. That’s part of why it worked so well. You knew what you were doing, and knew why you were doing it. The good guys were really good, the bad guys were really bad.

The decision to simplify the political situation and sand away the thorny moral dilemmas demonstrates, paradoxical though it may first seem, a more sophisticated approach to narrative rather than the opposite. Some interactive narratives, like some non-interactive ones, are suited to exploring moral ambiguity. In others, though, the player just wants to fight the bad guys. While one can certainly argue that gaming has historically had far too many of the latter type and far too few of the former, there nevertheless remains an art to deciding which games are best suited for which.

Glen Johnson

Five more programmers and four more artists would eventually join what had been Chris Roberts and Denis Loubet’s little two-man band. With the timetable so tight, the artists were left to improvise large chunks of the narrative along with the game’s visuals. By imagining and drawing the “talking head” portraits of the various other pilots with which the player would interact, artist Glen Johnson wound up playing almost as big a role as Jeff George in crafting the fictional context for the game’s dogfights in space. Johnson:

I worked on paper first, producing eleven black-and-white illustrations. In most games, I would work from a written description of the character’s likes, dislikes, and personality. In this case, I just came up with the characters out of thin air, although I realized they wanted a mixture of men and women pilots. I assigned a call sign to each portrait.

Despite the lack of time at their disposal, the artists were determined to fit the movements of the characters’ mouths to the words of dialog that appeared on the screen, using techniques dating back to classic Disney animation. Said techniques demanded that all dialog be translated into its phonetic equivalent, something that could only be done by hand. Soon seemingly half the company was doing these translations during snatches of free time. Given that many or most players never even noticed the synchronized speech in the finished game, whether it was all worth it is perhaps a valid question, but the determination to go that extra mile in this regard does say much about the project’s priorities.

The music wound up being farmed out to a tiny studio specializing in videogame audio, one of vanishingly few of its kind at the time, which was run by a garrulous fellow named George Sanger, better known as “The Fat Man.” (No, he wasn’t terribly corpulent; that was sort of the joke.) Ever true to his influences, Chris Roberts’s brief to Sanger was to deliver something “between Star Wars and Star Trek: The Motion Picture.” Sanger and his deputy Dave Govett delivered in spades. Hugely derivative of John Williams’s work though the soundtrack was — at times it threatens to segue right into Williams’s famous Star Wars theme — it contributed enormously to the cinematic feel of the game. Origin was particularly proud of the music that played in the background when the player was actually flying in space; the various themes ebbed and swelled dynamically in response to the events taking place on the computer screen. It wasn’t quite the first time anyone had done something like this in a game, but no one had ever managed to do it in quite this sophisticated a way.

The guiding theme of the project remained the determination to create an embodied experience for the player. Chris Roberts cites the interactive movies of Cinemaware, which could be seen as the prototypes for the sort of game he was now trying to perfect, as huge influences in this respect as in many others. Roberts:

I didn’t want anything that made you sort of… pulled you out of being in this world. I didn’t want that typical game UI, or “Here’s how many lives you’ve got, here’s what high score you’ve got.” I always felt that broke the immersion. If you wanted to save the game you’d go to the barracks and you’d click on the bunk. If you wanted to exit, you’d click on the airlock. It was all meant to be in that world and so that was what the drive was. I love story and narrative and I think you can use that story and narrative to tie your action together and that will give your action meaning and context in a game. That was my idea and that was what really drove what I was doing.

The approach extended to the game’s manual. Harking back to the beloved scene-setting packaging of Infocom, the manual, which was written by freelancer Aaron Allston, took the form of Claw Marks, “The Onboard Magazine of TCS Tiger’s Claw” — the Tiger’s Claw being the name of the spaceborne aircraft carrier from which the player would be flying all of the missions. Like the artists, Allston would wind up almost inadvertently creating vital pieces of the game as a byproduct of the compressed schedule. “I couldn’t really determine everything at that point in development,” he remembers, “so, in some cases, specifically for the tactics information, we made some of it up and then retrofitted it and adjusted the code in the game to make it work.”

Once again in the spirit of creating a cohesive, embodied experience for the player, Roberts wanted to get away from the save-and-restore dance that was so typical of ludic narratives of the era. Therefore, instead of structuring the game’s 40 missions as a win-or-go-home linear stream, he created a branching mission tree in which the player’s course through the narrative would be dictated by her own performance. There would, in other words, be no way to definitively lose other than by getting killed. Roberts would always beg players to play the game “honestly,” beg them not to reload and replay each mission until they flew it perfectly. Only in this way would they get the experience he had intended for them to have.

Warren Spector

As the man responsible for tying all of the elements together to create the final experience, Roberts bore the titles of director and producer under Origin’s new cinematic nomenclature. He worked under the watchful eye of Squadron‘s co-producer Warren Spector, who, being older and in certain respects wiser, was equipped to handle the day-to-day administrative tasks that Roberts wasn’t. Spector:

When I came on as producer, Chris was really focused on the direction he wanted to take with the game. He knew exactly where he was going, and it would have been hard to deflect him from that course. It would have been crazy to even want to, so Chris and I co-produced the game. Where his talent dropped out, mine started, and vice versa. We did a task breakdown, and I ended up updating, adjusting, and tracking scheduling and preparing all the documentation. He handled the creative and qualitative issues. We both juggled the resources.

In implying that his own talent “dropped out” when it came to creative issues, Spector is selling himself about a million dollars short. He was a whirling dervish of creative energy throughout the seven years he spent with Origin, if anything even more responsible than Richard Garriott for the work that came out of the company under the Ultima label during this, the franchise’s most prolific period. But another of the virtues which allowed him to leave such a mark on the company was an ability to back off, to defer to the creative visions of others when it was appropriate. Recognizing that no one knew Chris Roberts’s vision like Chris Roberts, he was content in the case of Squadron to act strictly as the facilitator of that vision. In other words, he wasn’t too proud to simply play the role of organizer.

Still, it became clear early on that no combination of good organization and long hours would allow Squadron to ship in June. The timetable slipped to an end-of-September ship date, perfect to capitalize on the Christmas rush.

Although Squadron wouldn’t ship in June, the Summer Consumer Electronics Show loomed with as much importance as ever as a chance to show off the game-to-be and to drum up excitement that might finally end the sniggering about Ultima Systems. Just before the big show, Origin’s lawyers delivered the sad news that calling the game Squadron would be a bad idea thanks to some existing trademarks on the name. After several meetings, Wingleader emerged as the consensus choice for a new name, narrowly beating out Wing Commander. It was thus under the former title that the world at large got its first glimpse of what would turn into one of computer gaming’s most iconic franchises. Martin Davies, Origin’s Vice President of Sales:

I kicked hard to have a demo completed for the show. It was just a gut reaction, but I knew I needed to flood retail and distribution channels with the demo. Before the release of the game, I wanted the excitement to grow so that the confidence level would be extremely high. If we could get consumers beating a path in and out of the door, asking whether the game was out, distribution would respond.

With Wingleader still just a bunch of art and sound assets not yet wired up to the core game they were meant to complement, an interactive demo was impossible. Instead Chris Roberts put together a demo on videotape, alternating clips of the battles in space with clips of whatever other audiovisual elements he could assemble from what the artists and composers had managed to complete. Origin brought a big screen and a booming sound system out to Chicago for the show; the latter prompted constant complaints from other exhibitors. The noise pollution was perfect for showing the world that there was now more to Origin Systems than intricate quests and ethical dilemmas — that they could do aesthetic maximalism as well as anyone in their industry, pushing all of the latest hardware to its absolute limit in the process. It was a remarkable transformation for a company that just eighteen months before had been doing all development on the humble little 8-bit Apple II and Commodore 64. Cobbled together though it was, the Wingleader demo created a sensation at CES.

Indeed, one can hardly imagine a better demonstration of how the computer-game industry as a whole was changing than the game that had once been known as Squadron, was now known as Wingleader, and would soon go on to fame as Wing Commander. In my next article, I’ll tell the story of how the game would come to be finished and sold, along with the even more important story of what it would mean for the future of digital entertainment.

(Sources: the books Wing Commander I and II: The Ultimate Strategy Guide by Mike Harrison and Game Design Theory and Practice by Richard Rouse III; Retro Gamer 59 and 123; Questbusters of July 1989, August 1990, and April 1991; Computer Gaming World of September 1989 and November 1992; Amiga Computing of December 1988. Online sources include documents hosted at the Wing Commander Combat Information Center, US Gamer‘s profile of Chris Roberts, The Escapist‘s history of Wing Commander, Paul Dean’s interview with Chris Roberts, and Matt Barton’s interview with George “The Fat Man” Sanger. Last but far from least, my thanks to John Miles for corresponding with me via email about his time at Origin, and my thanks to Casey Muratori for putting me in touch with him.

Wing Commander I and II can be purchased in a package together with all of their expansion packs from GOG.com.)



The 640 K Barrier

There was a demon in memory. They said whoever challenged him would lose. Their programs would lock up, their machines would crash, and all their data would disintegrate.

The demon lived at the hexadecimal memory address A0000, 655,360 in decimal, beyond which no more memory could be allocated. He lived behind a barrier beyond which they said no program could ever pass. They called it the 640 K barrier.

— with my apologies to The Right Stuff

The idea that the original IBM PC, the machine that made personal computing safe for corporate America, was a hastily slapped-together stopgap has been vastly overstated by popular technology pundits over the decades since its debut back in August of 1981. Whatever the realities of budgets and scheduling with which its makers had to contend, there was a coherent philosophy behind most of the choices they made that went well beyond “throw this thing together as quickly as possible and get it out there before all these smaller companies corner the market for themselves.” As a design, the IBM PC favored robustness, longevity, and expandability, all qualities IBM had learned the value of through their many years of experience providing businesses and governments with big-iron solutions to their most important data–processing needs. To appreciate the wisdom of IBM’s approach, we need only consider that today, long after the likes of the Commodore Amiga and the original Apple Macintosh architecture, whose owners so loved to mock IBM’s unimaginative beige boxes, have passed into history, most of our laptop and desktop computers — including modern Macs — can trace the origins of their hardware back to what that little team of unlikely business-suited visionaries accomplished in an IBM branch office in Boca Raton, Florida.

But of course no visionary has 20-20 vision. For all the strengths of the IBM PC, there was one area where all the jeering by owners of sexier machines felt particularly well-earned. Here lay a crippling weakness, born not so much of the hardware found in that first IBM PC as of the operating system the marketplace chose to run on it, that would continue to vex programmers and ordinary users for two decades, not finally fading away until Microsoft’s release of Windows XP in 2001 put to bed the last legacies of MS-DOS in mainstream computing. MS-DOS, dubbed the “quick and dirty” operating system during the early days of its development, is likely the piece of software in computing history with the most lopsided contrast between the total number of hours put into its development and the total number of hours it spent in use, on millions and millions of computers all over the world. The 640 K barrier, the demon all those users spent so much time and energy battling for so many years, was just one of the more prominent consequences of corporate America’s adoption of such a blunt instrument as MS-DOS as its standard. Today we’ll unpack the problem that was memory management under MS-DOS, and we’ll also examine the problem’s multifarious solutions, all of them to one degree or another ugly and imperfect.


 

The original IBM PC was built around an Intel 8088 microprocessor, a cost-reduced and somewhat crippled version of an earlier chip called the 8086. (IBM’s decision to use the 8088 instead of the 8086 would have huge importance for the expansion buses of this and future machines, but the differences between the two chips aren’t important for our purposes today.) Despite functioning as a 16-bit chip in most ways, the 8088 had a 20-bit address space, meaning it could address a maximum of 1 MB of memory. Let’s consider why this limitation should exist.

Memory, whether in your brain or in your computer, is of no use to you if you can’t keep track of where you’ve put things so that you can retrieve them again later. A computer’s memory is therefore indexed by bytes, with every single byte having its own unique address. These addresses, numbered from 0 to the upper limit of the processor’s address space, allow the computer to keep track of what is stored where. A 20-bit address can range from 0 to 1,048,575, giving 1,048,576 distinct addresses, or exactly 1 MB. This, then, is the maximum amount of memory which the 8088, with its 20-bit address bus, can handle. Such a limitation hardly felt like a deal breaker to the engineers who created the IBM PC. Indeed, it’s difficult to overemphasize what a huge figure 1 MB really was when they released the machine in 1981, in which year the top-of-the-line Apple II had just 48 K of memory and plenty of other competing machines shipped with no more than 16 K.
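For the arithmetic-inclined, those figures are easy to verify. The following is just a calculator sketch in standard C, nothing specific to the IBM PC’s hardware:

    #include <stdio.h>

    int main(void)
    {
        unsigned long addresses = 1UL << 20;   /* 2^20 distinct addresses on a 20-bit bus */
        printf("2^20 = %lu bytes = %lu K = 1 MB\n", addresses, addresses / 1024);
        printf("highest addressable byte = 0x%05lX\n", addresses - 1);   /* prints 0xFFFFF */
        return 0;
    }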

A processor needs to address other sorts of memory besides the pool of general-purpose RAM which is available for running applications. There’s also ROM memory — read-only memory, burned inviolably into chips — that contains essential low-level code needed for the computer to boot itself up, along with, in the case of the original IBM PC, an always-available implementation of the BASIC programming language. (The rarely used BASIC in ROM would be phased out of subsequent models.) And some areas of RAM as well are set aside from the general pool for special purposes, like the fully 128 K of addresses given to video cards to keep track of the onscreen display in the original IBM PC. All of these special types of memory must be accessed by the CPU, must be given their own unique addresses to facilitate that, and must thus be subtracted from the address space available to the general pool.

IBM’s engineers were quite generous in drawing the boundary between their general memory pool and the area of addresses allocated to special purposes. Focused on expandability and longevity as they were, they reserved big chunks of “special” memory for purposes that hadn’t even been imagined yet. In all, they reserved the upper three-eighths of the available addresses for specialized purposes actual or potential, leaving the lower five-eighths — 640 K — to the general pool. In time, this first 640 K of memory would become known as “conventional memory,” the remaining 384 K — some of which would be ROM rather than RAM — as “high memory.” The official memory map which IBM published upon the debut of the IBM PC looked like this:
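In simplified form (the exact carve-up of the reserved area varied somewhat between machines and over the years), the layout ran roughly as follows:

    0x00000-0x9FFFF   640 K   conventional memory (general-purpose RAM)
    0xA0000-0xBFFFF   128 K   reserved for video-display memory
    0xC0000-0xEFFFF   192 K   reserved for expansion-card ROM and future uses
    0xF0000-0xFFFFF    64 K   system ROM (BIOS and cassette BASIC)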

It’s important to understand when looking at a memory map like this one that the existence of a logical address therein doesn’t necessarily mean that any physical memory is connected to that address in any given real machine. The first IBM PC, for instance, could be purchased with as little as 16 K of conventional memory installed, and even a top-of-the-line machine had just 256 K, leaving most of the conventional-memory space vacant. Similarly, early video cards used just 32 K or 64 K of the 128 K of address space offered to them in high memory. The 640 K barrier was thus only a theoretical limitation early on, one few early users or programmers ever even noticed.

That blissful state of affairs, however, wouldn’t last very long. As IBM’s creations — joined, soon enough, by lots of clones — became the standard for American business, more and more advanced applications appeared, craving more and more memory alongside more and more processing power. Already by 1984 the 640 K barrier had gone from a theoretical to a very real limitation, and customers were beginning to demand that IBM do something about it. In response, IBM that year released the PC/AT, built around Intel’s new 80286 microprocessor, which boasted a 24-bit address space good for 16 MB of memory. To unlock all that potential extra memory, IBM made the commonsense decision to extend the memory map above the specialized high-memory area that ended at 1 MB, making all addresses beyond 1 MB a single pool of “extended memory” available for general use.

Problem solved, right? Well, no, not really — else this would be a much shorter article. Due more to software than hardware, all of this potential extended memory proved not to be of much use for the vast majority of people who bought PC/ATs. To understand why this should be, we need to examine the deadly embrace between the new processor and the old operating system people were still running on it.

The 80286 was designed to be much more than just a faster version of the old 8086/8088. Developing the chip before IBM PCs running MS-DOS had come to dominate business computing, Intel hadn’t allowed the need to stay compatible with that configuration to keep them from designing a next-generation chip that would help to take computing to where they saw it as wanting to go. Intel believed that microcomputers were at the stage at which the big institutional machines had been a couple of decades earlier, just about ready to break free of what computer scientist Brian L. Stuart calls the “Triangle of Ones”: one user running one program at a time on one machine. At the very least, Intel believed, the second leg of the Triangle must soon fall; everyone recognized that multitasking — running several programs at a time and switching freely between them — was a much more efficient way to do complex work than laboriously shutting down and starting up application after application. But unfortunately for MS-DOS, the addition of multitasking complicates the life of an operating system to an absolutely staggering degree.

Operating systems are of course complex subjects worthy of years or a lifetime of study. We might, however, collapse their complexities down to a few fundamental functions: to provide an interface for the user to work with the computer and manage her programs and files; to manage the various tasks running on the computer and allocate resources among them; and to act as a buffer or interface between applications and the underlying hardware of the computer. That, anyway, is what we expect at a minimum of our operating systems today. But for a computer ensconced within the Triangle of Ones, the second and third functions were largely moot: with only one program allowed to run at a time, resource-management concerns were nonexistent, and, without the need for a program to be concerned about clashing with other programs running at the same time, bare-metal programming — manipulating the hardware directly, without passing requests through any intervening layer of operating-system calls — was often considered not only acceptable but the expected approach. In this spirit, MS-DOS provided just 27 function calls to programmers, the vast majority of them dealing only with disk and file management. (Compare that, my fellow programmers, with the modern Windows or OS X APIs!) For everything else, banging on the bare metal was fine.
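To make the contrast concrete, here is a sketch of the sort of bare-metal programming that was perfectly normal under MS-DOS, written in the style of a period Borland/Turbo C compiler (far pointers and the MK_FP macro from its dos.h): it writes a character straight into the color text screen’s memory rather than asking the operating system to do it.

    #include <dos.h>

    int main(void)
    {
        /* Color text-mode video memory lives at segment 0xB800; each screen
           cell is a character byte followed by an attribute byte. */
        unsigned char far *screen = (unsigned char far *)MK_FP(0xB800, 0);

        screen[0] = 'A';     /* character in the top-left cell         */
        screen[1] = 0x1E;    /* attribute: yellow text on a blue field */
        return 0;
    }

A program like this worked only because nothing else could possibly be using the screen at the same time.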

We can’t even begin here to address all of the complications that are introduced when we add multitasking into the equation, asking the operating system in the process to fully embrace all three of the core functions listed above. Memory management alone, the one aspect we will look deeper into today, becomes complicated enough. A program which is sharing a machine with other programs can no longer have free run of the memory map, placing whatever it wants to wherever it wants to; to do so risks overwriting the code or data of another program running on the system. Instead the operating system must demand that individual programs formally request the memory they’d like to use, and then must come up with a way to keep a program, whether due to bugs or malice, from running roughshod over areas of memory that it hasn’t been granted.

Or perhaps not. The Commodore Amiga, the platform which pioneered multitasking on personal computers in 1985, didn’t so much solve the latter part of this problem as punt it away. An application program is expected to request from the Amiga’s operating system any memory that it requires. The operating system then returns a pointer to a block of memory of the requested size, and trusts the application not to write to memory outside of these bounds. Yet nothing besides the programmer’s skill and good nature absolutely prevents such unauthorized memory access from happening. Every application on the Amiga, in other words, can write to any address in the machine’s memory, whether that address be properly allocated to it or not. Screen memory, free memory, another program’s data, another program’s code — all are fair game to the errant program. Such unauthorized memory access will almost always eventually result in a total system crash. A non-malicious programmer who wants her program to be a good citizen would of course never intentionally write to memory she hasn’t properly requested, but bugs of this nature are notoriously easy to create and notoriously hard to track down, and on the Amiga a single instance of one can bring down not only the offending program but the entire operating system. With all due respect to the Amiga’s importance as the first multitasking personal computer, this is obviously not the ideal way to implement it.
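A minimal sketch of that contract in AmigaOS C, assuming the standard exec.library calls AllocMem() and FreeMem() and the header names used by the later Amiga developer kits; note that nothing in it prevents the program from writing past the block it was given.

    #include <exec/memory.h>
    #include <proto/exec.h>

    int main(void)
    {
        /* Ask the operating system for 1024 bytes of any free memory. */
        APTR buffer = AllocMem(1024, MEMF_ANY);
        if (buffer == NULL)
            return 20;               /* allocation failed */

        /* Nothing but convention stops this program from scribbling
           outside its 1024 bytes; the OS will not catch it if it does. */

        FreeMem(buffer, 1024);       /* the caller must even remember the size */
        return 0;
    }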

A far more sustainable approach is to take the extra step of tracking and protecting the memory that has been allocated to each program. Memory protection is usually accomplished using what’s known as virtual memory: when a program requests memory, it’s returned not a true address within the system’s memory pool but rather a virtual address that’s translated back into the real address to which it corresponds every time the program accesses its data. Each program is thus effectively sandboxed from everything else, allowed to read from and write to only its own data. Only the lowest levels of the operating system have global access to the memory pool as a whole.
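The mechanics can be illustrated with a toy sketch. It mimics no real memory-management unit’s format, least of all the 80286’s segment-based scheme, but it captures the core idea: translating each program’s virtual addresses and refusing access to memory the program hasn’t been granted.

    #include <stdio.h>

    #define PAGE_SIZE 4096u                 /* hypothetical page size */
    #define NUM_PAGES 8

    /* One program's page table: virtual page -> physical frame, -1 = not granted. */
    static const int page_table[NUM_PAGES] = { 5, 2, -1, 7, -1, -1, 0, 3 };

    /* Translate a virtual address into a physical one, or return -1
       to stand in for a protection fault. */
    static long translate(unsigned long vaddr)
    {
        unsigned long page   = vaddr / PAGE_SIZE;
        unsigned long offset = vaddr % PAGE_SIZE;

        if (page >= NUM_PAGES || page_table[page] < 0)
            return -1;                      /* access outside the program's allocation */
        return (long)page_table[page] * PAGE_SIZE + offset;
    }

    int main(void)
    {
        printf("virtual 0x1804 -> physical 0x%lx\n", (unsigned long)translate(0x1804));
        printf("virtual 0x2004 -> %ld (fault)\n", translate(0x2004));
        return 0;
    }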

Implementing such memory protection in software alone, however, would have been an untenable drain on the limited processing power available in the 1980s — a fact which does much to explain its absence from the Amiga. Intel therefore decided to give software a leg up via hardware. They built into the 80286 a memory-management unit that could automatically translate from virtual to real memory addresses and vice versa, making this constantly ongoing process fairly transparent even to the operating system.

Nevertheless, the operating system must know about this capability, must in fact be written very differently if it’s to run on a CPU with memory protection built into its circuitry. Intel recognized that it would take time for such operating systems to be created for the new chip, and recognized that compatibility with the earlier 8086/8088 chips would be a very good thing to have in the meantime. They therefore built two possible operating modes into the 80286. In “protected mode” — the mode they hoped would eventually come to be used almost universally — the chip’s full potential would be realized, including memory protection and the ability to address up to 16 MB of memory. In “real mode,” the 80286 would function essentially like a turbocharged 8086/8088, with no memory-protection capabilities and with the old limitation on addressable memory of 1 MB still in place. Assuming that in the early days at least the new chip would need to run on operating systems with no knowledge of its full capabilities, Intel made the 80286 default to real mode on startup. An operating system which did know about the 80286 and wanted to bring out its full potential could switch it to protected mode at boot-up and be off to the races.

It’s at the intersection between the 80286 and the operating system that Intel’s grand plans for the future of their new chip went awry. An overwhelming percentage of the early 80286s were used in IBM PC/ATs and clones, and an overwhelming percentage of those machines were running MS-DOS. Microsoft’s erstwhile “quick and dirty” operating system knew nothing of the 80286’s full capabilities. Worse, trying to give it knowledge of those capabilities would entail a complete rewrite which would break compatibility with all existing MS-DOS software. Yet the whole reason MS-DOS was popular in the first place — it certainly wasn’t because of a generous feature set, a friendly interface, or any aesthetic appeal — was that very same huge base of business software. Getting users to make the leap to some hypothetical new operating system in the absence of software to run on it would be as difficult as getting developers to write programs for an operating system with no users. It was a chicken-or-the-egg situation, and neither chicken nor egg was about to stick its neck out anytime soon.

IBM was soon shipping thousands upon thousands of PC/ATs every month, and the clone makers were soon shipping even more 80286-based machines of their own. Yet at least 95 percent of those machines were idling along at only a fraction of their potential, thanks to the already creakily archaic MS-DOS. For all these users, the old 640 K barrier remained as high as ever. They could stuff their machines full of extended memory if they liked, but they still couldn’t access it. And of course the multitasking that the 80286 was supposed to have enabled remained as foreign a concept to MS-DOS as a GPS unit to a Model T. The only solution IBM offered those who complained about the situation was to run another operating system. And indeed, there were a number of alternatives to MS-DOS available for the PC/AT and other 80286-based machines, including several variants of the old institutional-computing favorite Unix — one of them even from Microsoft — and new creations like Digital Research’s Concurrent DOS, which struggled with mixed results to wedge in some degree of MS-DOS compatibility. Still, the only surefire way to take full advantage of MS-DOS’s huge software base was to run the real — in more ways than one now! — MS-DOS, and this is what the vast majority of people with 80286-equipped machines wound up doing.

Meanwhile the very people making the software which kept MS-DOS the only viable choice for most users were feeling the pinch of being confined to 640 K more painfully almost by the month. Finally Lotus Corporation —  makers of the Lotus 1-2-3 spreadsheet package that ruled corporate America, the greatest single business-software success story of their era — decided to use their clout to do something about it. They convinced Intel to join them in devising a scheme for breaking the 640 K barrier without abandoning MS-DOS. What they came up with was one mother of an ugly kludge — a description the scheme has in common with virtually all efforts to break through the 640 K barrier.

Looking through the sparsely populated high-memory area which the designers of the original IBM PC had so generously carved out, Lotus and Intel realized it should be possible on almost any extant machine to identify a contiguous 64 K chunk of those addresses which wasn't being used for anything. This chunk, they decided, would be the gateway to potentially many more megabytes installed elsewhere in the machine. Using a combination of software and hardware, they implemented what's known as a bank-switching scheme. The 64 K chunk of high-memory addresses was divided into four segments of 16 K, each of which could serve as a lens focused on a 16 K page of additional memory above and beyond 1 MB. When the processor accessed the addresses in high memory, the data it would actually access would be the data in whatever pages of the additional memory the lenses were currently pointing to. The four lenses could be moved around at will, giving access, albeit in a roundabout way, to however much extra memory the user had installed. The additional memory unlocked by the scheme was dubbed “expanded memory.” The name's unfortunate similarity to “extended memory” would cause much confusion over the years to come; from here on, we'll call it by its common acronym of “EMS.”
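
If the business of sliding lenses sounds abstract, a toy model may help. The sketch below, in C purely for illustration, shows the essential trick: a small, fixed window of addresses can reach a much larger pool of memory just by changing which page each part of the window points at, with no data being copied. None of the names here correspond to anything in the real EMS software.

```c
/* A toy model of EMS-style bank switching. Nothing here corresponds to the
   real EMS driver interface; it only illustrates the page-frame idea. */
#include <stdio.h>
#include <string.h>

#define PAGE_SIZE   (16 * 1024)              /* each lens covers 16 K           */
#define FRAME_PAGES 4                        /* four lenses make one 64 K frame */
#define TOTAL_PAGES 128                      /* pretend 2 MB of expanded memory */

static char expanded[TOTAL_PAGES][PAGE_SIZE];      /* memory out on the EMS board */
static int  mapping[FRAME_PAGES] = { 0, 1, 2, 3 }; /* page each lens points at    */

/* Every access through the page frame lands in whatever page the chosen lens
   currently points at; moving a lens is cheap because no data is copied. */
static char *frame_address(int lens, int offset)
{
    return &expanded[mapping[lens]][offset];
}

int main(void)
{
    strcpy(frame_address(0, 0), "written through lens 0 into page 0");

    mapping[0] = 77;                         /* swing lens 0 toward page 77      */
    strcpy(frame_address(0, 0), "same addresses, entirely different page");

    mapping[0] = 0;                          /* swing it back again              */
    printf("%s\n", frame_address(0, 0));     /* the original data is still there */
    return 0;
}
```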

All those gobs of extra memory wouldn’t quite come for free: applications would have to be altered to check for the existence of EMS memory and make use of it, and there would remain a distinct difference between conventional memory and EMS memory with which programmers would always have to reckon. Likewise, the overhead of constantly moving those little lenses around made EMS memory considerably slower to access than conventional memory. On the brighter side, though, EMS worked under MS-DOS with only the addition of a single device driver during startup. And, since the hardware mechanism for moving the lenses around was completely external to the CPU, it would even work on machines that weren’t equipped with the new 80286.
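
For the terminally curious, here is roughly what asking the EMS driver for memory looked like from a real-mode program of the period. Treat it as a hedged sketch rather than gospel: it assumes a DOS-era compiler such as Borland Turbo C, which supplied int86(), MK_FP(), and far pointers; it uses the service numbers defined by the Lotus/Intel/Microsoft specification; and it omits the error checking and the customary test for the “EMMXXXX0” driver that any real program would perform.

```c
/* A hedged sketch of calling the EMS driver through interrupt 67h, assuming
   a DOS-era compiler such as Borland Turbo C. Error handling and the usual
   check that an EMS driver is actually present are omitted for brevity. */
#include <dos.h>

int main(void)
{
    union REGS r;
    unsigned frame_seg, handle;
    char far *window;

    r.h.ah = 0x41;                /* function 41h: where is the 64 K page frame?     */
    int86(0x67, &r, &r);
    frame_seg = r.x.bx;           /* segment address of the page frame               */

    r.h.ah = 0x43;                /* function 43h: allocate expanded-memory pages    */
    r.x.bx = 4;                   /* ask for four 16 K pages (64 K in all)           */
    int86(0x67, &r, &r);
    handle = r.x.dx;              /* the handle that names our allocation            */

    r.h.ah = 0x44;                /* function 44h: map a logical page into the frame */
    r.h.al = 0;                   /* physical page 0 within the frame                */
    r.x.bx = 0;                   /* logical page 0 of our allocation                */
    r.x.dx = handle;
    int86(0x67, &r, &r);

    window = (char far *)MK_FP(frame_seg, 0);
    window[0] = 42;               /* the mapped 16 K is now ordinary memory          */

    r.h.ah = 0x45;                /* function 45h: give the pages back                */
    r.x.dx = handle;
    int86(0x67, &r, &r);
    return 0;
}
```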

This diagram shows the different types of memory available on PCs of the mid-1980s. In blue, we see the original 1 MB memory map of the IBM PC. In green, we see a machine equipped with additional extended memory. And in orange we see a machine equipped with additional expanded memory.

Shortly before the scheme made its official debut at a COMDEX trade show in May of 1985, Lotus and Intel convinced a crucial third partner to come aboard: Microsoft. “It’s garbage! It’s a kludge!” said Bill Gates. “But we’re going to do it.” With the combined weight of Lotus, Intel, and Microsoft behind it, EMS took hold as the most practical way of breaking the 640 K barrier. Imperfect and kludgy though it was, software developers hurried to add support for EMS memory to whatever programs of theirs could practically make use of it, while hardware manufacturers rushed EMS memory boards onto the market. EMS may have been ugly, but it was here today and it worked.

At the same time that EMS was taking off, however, extended memory wasn't going away. Some hardware makers — most notably IBM themselves — didn't want any part of EMS's ugliness. Software makers therefore continued to probe at the limits of machines equipped with extended memory, still looking for a way to get at it from within the confines of MS-DOS. What if they momentarily switched the 80286 into protected mode, just for as long as they needed to manipulate data in extended memory, then went back into real mode? It seemed like a reasonable idea — except that Intel, never anticipating that anyone would want to switch modes on the fly like this, had neglected to provide a way to switch an 80286 in protected mode back into real mode. So, proponents of extended memory had to come up with a kludge even uglier than the one that allowed EMS memory to function. They could force the 80286 back into real mode, they realized, by resetting it entirely, just as if the user had rebooted her computer. The 80286 would go through its self-check again — a process that admittedly absorbed precious milliseconds — and then pick back up where it left off. It was, as Microsoft's Gordon Letwin memorably put it, like “turning off the car to change gears.” It was staggeringly kludgy, it was horribly inefficient, but it worked in its fashion. Given the inefficiencies involved, the scheme was mostly used to implement virtual disks stored in extended memory, which weren't accessed nearly as constantly as an application's working data would be.

In 1986, the 32-bit 80386, Intel’s latest and greatest chip, made its public bow at the heart of the Compaq Deskpro 386 rather than an IBM machine, a landmark moment signaling the slow but steady shift of business computing’s power center from IBM to Microsoft and the clone makers using their operating system. While working on the new chip, Intel had had time to see how the 80286 was actually being used in the wild, and had faced the reality that MS-DOS was likely destined to be cobbled onto for years to come rather than replaced in its entirety with something better. They therefore made a simple but vitally important change to the 80386 amidst its more obvious improvements. In addition to being able to address an inconceivable total of 4 GB of memory in protected mode thanks to its 32-bit address space, the 80386 could be switched between protected mode and real mode on the fly if one desired, without needing to be constantly reset.

In freeing programmers from that massive inefficiency, the 80386 cracked open the door that much further to making practical use of extended memory in MS-DOS. In 1988, the old EMS consortium of Lotus, Intel, and Microsoft came together once again, this time with the addition to their ranks of the clone manufacturer AST; the absence of IBM is, once again, telling. Together they codified a standard approach to extended memory on 80386 and later processors, which corresponded essentially to the scheme I’ve already described in the context of the 80286, but with a simple command to the 80386 to switch back to real mode replacing the resets. They called it the eXtended Memory Specification; memory accessed in this way soon became known universally as “XMS” memory. Under XMS as under EMS, a new device driver would be loaded into MS-DOS. Ordinary real-mode programs could then call this driver to access extended memory; the driver would do the needful switching to protected mode, copy blocks of data from extended memory into conventional memory or vice versa, then switch the processor back to real mode when it was time to return control to the program. It was still inelegant, still a little inefficient, and still didn’t use the capabilities of Intel’s latest processors in anything like the way Intel’s engineers had intended them to be used; true multitasking still remained a pipe dream somewhere off in a shadowy future. Owners of sexier machines like the Macintosh and Amiga, in other words, still had plenty of reason to mock and scoff. In most circumstances, working with XMS memory was actually slower than working with EMS memory. The primary advantage of XMS was that it let programs work with much bigger chunks of non-conventional memory at one time than the four 16 K chunks that EMS allowed. Whether any given program chose EMS or XMS came to depend on which set of advantages and disadvantages best suited its purpose.
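
In practice that meant a copy-in, copy-out discipline for the programmer: stage a chunk of data in a conventional-memory buffer, work on it there, then ask the driver to push it back out. The sketch below models the pattern in plain C. The xms_move() function and the extended array are invented stand-ins for the real driver call and the real memory above 1 MB, but they capture why XMS access was slower (every touch of the data means a copy through the driver) while permitting far bigger blocks than EMS's four 16 K windows.

```c
/* A sketch of the copy-in/copy-out pattern that XMS imposed on programs.
   xms_move() and the "extended" array are stand-ins invented for illustration;
   the real XMS driver was reached through a far-call entry point rather than
   a C function, but the shape of the work was the same. */
#include <stddef.h>
#include <string.h>

#define EXTENDED_SIZE (1024u * 1024u)   /* pretend 1 MB of extended memory      */
#define BUFFER_SIZE   (32u * 1024u)     /* a staging buffer in conventional RAM */

static char extended[EXTENDED_SIZE];    /* stand-in for memory beyond 1 MB      */
static char buffer[BUFFER_SIZE];        /* the only memory the program can
                                           address directly in real mode        */

/* Stand-in for the driver's block-move service: behind the scenes the real
   driver switched to protected mode, copied the block, and switched back.     */
static void xms_move(char *dst, const char *src, size_t len)
{
    memcpy(dst, src, len);
}

/* Work through a big data set that lives in extended memory by staging it
   through the conventional-memory buffer one chunk at a time.                 */
static void process_all(void)
{
    size_t offset;
    for (offset = 0; offset < EXTENDED_SIZE; offset += BUFFER_SIZE) {
        xms_move(buffer, &extended[offset], BUFFER_SIZE);    /* copy in        */
        /* ... operate on buffer with ordinary real-mode code ...              */
        xms_move(&extended[offset], buffer, BUFFER_SIZE);    /* copy back out  */
    }
}

int main(void)
{
    process_all();
    return 0;
}
```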

The arrival of XMS along with the ongoing use of EMS memory meant that MS-DOS now had two competing memory-management solutions. Buyers now had to figure out not only whether they had enough extra memory to run a program but whether they had the right kind of extra memory. Ever accommodating, hardware manufacturers began shipping memory boards that could be configured as either EMS or XMS memory — whatever the application you were running at the moment happened to require.

The next stage in the slow crawl toward parity with other computing platforms in the realm of memory management would be the development of so-called “DOS extenders,” software to allow applications themselves to run in protected mode, thus giving them direct access to extended memory without having to pass their requests through an inefficient device driver. An application built using a DOS extender would only need to switch the processor to real mode when it needed to communicate with the operating system. The development of DOS extenders was driven by Microsoft’s efforts to turn Windows, which like seemingly everything else in business computing ran on top of MS-DOS, into a viable alternative to the command line and a viable challenger to the Macintosh. That story is thus best reserved for a future article, when we look more closely at Windows itself. As it is, the story that I’ve told so far today moves us nicely into the era of computer-gaming history we’ve reached on the blog in general.

In said era, the MS-DOS machines that had heretofore been reserved for business applications were coming into homes, where they were often used to play a new generation of games taking advantage of the VGA graphics, sound cards, and mice sported by the latest systems. Less positively, all of the people wanting to play these new games had to deal with the ramifications of a 640 K barrier that could still be skirted only imperfectly. As we've seen, both EMS and XMS imposed to one degree or another a performance penalty when accessing non-conventional memory. What with games being the most performance-sensitive applications around, that first 640 K of lightning-fast conventional memory was more precious to them than to any other kind of software.

In the first couple of years of MS-DOS’s gaming dominance, developers dealt with all of the issues that came attached to using memory beyond 640 K by the simple expedient of not using any memory beyond 640 K. But that solution was compatible neither with developers’ growing ambitions for their games nor with the gaming public’s growing expectations of them.

The first harbinger of what was to come was Origin Systems's September 1990 release Wing Commander, which in its day was renowned — and more than a little feared — for pushing the contemporary state of the art in hardware to its limits. Even Wing Commander didn't go so far as to absolutely require memory beyond 640 K, but it did use it to make the player's audiovisual experience snazzier if it was present. Setting a precedent future games would largely follow, it was quite inflexible in its approach, demanding EMS — as opposed to XMS — memory. In the future, gamers would have to become all too familiar with the differences between the two standards, and how to configure their machines to use one or the other. Setting another precedent, Wing Commander's “installation guide” included a section on “memory usage” that was required reading in order to get things working properly. In the future, such sections would only grow in length and complexity, and would need to be pored over by long-suffering gamers with far more concentrated attention than anything in the manual that dealt with how to actually play the games they had purchased.

In Accolade’s embarrassing Leisure Suit Larry knockoff Les Manley in: Lost in LA, the title character explains EMS and XMS memory to some nubile companions. The ironic thing was that anyone who wished to play the latest games on an MS-DOS machine really did need to know this stuff, or at least have a friend who did.

Thus began the period of almost a decade, remembered with chagrin but also often with an odd sort of nostalgia by old-timers today, in which gamers spent hours monkeying about with MS-DOS's “config.sys” and “autoexec.bat” files and swapping in and out various third-party utilities in the hope of squeezing out those last few kilobytes of conventional memory that Game X needed to run. The techniques they came to employ were legion.

In the process of developing Windows, Microsoft had discovered that the kernel of MS-DOS itself, a fairly tiny program thanks to its sheer age, could be stashed into the first 64 K of memory beyond 1 MB and still accessed like conventional memory on an 80286 or later processor in real mode thanks to what was essentially an undocumented technical glitch in the design of those processors. Gamers thus learned to include the line “DOS=HIGH” in their configuration files, freeing up a precious block of conventional memory. Likewise, there was enough unused space scattered around in the 384 K of high memory on most machines to stash many or all of MS-DOS’s device drivers there instead of in conventional memory. Thus “DOS=HIGH” soon became “DOS=HIGH,UMB,” the second parameter telling the computer to make use of these so-called “upper-memory blocks” and thereby save that many kilobytes more.
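
To make this concrete: by the time MS-DOS 5.0 arrived in 1991 with HIMEM.SYS and EMM386.EXE in the box, a gamer's config.sys might contain something like the following. The file paths and the choice of drivers to load high are purely illustrative.

```
REM HIMEM.SYS provides XMS and lets DOS itself move up into the high memory area
DEVICE=C:\DOS\HIMEM.SYS
REM On a 386 or later, EMM386 can simulate EMS out of extended memory and open
REM up the unused upper-memory blocks between 640 K and 1 MB
DEVICE=C:\DOS\EMM386.EXE RAM
DOS=HIGH,UMB
REM Load the mouse driver into an upper-memory block rather than conventional memory
DEVICEHIGH=C:\MOUSE\MOUSE.SYS
FILES=30
BUFFERS=20
```

A matching LOADHIGH (or just LH) in autoexec.bat did the same favor for memory-resident programs.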

These were the most basic techniques, the starting points. Suffice to say that things got a lot more complicated from there, turning into a baffling tangle of tweaks, some saving mere bytes rather than kilobytes of conventional memory, but all of them important if one was to hope to run games that by 1993 would be demanding 604 K of 640 K for their own use. That owners of machines which by that point typically contained memories in the multi-megabytes should have to squabble with the operating system over mere handfuls of bytes was made no less vexing by being so comically absurd. And every new game seemed to up the ante, seemed to demand that much more conventional memory. Those with a sunnier disposition or a more technical bent of mind took the struggle to get each successive purchase running as the game before the game got started, as it were. Everyone else gnashed their teeth and wondered for the umpteenth time if they might not have been better off buying a console where games Just Worked. The only thing that made it all worthwhile was the mixture of relief, pride, and satisfaction that ensued when you finally got it all put together just right and the title screen came up and the intro music sprang to life — if, that is, you’d managed to configure your sound card properly in the midst of all your other travails. Such was the life of the MS-DOS gamer.

Before leaving the issue of the 640 K barrier behind in exactly the way that all those afflicted by it for so many years were so conspicuously unable to do, we have to address Bill Gates’s famous claim, allegedly made at a trade show in 1981, that “640 K ought to be enough for anybody.” The quote has been bandied about for years as computer-industry legend, seeming to confirm as it does the stereotype of Bill Gates as the unimaginative dirty trickster of his industry, as opposed to Steve Jobs the guileless visionary (the truth is, needless to say, far more complicated). Sadly for the stereotypers, however, the story of the quote is similar to all too many legends in the sense that it almost certainly never happened. Gates himself, for one, vehemently denies ever having said any such thing. Fred Shapiro, for another, editor of The Yale Book of Quotations, conducted an exhaustive search for a reputable source for the quote in 2008, going so far as to issue a public plea in The New York Times for anyone possessing knowledge of such a source to contact him. More than a hundred people did so, but none of them could offer up the smoking gun Shapiro sought, and he was left more certain than ever that the comment was “apocryphal.” So, there you have it. Blame Bill Gates all you want for the creaky operating system that was the real root cause of all of the difficulties I’ve spent this article detailing, but don’t ever imagine he was stupid enough to say that. “No one involved in computers would ever say that a certain amount of memory is enough for all time,” said Gates in 2008. Anyone doubting the wisdom of that assertion need only glance at the history of the IBM PC.

(Sources: the books Upgrading and Repairing PCs, 3rd edition by Scott Mueller and Principles of Operating Systems by Brian L. Stuart; Computer Gaming World of June 1993; Byte of January 1982, November 1984, and March 1992; Byte‘s IBM PC special issues of Fall 1985 and Fall 1986; PC Magazine of May 14 1985, January 14 1986, May 30 1989, June 13 1989, and June 27 1989; the episode of the Computer Chronicles television show entitled “High Memory Management”; the online article “The ‘640K’ quote won’t go away — but did Gates really say it?” on Computerworld.)


  1. Yes, that is quite possibly the nerdiest thing I’ve ever written. 

 
 
