

Seven Cities of Gold

Shortly after completing M.U.L.E., Ozark Softscape held a week-long brainstorming retreat with their producer from Electronic Arts, Joe Ybarra, inside the Little Rock house that served as their offices. Dan Bunten[1] arrived with a pretty good idea of the game he wanted to make next already in hand. He had recently become fascinated with a monster board game released by those famed purveyors of monster board games, Avalon Hill. Civilization places each player in charge of a single tribe at the dawn of history, about 8000 BCE, and lets her guide it down through the millennia as far as the dawning of the glory that was Rome, about 230 BCE. Warfare plays a role, but Civilization isn’t really a traditional war game; equally important are exploration, culture, economics, and technological advancement. (Of course, all of these factors are inevitably interlinked.) With a scope like that, Civilization is just about as complicated and time-consuming as board gaming gets. A full game generally consumes about eight hours even once you’ve learned how to play.

Dan’s colleagues dutifully set aside a day to play the game with him, to see what he was on about. They emerged blinking and befuddled, and not at all sure about the idea of a computerized version. How, after all, were they going to pack all of that complexity and grandeur into their little 48 K Atari 800s? Ybarra was likely motivated by another practical consideration: Civilization belonged to Avalon Hill, who were trying (with somewhat mixed results) to make a go of it in computer games, the place to which a dismaying quantity of their old customers were migrating. They weren’t likely to make it easy or cheap for a competitor to release a computer version of one of their big board-game titles. Better, Dan’s colleagues all argued, to make a more modest original game with some of the spirit of Civilization. This sort of conversation wasn’t that new to Ozark. Dan’s three partners, as well as Ybarra, freely acknowledged him to be a creative genius far beyond their own lights. Geniuses, however, occasionally need to be gently steered down practical roads, lest their Big Ideas overwhelm their sense of proportion; this they saw as part of their own modest contributions to Ozark.

Also in Ozark’s collection of board games was another Avalon Hill effort called Conquistador, a game of “The Age of Exploration: 1495-1600.” In it each player takes control of one of the great European powers as they explore, conquer, and eventually colonize the New World. Late in the game, wars among the players can develop as virgin territory gets harder to come by. It’s only slightly less daunting than Civilization; a complete game usually lasts five hours or so. Still, it felt like a concept that could be more readily pared down to something manageable on the Atari 800, and like one that could inspire a computer game original enough — particularly with a designer of Dan’s creativity on the project — that licensing wouldn’t become an issue. It also dealt with a subject that Dan and his brother Bill had found genuinely fascinating from an early age, when an uncle had given them their first book about the Conquistadors. And so Ozark’s next project was decided by the time the week was out.

Dan, still pining for Civilization, was initially not hugely enthusiastic. After he and Bill collected a tall stack of history books and dove in, however, he started to come around. Once again the Big Ideas started to come thick and fast. He envisioned a game played in three stages, like Conquistador. In the first, you would be exploring and dealing with the natives you encountered, which basically meant either trading and trying to establish peaceful alliances or doing a full-on Cortés and charging through their villages with swords swinging. The second stage would have you founding colonies, establishing permanent settlements and institutions in the New World rather than rushing back to Europe with each new shipful of gold. Finally, these emerging and expanding nations would have to deal with one another. Diplomacy and the results of its breakdown, wars, would ensue. Diplomacy being pretty difficult terrain for a computer opponent to navigate, Dan envisioned a game that, like M.U.L.E., would emphasize multiplayer play while offering computerized opponents merely as practice and fillers for empty human seats around the virtual table.

It was all very ambitious. Inevitably, the heartless hand of practicality — largely in the usual form of his partners — started to pare away at it pretty quickly. Stages two and three were excised entirely; this would be a game of exploration and conquest only, not politics or consolidation. You would also be restricted to playing as a representative of Spain and to exploring the lands of South and Central America that were historically conquered by Spain. The name Seven Cities of Gold was chosen as a reflection of this new emphasis on Conquistadors seeking after the wealth of exotic legends to bring home to Spain. Most painfully, it was decided to make the game single-player only, as it was ambitious enough as it was and no one knew quite how to make multiplayer work anyway. Besides, as Ybarra and everyone else at EA — not to mention the failure of M.U.L.E. — were able to attest, for whatever reason there just wasn’t a big market for multiplayer-focused strategy games. With EA and Ybarra sticking so loyally with Ozark despite M.U.L.E.’s fate, there was perhaps a sense that Ozark owed them the best stab they could make at giving them a hit. At any rate, they owed it to themselves if they wanted to stay in this business; it was doubtful that even EA would continue to fund them if they delivered another under-performer like M.U.L.E. And so Seven Cities of Gold became the first single-player-only game Dan had ever made.

If that was a hard compromise to accept, there were consolations. The overarching Big Idea of Seven Cities, to be emphasized if necessary at the expense of everything else, would be Discovery. Dan had quickly realized when he first played Conquistador that it could not be a true recreation of the experience of exploring an uncharted continent for one very simple reason: the player came into it with a knowledge of the geography of the Americas, and with a bit of cursory outside research could know even the location of the capital of the Inca Empire. To remedy this, he proposed making a random map available in addition to the historical one. The historical map would be there out of obligation and for learning purposes; the real game would have you exploring New Worlds that were truly new to you.

The map generator turned into the most technically challenging element of the entire project. To be worthwhile, the random maps had to be as believable and logical as the historical. They couldn’t, in other words, just scatter landscape and natives about willy-nilly. From the manual:

There is a plate-tectonics model consulted for each creation. Mountain ranges are generated where the plates bump into each other. And secondary ranges (like the Allegheny Mountains on the historical map) may be created as well.

The program also consults a cultural dissemination model for its work. The influences of major civilizations are presumed to spread outward. Consequently, pueblo dwellers generally will be found between city-states and primitive agriculturalists. The model will allow for varying levels of this influence and can thus produce occasional continent arrangements which have no Incan-level civilizations. Alternately, it can make very rich and powerful arrangements, ones which, like 16th-century Japan, are highly civilized from coast to coast.

The random-map generator was assigned to Jim Rushing, the purest coding mind at Ozark, who spent some four months struggling with it. It was quite a task for the little Atari; each map took a solid ten minutes of processing to generate. Frustratingly, the end result from early versions was always the same, a continent shaped like a giant peanut. Finally, Rushing found a bug in his random-number generator. Ybarra still recalls vividly the day that Dan called him to tell him that they had generated a believable alternate New World: “The energy and excitement was terrific. Dan was both elated and burnt out, but you could ‘hear’ him grinning on the other side of the phone.” Ybarra now firmly believed they were “creating another masterpiece.”
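
Just to make the idea concrete, here is a tiny toy sketch in C of the sort of plate-based generation the manual describes. To be perfectly clear, this is my own illustration built on invented assumptions (a handful of Voronoi-style “plates,” with mountains rising where two land plates meet), not Ozark’s algorithm, which had to run in 48 K of Atari RAM and is lost to us. I’ll stick with C for all of the illustrative sketches in these articles, anachronistic though it is.

    /* Toy continent generator: assign each cell to the nearest of a few
       random "plates"; mountains appear along boundaries between two
       continental plates. Illustration only, not Ozark's code. */
    #include <stdio.h>
    #include <stdlib.h>

    #define W 60
    #define H 24
    #define PLATES 6

    int main(void) {
        int px[PLATES], py[PLATES], land[PLATES];
        int i, x, y;
        srand(1984);
        for (i = 0; i < PLATES; i++) {
            px[i] = rand() % W;
            py[i] = rand() % H;
            land[i] = rand() % 2;          /* continental or oceanic plate */
        }
        for (y = 0; y < H; y++) {
            for (x = 0; x < W; x++) {
                int best = 0, second = 0;
                long bd = 1000000L, sd = 1000000L;
                for (i = 0; i < PLATES; i++) {
                    long d = (long)(x - px[i]) * (x - px[i])
                           + (long)(y - py[i]) * (y - py[i]);
                    if (d < bd)      { sd = bd; second = best; bd = d; best = i; }
                    else if (d < sd) { sd = d; second = i; }
                }
                if (!land[best])
                    putchar('.');                       /* open sea */
                else if (land[second] && sd - bd < 16)
                    putchar('^');                       /* range at a plate boundary */
                else
                    putchar('#');                       /* plains */
            }
            putchar('\n');
        }
        return 0;
    }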

Seven Cities of Gold

Indeed, the game is nothing if not elegant. As in M.U.L.E., you are always embodied in the game by an onscreen avatar who visits shipyards, the pub, and the royal palace, and of course wanders through the New World as your surrogate. This can make it feel as much adventure game as strategy game; Seven Cities is anything but dry or abstract. The overland exploration which forms the heart of the game was inspired as much by Dan’s own experiences hiking the backwoods of Arkansas as by his readings in history. He had particularly vivid memories of getting himself lost on occasion, and the relief engendered by coming upon a road or other familiar landmark. Seven Cities can prompt similar feelings as you press ever deeper into this vast, unknown continent. The feeling of relief at finding your trusty ship sitting right there where you left it as you stumble out of the jungle with food supplies dwindling is almost indescribable.

Seven Cities of Gold

But maybe the supreme achievement in verisimilitude is your interaction with the native tribes, as brilliant an abstraction of real-world experience into interface as the auctions of M.U.L.E. When you enter a new village, the inhabitants gather around you, surrounding you disconcertingly. You can give them gifts or try to “amaze” them by acting the part of one of their gods, but it’s always an uncertain, tentative communication that could erupt into violence at any time. Many times you will find yourself all but forced into massacring a village — or being massacred by them — by an inadvertent push of the joystick or a single panicked shot by a member of your own army who goes out of your control. It’s a superb simulation of how these encounters between two utterly alien cultures without a single word in common between them must have actually felt to the participants, and a lesson in just why they so often ended in violence even when both parties entered them with the best of intentions. Incredibly for a game of this vintage, the natives remember and communicate with one another. Attack one tribe in a region and the others will be much more suspicious. Try to “amaze” a group of natives too many times and it starts to become old hat — and they start to become suspicious.

For the desperately idealistic Dan, who was always eager to instill “a meaningful message,” the moral dimension of these encounters and the impact they would make on the player’s psyche were key not only to his game but to his very sense of his own worthiness as a person:

“The people I admire are the people who went to jail instead of Vietnam, or who go to India to do some good, or who are really committed to the environment. Those are the people who are really admirable. What I’m doing seems less important. I want to make a significant impact in a person’s life.”

Yet Seven Cities doesn’t preach; it leaves you to your conscience. Unquestionably, violence in Seven Cities often does pay, just as in real history, and the problematic nature of this was not lost on Dan:

“Many of the Conquistadors treated the natives horribly. Theirs was an arrogant and prideful approach to a society that had its own history and roots. But to be historically accurate required that we had to include violence. I don’t like the idea of players hurting other things, but there’s no alternative or you’re forcing your own moral decisions on an audience that ought to have a choice themselves.

“Bill and I were real Indian sympathizers when we were growing up. We always sided with the Indians instead of the cowboys. It just seems like such a neat, romantic culture to us, so in tune with the earth. Then to write a game where at least part of the game is wiping out Indians — that’s problematic.”

Seven Cities of Gold

Seven Cities comments on your behavior toward the natives in only one way: if you get truly savage, the king will eventually tell you to please stop killing so many of them. But these words are never backed by any action, and the priority always remains to keep the gold flowing. The crown refuses to acknowledge that the gold and the killing that produces it are often inseparable. Such halfhearted carping is, as Dan noted, lifted straight from history, where it provided a way for those back in Spain to feel morally absolved while still benefiting from the killing and plundering of their countrymen.

Brilliant as its individual elements are, I do tend to have a problem appreciating Seven Cities of Gold as a holistic game. There is no competitor to play against, unless you count the natives you encounter — and the power imbalance between you and them is ultimately so great that, even if you choose the warlike approach, it’s hard to think of them in that role. Nor is there any in-game way to really lose or win. There isn’t even a definite end-point; you begin in 1495 and receive a rank based on your performance in 1540, but can continue to play on after that as long as you like. After a while the thrill of filling in the map and adding to my counts of missions and ships and rivers and other landmarks discovered starts to fade, and I start wishing for some goals, some sort of more specific direction. Dan himself delivered the following answer to the question, “How do I win?”

However you want. Seven Cities is a process-type game. You go along like real life. Life doesn’t have ends and wins and things like that. It has processes that you go through, and at times you stand back and say, “Hey, I’ve done pretty good so far.” Set your own goals really high and say, “That’s how I win.” Then go for it.

All of which is fine on the face of it, but Seven Cities just doesn’t feel to me like an intentional sandbox game in the mold of the (much later) SimCity. It rather feels like a design which is missing something it was originally intended to have, a possible legacy of the process of paring it down to a manageable project. On the other hand, I’m also willing to accept that this may just be down to a failure of imagination on my part, as the game received stellar reviews in its day and is still loved by many today.

Be that as it may, it seems to me that Seven Cities of Gold is still not so significant as a game in itself as for the frontiers it opened and the new things it dared to try. Its custom worlds, randomly generated at heart but meticulously terra-formed to be as believable as our own, became a staple of grand strategy games to come. Its consistent prioritizing of friendliness and elegant playability — the entire game is controlled, simply and intuitively, with a single joystick, and its manual consumes a scant eight pages that spend as much time on historical background as game mechanics — served as an object lesson to games that used every key on the keyboard. The random maps, the native-encounter sequences, and the way that the whole game is played in “real time” rather than discrete turns also demonstrated how to take advantage of the unique abilities of the computer, rather than just encoding the rules of what could otherwise be a board game. And of course the moral ambiguity of this whole exercise, in which even peaceful alliances are ultimately made for the purpose of conquest and plunder, is never swept under the rug. Seven Cities has something to teach us about politics and history and perhaps even human nature. It goes well beyond the equipment lists and purely tactical concerns of a typical war game of the sort that companies like SSI were pumping out by the handful in 1984. Trip Hawkins declared that Ozark had pioneered a whole new genre of computer software: “edutainment.”

M.U.L.E. offered many of the same design lessons, but few others were thinking on this level at the time of Seven Cities’s mid-1984 release. Troy Goodfellow goes so far as to call it “the most influential strategy game ever made.” That’s a bold statement even if we assume he’s implicitly restricting the field to computerized strategy games, but just the fact that Seven Cities is in the running in a world that contains games like Civilization (the computer version) speaks volumes.

Speaking of Civilization: one of those entranced by Seven Cities’s innovations was a 30-year-old programmer and designer named Sid Meier, co-owner of a small publisher called MicroProse. Seven Cities and its determination to make strategy and history attractive and approachable became the inspiration for his breakthrough title, 1987’s Pirates!. The embodied interface of that game, which blurred the lines between strategy, adventure, and RPG while always making you feel that you are there, is lifted straight from Seven Cities; even the fonts and whole visual looks of the two games are similar. If Pirates!, generally acknowledged as one of the best computer games ever created, is even better than Seven Cities of Gold, it’s also true that it never could have existed without it, as Meier himself would happily admit. Later still, of course, Meier would manage what Dan never quite could, bringing all the complexity and grandeur — and then some — of the board game Civilization to the computer in an amazingly accessible way. Again, the road to Sid Meier’s Civilization could only have passed through Seven Cities of Gold.

While it may be most notable for the games it influenced, it’s also possible to say something about Seven Cities of Gold that is all too unusual in Dan Bunten’s career: the game was a hit in its own right. EA’s faith in their backwoods savants paid off at last, as Seven Cities sold at least 150,000 copies, five times the numbers done by M.U.L.E. and enough to make it EA’s biggest new game of 1984, in versions for the Atari 8-bit (the original and always the one that Dan himself saw as definitive), Commodore 64, Apple II, IBM PC, and eventually the Macintosh and the Amiga. Most of these ports were done by Ozark themselves or people within their circle. The Apple II version, for instance, came courtesy of a young University of Arkansas student named Mark Botner who spent the summer of 1984 working with them. He makes those days in the big house by the lake sound like an idyllic summer romance:

“What fun we had that summer. We would take a walk every day around Broadmoore Lake while our programs assembled for 15 minutes or so. We flew model airplanes, floated model boats in the lake, and played many different games. And, we actually got Seven Cities Of Gold ported to the Apple II!”

They were good times indeed. Suddenly the world was onto Dan Bunten and Ozark Softscape, as they (and particularly he) found themselves in constant demand for interviews and mentioned among the elite of the game industry.

While we leave our friends to enjoy a welcome taste of fame and fortune, you might want to try Seven Cities of Gold for yourself. If so, feel free to download the definitive Atari 800 version for use in the emulator of your choice.

(Finally, sources, which are largely the same as those for the last article: Dan wrote a column for Computer Gaming World from the July/August 1982 issue through the September/October 1985 issue. Those are a gold mine for anyone interested in understanding his design process. Particularly wonderful is his detailed history of Seven Cities of Gold’s development in the October 1984 issue. Other interesting articles and interviews were in the June 1984 Compute!’s Gazette, the November 1984 Electronic Games, and the January 1985 Antic. Online, you’ll find a ton of historical information on World of M.U.L.E. Salon also published a good article about Dani ten years ago. The sadly now defunct Dani Bunten Berry Memorial Site is full of anecdotes and tributes, including the quote from Mark Botner which I used above. And see the site of the (apparently stalled) remake Alpha Colony for some nice — albeit somewhat buried — historical tidbits.)

Footnotes

[1] Dan Bunten began living as the woman Danielle Bunten Berry in November of 1992. She died five years later.

In preparing this article, I of course reviewed what I had already written about Dani. I confess it made me cringe a bit. I have long been annoyed by the habit of people who know nothing of her games of using Dani as some generic representative of her sexuality, and wanted to move the focus firmly to her work as a game designer, to leave the politics of gender identity alone and focus on why she was a such a giant in her chosen field. I now realize that in doing so I seemed to dismiss and disrespect the other parts of her story, although I genuinely didn’t intend to do so. A couple of angry commenters who, it seemed to me, wanted to force me into the very narrative template I had been trying to avoid only hardened my position. I continue to want to avoid the standard structure of the genre of “Dani Bunten Berry stories” — but I also realize that they had some points as well. I particularly regret that I never referred to Dani by her female name even in my note at the end of the article. I literally never realized I had done this until my recent rereading, and now understand some objections about how I allegedly considered her unworthy of being referred to by her real name and the like a bit better. I won’t edit the article, as doing so could only make those who commented look unreasonable in ways they really weren’t. Anyway, it perhaps serves as a better lesson just as it is. Know, however, that if I was writing it today I would handle the issue somewhat differently.

That said, you’ve surely noticed that I continue to refer to Dani as “Dan” and “he” in the article above. I understand the logic of those who would say that Dani was always a woman, merely one who was by an accident of birth born into a man’s body. Certainly this is the argument most advocates for transsexual rights would make. For myself, I am all for transsexual rights, but also believe that gender and sexual identity may be more fluid than much transsexual rhetoric would have it. In the end, I have continued to opt for clarity and reality as all of Dani’s friends and colleagues knew it in the 1980s: of her being the man Dan Bunten. Referring to “Dani” and “she” in these articles would be confusing to the reader and at least at some level anachronistic, opposed to the consensus reality shared by everyone around her. I understand that this decision may not seem ideal to everyone, and even that it runs counter to some journalistic style guidelines. If you disagree with it, I can only ask you to believe me when I say it was made in good faith, with no intention to slight. For what it’s worth, reports are that Dani was never offended in the least by being referred to as a man when conversations came around to her years of living as Dan.


How Things Work: Commodore 64 and Summer Games Edition

I’m always trying to convey a sense of the audacity and creativity of hackers of the early PC era, who made so much out of so little. I include amongst this group both the hardware hackers who created the machines themselves and the software hackers who took them to places even their creators never imagined. In that spirit, I thought today we’d look at how the Commodore 64’s hardware team managed to make it do some of what it could given the technical constraints under which they labored, and how the software team who created Summer Games at Epyx found ways to make it do even more than its designers had fully considered. So, much of this article is for the gearheads among you, or at least those of you who’d like to understand a bit more of what the gearheads are on about. If you’re a less technical sort, perhaps you’ll be consoled by learning about some of the softer factors that went into the Summer Games design as well. And if that’s not interesting, hey, you can still watch my wife and me (mostly me) fail horribly at various Summer Games events via the movie clips.

This is, by the way, my first attempt to make use of WordPress 3.6’s integrated video capabilities. You’ll need an up-to-date browser with good HTML 5 support to see the clips. Hopefully my site won’t choke on the bandwidth demands. We’ll see how we go.

While you’re waiting (hopefully not too long) for the videos to load, let’s consider the basic visual capabilities of the Commodore 64: a palette of 16 colors at a resolution of 320 X 200. Those capabilities are, to say the least, modest by modern standards, but they actually present a huge problem when paired with another key specification: the 64 has just 64 K of RAM. This is all there is to work with; there is no separate bank of video memory, as on a modern computer. Everything — programs, data, the contents of the screen, and miscellaneous other things like buffers for the disk drive — must draw from this pool.

Now, a modern programmer wishing to represent a 320 X 200 screen with 16 colors in memory would probably just store it as a series of pixels, with one byte devoted to each pixel and storing a value between 0 and 15 to represent that pixel’s color. This approach, known as bitmap graphics, is straightforward and eminently flexible, but there’s a problem. Consider: a 320 X 200 screen has exactly 64,000 pixels. In other words, by devoting one byte to each pixel we’ve just filled our entire 64 K of memory with a single screen.

Let’s consider further, then. Even a modern programmer, if she’s of a more efficient sort, might note that we only actually need four bits to store a number between 0 and 15, and could therefore, at the cost of a somewhat more confusing layout, pack two pixels into every byte. That reduces consumption to a little under 32 K — better, but it’s still untenable to devote half of our precious memory to the screen.
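
If you’d like to check that arithmetic for yourself, here’s a quick sketch in C. (C is of course an anachronism for the machines in question; real code of the era was 6502 assembly. I’ll use C purely as an illustration language here and in the sketches that follow.)

    /* Back-of-the-envelope check of the figures above: a 320 x 200,
       16-color screen, naive versus two 4-bit pixels packed per byte. */
    #include <stdio.h>

    int main(void) {
        long pixels = 320L * 200L;     /* 64,000 pixels on the full screen */
        long naive  = pixels;          /* one byte per pixel: all our RAM  */
        long packed = pixels / 2;      /* two 4-bit pixels per byte        */
        unsigned char left = 12, right = 3;               /* colors 0-15   */
        unsigned char both = (unsigned char)((left << 4) | right);

        printf("naive: %ld bytes, packed: %ld bytes\n", naive, packed);
        printf("packed byte $%02X unpacks to %d and %d\n",
               both, both >> 4, both & 0x0F);
        return 0;
    }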

It’s because bitmap graphics are so demanding that only high-end machines like the Apple Lisa and Macintosh used them by default at the time of Summer Games’s release. And, notably, even those 68000-powered machines only displayed black and white, which reduced the requirement from four bits per pixel to one — a simple on-off, black-or-white toggle. Let’s consider the alternative that the 64’s designers, as well as those of many other machines, employed in various ways: character graphics.

Commodore 64 startup screen

In its default mode, the 64 subdivides its screen into a grid of character cells, each 8 X 8 pixels. Thus there are 40 of them across and 25 down, corresponding to the machine’s standard text display. Elsewhere in memory are a set of up to 256 tiles that can be copied into these cells. A default set, containing the glyph for each letter, number, and mark of punctuation in addition to symbols and simple line-drawing figures, lives in ROM. The programmer can, however, swap this set out for her own set of tiles. This system is conceptually the same as the tile-graphics system which Richard Garriott used in the Ultima games, but these tiles are smaller (only the size of a single character) and monochrome, just a set of bits in which 1 represents a pixel in the foreground color, 0 a pixel in the background color. The latter color is set globally, for the whole screen. The former is specified individually for each cell, via a table stored elsewhere in memory.

So, let’s look at what all this means in terms of memory. Each cell on the screen consumes one byte, representing the number (0 to 255) of the tile that is placed there. There are 1000 character cells on a 40 X 25 display, so that’s about 1 K consumed. We need 8 bytes to store each tile as an 8 X 8 grid of on-off pixels. If we use all 256, that’s 2 K. Finally, the color table with the foreground color for each cell fills another 1 K. We’ve just reduced 32 K to 4 K, or just 2 K if we use the default set of character glyphs in ROM. Not bad. Of course, we’ve also introduced a lot of limitations. We now have to build our display, jigsaw-puzzle style, from our collection of tiles. And each cell can only use two of our total of 16 colors, one of which can be unique to that cell but the other of which must be the same for the entire screen. For someone wishing to make a colorful game, this last restriction in particular may just be too much to accept.
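
In code, putting a tile on the screen amounts to nothing more than two table writes. A minimal sketch, assuming the 64’s default memory map (screen matrix at $0400, color RAM at $D800); this is the flavor of C that a cross-compiler like cc65 accepts:

    #define SCREEN    ((volatile unsigned char *)0x0400)  /* default screen matrix */
    #define COLOR_RAM ((volatile unsigned char *)0xD800)  /* per-cell foreground   */

    /* Place tile number `tile` at column x, row y in color `color` (0-15). */
    void put_tile(unsigned char x, unsigned char y,
                  unsigned char tile, unsigned char color)
    {
        unsigned int cell = (unsigned int)y * 40 + x;  /* 40 cells per row */
        SCREEN[cell]    = tile;   /* which of the 256 tiles appears here   */
        COLOR_RAM[cell] = color;  /* its foreground; background is global  */
    }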

Enter multicolor character mode. Here, we tell the 64 that we want each tile to be not monochrome but drawn in four colors. Rather than using one bit per pixel within the tile, we now use two, which allows us to represent any number from 0 to 3. One of these colors is still set individually for each cell; the other three are set globally, for the screen as a whole. And there’s another, bigger catch: because we still only devote eight bytes to each tile, we must correspondingly reduce its resolution, and that of the screen as a whole. Each tile is now 4 X 8 (horizontally elongated) pixels, the screen as a whole 160 X 200. Even so, this is easily the most widely used mode in Commodore 64 games. It’s also the mode that Scott Nelson (little brother of Starpath co-founder Craig Nelson) chose for Summer Games’s flag selection screen.
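
Decoding one stored row of such a tile looks like the sketch below. It’s an illustration only, since the VIC-II video chip does this in hardware: each pair of bits selects one of four color sources, with 0 meaning the global background, 1 and 2 meaning two further globally set colors, and 3 meaning this cell’s own color.

    /* One row of a multicolor tile: eight stored bits yield four wide
       pixels, each a 2-bit color selector (0-3). */
    void decode_mc_row(unsigned char row, unsigned char out[4])
    {
        unsigned char i;
        for (i = 0; i < 4; i++)
            out[i] = (row >> (6 - 2 * i)) & 0x03;   /* leftmost pixel first */
    }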

Summer Games country selection screen

But… wait, you might be saying. Surely the colorful screen shown above doesn’t always use the same three of the four colors within each tile. In fact, it doesn’t, and this introduces us to one of the keys to getting the most out of the Commodore 64: raster interrupts.

The picture on a cathode-ray-tube television or monitor is generated by an electron gun which moves across and down behind the screen, firing charged electrons at phosphors that coat the back of the screen glass. This causes them to briefly glow — so briefly, in fact, that the gun must paint the screen 60 times per second for televisions using the North American NTSC standard, or 50 times for the European PAL standard, in order to display a stable image without flicker. After painting each line of the screen from left to right, the gun must move back to the left to paint the next. This split second’s delay can be exploited by the Commodore 64 programmer. She can ask the machine to generate what’s known as a raster interrupt when the gun finishes painting a given line. She then has a few microseconds to make changes to the display configuration before the gun starts painting the next line. She can, for example, change one or more of the three supposedly fixed colors, as Scott Nelson does to generate the screen shown above.
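
The same color-splitting trick can be shown in miniature by simply polling the raster register rather than hooking a true interrupt handler (wiring up the actual hardware IRQ is fiddlier than I want to get into here, but the principle is identical). The scanline numbers below are arbitrary:

    #define RASTER     (*(volatile unsigned char *)0xD012)  /* current scanline (low 8 bits) */
    #define BACKGROUND (*(volatile unsigned char *)0xD021)  /* global background color       */

    void split_screen_forever(void)
    {
        for (;;) {
            while (RASTER != 100) ;   /* busy-wait until the gun reaches line 100 */
            BACKGROUND = 2;           /* lower part of the frame becomes red      */
            while (RASTER != 250) ;   /* near the bottom of the visible frame     */
            BACKGROUND = 6;           /* restore blue before the next frame's top */
        }
    }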

But let’s say we don’t want to deal with trying to create a picture using tiles. The Commodore 64 actually does also offer a bitmap mode of sorts, albeit one with restrictions of its own that allow it to reduce the memory footprint from an untenable 32 K to a more reasonable if still painful 9 K. Here an 8 K chunk of memory is allocated to the bitmap, with each bit representing the status (on or off) of a single pixel. The foreground color represented by an “on” pixel is once again determined by a 1 K color table, with the colors still sorted into 8 X 8 pixel blocks. This leads to the most obvious oddity of the 64’s bitmap mode: the bitmap does not run all the way across the screen and then down, but rather across and down through each 8 X 8 cell that is assigned a given foreground color.
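
That odd layout makes the address arithmetic for plotting a single pixel a classic little puzzle. A sketch, assuming the bitmap sits at $2000 (a common choice, though any properly aligned address will do):

    #define BITMAP ((volatile unsigned char *)0x2000)  /* assumed bitmap base */

    /* Set hires pixel (x, y): x in 0-319, y in 0-199. */
    void plot(unsigned int x, unsigned char y)
    {
        unsigned int offset = (y >> 3) * 320    /* which row of 8 x 8 cells   */
                            + (x & 0xFFF8)      /* which cell: (x / 8) * 8    */
                            + (y & 7);          /* which line inside the cell */
        BITMAP[offset] |= (unsigned char)(0x80 >> (x & 7));
    }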

Bitmap mode on the Commodore 64

For those willing to trade resolution for colors, there is also a multicolor bitmap mode, which, like the multicolor character mode, treats each two bits as representing a single pixel of one of four possible colors. Horizontal resolution is accordingly reduced to 160 pixels. This mode is, however, more flexible than multicolor character mode in its choice of colors. Another 1 K area of memory is allocated to a collection of color pairs, one pair for each cell, each pair packed into a single byte. Thus we can freely choose three of the four colors found within each cell without resorting to raster interrupts or other tricks. Total memory devoted to the display in multicolor bitmap mode amounts to 10 K.
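
Spelled out, the per-pixel color lookup in this mode works like so; again just a sketch of what the VIC-II does in hardware:

    /* `pair` is the 2-bit value read from the bitmap; `screen_byte` is this
       cell's entry in the 1 K of packed color pairs; `colram` is this cell's
       color-RAM nibble; `background` is the one global color register. */
    unsigned char mc_bitmap_color(unsigned char pair, unsigned char screen_byte,
                                  unsigned char colram, unsigned char background)
    {
        switch (pair) {
            case 0:  return background;           /* shared by the whole screen */
            case 1:  return screen_byte >> 4;     /* upper nibble of the pair   */
            case 2:  return screen_byte & 0x0F;   /* lower nibble of the pair   */
            default: return colram;               /* the cell's color-RAM entry */
        }
    }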

That may not look like much at first glance, but for a programmer trying to shoehorn a complex game into 64 K it’s quite a sacrifice indeed. For this reason, and because its other restrictions could make it almost as challenging to work with as character mode, bitmap mode is not used as often as you might expect in Commodore 64 games. Summer Games is, however, a partial exception, employing bitmap mode in quite a number of places. For instance, Stephen Landrum’s opening-ceremonies sequence uses a multicolor bitmap. This sequence also demonstrates another critical part of the 64’s display hardware: sprites.


Doing animation by changing the contents of screen memory is very taxing on a little 8-bit CPU like the 64’s 6502, not to mention tricky to time so that changes are not made in the middle of screen paints, which would result in ugly jerking and tearing effects. Sprites come to the rescue. Indeed, their presence or absence is a good indication of whether a given machine from this era is pretty good at playing graphically intense games (the 64, the Atari 8-bit line) or not (the Apple II, the IBM PC). A sprite is a relatively small graphical element which is overlaid onto the physical screen, but independent of the bitmap or tile map stored in memory. It can be moved about quickly at minimal cost, just by changing a couple of registers. The display circuitry does the rest.

The 64 offers eight sprites to the programmer, each exactly 24 pixels wide by 21 tall. The image for each is stored in memory as the usual grid of on/off bits, for the modest total of 64 bytes used per sprite. An on bit represents the sprite’s color, of which each has exactly one; an off bit represents transparency, so that whatever is on the screen behind shows through. This means that the 24 X 21 pixel size is not so arbitrary as it may first appear; a smaller sprite can be displayed simply by turning off the unneeded pixels.
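
Getting a sprite onto the screen is just a matter of filling in a few registers. A minimal sketch, assuming the default memory map, with the image data parked in 64-byte block number 13 (address 13 times 64, or $0340, a traditionally free corner of low memory); all values are illustrative:

    #define SPRITE_PTRS   ((volatile unsigned char *)0x07F8)  /* one pointer byte per sprite */
    #define SPRITE_ENABLE (*(volatile unsigned char *)0xD015)
    #define SPRITE0_X     (*(volatile unsigned char *)0xD000)
    #define SPRITE0_Y     (*(volatile unsigned char *)0xD001)
    #define SPRITE0_COLOR (*(volatile unsigned char *)0xD027)

    void show_sprite0(void)
    {
        SPRITE_PTRS[0] = 13;     /* image lives at block 13 * 64 = $0340 */
        SPRITE0_COLOR  = 1;      /* white */
        SPRITE0_X      = 160;    /* roughly mid-screen */
        SPRITE0_Y      = 100;
        SPRITE_ENABLE |= 0x01;   /* bit 0 turns sprite 0 on */
    }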

There is also the inevitable multicolor sprite, which gives us three foreground colors to work with at the expense of half of our horizontal resolution. In this mode, the sprite is effectively just 12 X 21 pixels, but each pixel is now twice as wide as before, resulting in the same physical width on the screen. As in multicolor character mode, the second and third colors are fixed across all sprites in this mode.

A sprite can be pointed to different addresses in memory for its image between screen paints, creating the possibility of animated sprites which cycle through a sequence of frames, page-flip style. Likewise, single- and multicolor sprites can be placed together and moved in lockstep to create larger or more complex onscreen figures. In the sequence above, the runner is made from three single-color sprites, each of which cycles through 14 frames of animation. (If you’ve played Impossible Mission, he may look familiar to you: he is in fact the same sprite as your avatar in that game, which Dennis Caswell happily shared with his colleagues.) The flames are four multicolor sprites, each with four frames of animation. And each of the eight doves is a single single-color sprite with eight frames of animation.
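
The page-flip animation amounts to a single byte write per screen paint: keep the frames in consecutive 64-byte blocks and move the pointer. (The `first_block` and `frames` parameters here are my own illustrative invention, not anything from Summer Games itself.)

    #define SPRITE_PTRS ((volatile unsigned char *)0x07F8)

    /* Call once per screen paint to advance sprite 0 through its frames.
       Assumes frames > 0 and consecutive blocks starting at first_block. */
    void animate_sprite0(unsigned char first_block, unsigned char frames)
    {
        static unsigned char frame = 0;
        SPRITE_PTRS[0] = first_block + frame;   /* retarget; no pixels copied */
        frame = (unsigned char)((frame + 1) % frames);
    }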

But… again, wait. That’s far more than eight sprites in total, isn’t it? As you may have guessed, Landrum uses raster interrupts to reconfigure and thus reuse sprites as each screen paint proceeds. With the addition of such tricks the 64’s effective limit becomes not eight sprites in total but no more than eight sprites horizontally parallel with one another.

Let’s take another example, this time one showing an actual, interactive event in action: Stephen Landrum’s pole vault. I have my usual mediocre performance in the clip that follows, but my wife Dorte kicks some ass and actually demolishes our old world record.


The screen you see here is another multicolor bitmap. The vaulter is made up of three single-color sprites, which cycle through seven frames of animation as he runs and are then changed appropriately to reflect his state after he goes airborne. The pole is three single-color sprites and the crossbar is a single multicolor sprite, as is, surprisingly and cleverly, the stationary top of the nearer (right-hand) upright. To understand this last, we have to understand the 64’s concept of sprite priority. Sprites are numbered from 0 to 7. If two sprites overlap one another, the sprite with the lower number is drawn on top of the one with the higher number. Landrum uses this property to easily create the illusion of the jumper passing behind the nearer upright as he soars through the air.

You might have noticed that the pole, the crossbar, and the upright are all quite large. This is down to yet another feature of the 64’s sprite system. It’s possible to expand a sprite vertically or horizontally or both, doubling its size (but not its resolution).

The pole vault is not quite as polished as most of the events, which may be a sign that, as one of the later events to be completed, it was a bit rushed. There’s some odd artifacting in the pole, for instance. And there’s a wonderful bug that lets you vault under the crossbar on its highest setting, creating a world record for the ages.


The two swimming events, which were started by Randy Glover but finished by Landrum following the former’s abrupt resignation, are the most complex in Summer Games. They’re largely an exercise in rhythm; you have to press the joystick button as your swimmer’s arms enter the water, then release it when they emerge. I’m awful at it, but Dorte is pretty good.


The clock at top right is formed from six single-color sprites, each swimmer from four. The rest of what you see here may begin to illustrate how crazy you can get with raster interrupts. Each paint begins with the 64 in single-color bitmap mode. This allows the text (“Ready… Set… Go!”), which is drawn and erased directly into the bitmap, to be rendered in the higher resolution. But then, just as the electron gun reaches the top of the stands, the screen is changed to a multicolor bitmap.

Glover and Landrum use a technique known as double buffering to make the scrolling as smooth as possible. There are actually two bitmaps in memory, one of which is always being displayed and the other of which is being updated by the CPU for the next step in the scroll. When the time comes, the two are swapped, as the 64’s VIC-II graphics chip is pointed to the other in the pair. Well, it’s almost that simple. Complications arise because the poor 6502 just doesn’t have time to completely redraw a screen in memory for every pixel of scroll. Luckily, it doesn’t have to. The VIC-II also has what are known as horizontal and vertical fine-scrolling registers. They allow the programmer to shift the bitmap that appears onscreen by one to seven pixels to the right (as in the swimming events) or down. Since this would create an ugly empty zone at the edges of the display for which the computer has no pixel data to display, another register lets the programmer expand the size of the border slightly to cover these cells — the width of the screen is reduced from 40 to 38 columns, or the height from 25 to 24 lines. Now it’s possible for Glover and Landrum to scroll the screen eight pixels before having to swap to the alternate bitmap, giving the CPU time to prepare said bitmap. Double buffering is rather unusual to find on the 64, as it’s horrendously expensive in memory. And indeed, the swimming events use virtually every last byte.
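
Here’s roughly what one step of such a scroll looks like in code. This is one common arrangement, and strictly my own sketch rather than Summer Games’s actual scheme: the two bitmaps live in two different 16 K VIC banks, flipped via CIA 2.

    #define CIA2_PRA  (*(volatile unsigned char *)0xDD00)  /* bits 0-1: VIC bank (inverted)   */
    #define VIC_CTRL2 (*(volatile unsigned char *)0xD016)  /* bits 0-2: fine scroll; bit 3:
                                                              38/40-column select             */
    void scroll_step(void)
    {
        static unsigned char fine = 0;
        static unsigned char visible_bank = 0;

        fine = (fine + 1) & 7;
        VIC_CTRL2 = (VIC_CTRL2 & 0xF0) | fine;   /* shift; bit 3 stays clear = 38 columns */
        if (fine == 0) {                         /* a full cell scrolled: swap buffers    */
            visible_bank ^= 1;
            CIA2_PRA = (CIA2_PRA & 0xFC) | (visible_bank ? 0x02 : 0x03);
            /* the CPU now has eight pixels' worth of time to redraw the hidden bitmap */
        }
    }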

But that’s probably enough tech talk for today. Just for fun — and because if you got through all that you’ve earned it — let’s look at the other events in somewhat less exhaustive (exhausting?) detail.

The two running events have their origin in Starpath’s old Supercharger decathlon project, but were brought to the 64 and completed by Brian McGhie. Like virtually everyone at Epyx, he had no particular knowledge of or burning interest in Olympic sports. He therefore relied on a stack of old Sports Illustrateds to try to get the look of his runners and the stadium right. The events are very similar in appearance, but unlike the swimming events very different in execution. The 100 meter dash is a notorious joystick killer. You have to move the stick back and forth as quickly as possible — nothing more, nothing less. The 4 X 400 meter relay, by contrast, is the most cerebral of the events, a game of energy conservation and chicken. I’m unaccountably good at both, much to Dorte’s frustration.


Interestingly, the scrolling in these events is implemented in an entirely different way from that in the swimming, illustrating how very much Summer Games is really a collection of individual efforts brought together under one banner. McGhie uses a multicolor character screen, and rather than using double buffering updates the hidden border areas on the fly to… but I promised to stop with the tech talk, didn’t I?

The diving event is yet another of Landrum’s. The diver here rather disconcertingly never surfaces after entering the water, simply because Landrum ran out of time.


Skeet shooting was a joint project of John Leupp, Steve Mudry, and Randy Glover prior to his departure. They originally planned to show the shooter on the screen, as in all of the other events, but found it difficult to work out a practical way of implementing the event from that perspective. So skeet shooting received the only first-person perspective in Summer Games, and the poor shooter was left out entirely.


Finally, there’s the gymnastics event — really just a vault — by Mudry. In an example of the, shall we say, casual approach to box art that was so rife in this era, the Summer Games box shows someone doing a handstand.


If nothing else, this article has hopefully conveyed what a tricksy machine the Commodore 64 is, full of hidden capabilities and exploitable quirks. Learning to make it dance for you requires considerable time even if you have examples to follow. If you don’t… well, small wonder that its games were just beginning to come into their own in 1984, the year it had its second birthday. And Epyx and companies like it were barely scratching the surface. In a couple of years Summer Games would look downright quaint.

You can download the original Commodore 64 Summer Games and its manual from here if you like, for use in the emulator of your choice (I recommend VICE). Unlike most of the disk images floating around the Internet, this one is pristine, with the original set of world records, so you and your friends and/or family can make your own records — which is about 20% of the fun of playing Summer Games — rather than be shamed by the performances of obsessed teenagers from two or three decades ago.

We’ll continue to observe the Commodore 64 scene with interest in future articles. But next we’ll check in with a group of Atari 8-bit loyalists: the backwoods savants of Ozark Softscape.

(This article draws again from the Epyx retrospectives in the July 1988 and August 1989 issues of Commodore Magazine. Technical details of Summer Games were drawn from the Commodore 64 design case study which appeared in the March 1985 IEEE Spectrum. I also lifted the diagram showing the 64’s unusual bitmap mode from there. For what it’s worth, my favorite 64 technical reference is Mapping the Commodore 64 by Sheldon Leemon. And if I may be forgiven a blatant plug, do check out my book on the Amiga if you’re interested in the sort of technical details I’ve delved into in this post. Some of what I go into in the book actually applies equally to the 64, and I explain basic concepts, starting with what a bit and byte actually are, much more fully there.)


From Automated Simulations to Epyx

When Robert Botch joined Automated Simulations as director of marketing just as 1982 expired, it wasn’t exactly the sexiest company in the industry. They were still flogging their Dunjonquest line, which now consisted of no fewer than eleven sequels, spinoffs, and expansions to Temple of Apshai. More than a year after co-founder Jon Freeman had left in frustration over partner Jim Connelley’s refusal to update Automated’s technology, the entire line was still derived from the same BASIC-based engine that had first been designed to run on a 16 K TRS-80 back in 1979. It was hard for anyone to articulate why someone would choose to play a Dunjonquest game in a world that contained Ultima and Wizardry. And, indeed, Automated’s sales numbers were not looking very good; the company had stopped making money almost from the moment that the Ultima and Wizardry series debuted. Still, that hadn’t prevented them from benefiting from the torrents of venture capital that entered the young industry in 1982, courtesy of the pundits who were billing home computers as the next big thing to succeed the game consoles. But now the investors were getting worried, wondering if this stodgy company and their somewhat pedantic approach to gaming had really been such a good risk after all. Thus Botch, whom Connelley hired under pressure to remake Automated’s image.

Botch’s first assignment was to visit the Winter Consumer Electronics Show in Las Vegas that January, with Dunjonquest titles in tow to display to the crowd on a big-screen television rented for the occasion. Botch, who knew nothing about computers or computer games, didn’t much understand the Dunjonquest concept. He could hardly be blamed, for just trying to figure out which one you could play was confusing as hell: you needed to already have Temple of Apshai to play these additional games, but needed Hellfire Warrior to play those, etc. He was therefore relieved when another employee handed him a disk containing a straightforward, standalone action/puzzle game for the Atari home-computer line called Jumpman, a sort of massively expanded version of the arcade classic Donkey Kong with thirty levels to explore. Unusually for Automated, who normally developed games in-house, its presence was the result of an unsolicited third-party submission from a hacker named Randy Glover.

Randy Glover, developer of Jumpman

Botch was such a computer novice that he couldn’t figure out how to boot the game; his colleague had to tell him to “put it in that little slot over by the computer.” But when he finally got it working he fell in love. The rest of the show turned into an extended battle of wills between Botch and Connelley. The latter, who was determined to showcase the Dunjonquest games, would “come over, yell a lot, and tell me to take the disk out. Whenever he left the room, I’d load the program in again.” The crowd seemed to agree with Botch: he left CES with a notebook full of orders for the as yet unreleased Jumpman, convinced that in it he had seen the only viable future for his new employers.

The embattled Connelley saw his power further eroded the following month, when the investors brought in Michael Katz, an unsentimental, hard-driving businessman with an eye for mainstream appeal. He had spent the past four years at Coleco, where he had masterminded the launch of some very successful handheld electronic games as well as the ColecoVision console, which had just sold more than 500,000 units in its first Christmas on the market. It was first agreed that Connelley and Katz would co-lead the company, but this was obviously impractical and untenable. In a scenario that could have easily happened to Ken Williams at Sierra if he had been less strong-willed and business-savvy, Connelley was being eased out of his own company by the monied interests he had welcomed with open arms. Seeing which way the wind was blowing, he left within months, taking a number of his loyalists with him to form a development studio he named The Connelley Group, which would release a couple of games through Automated before becoming free agents and eventually fading away quietly.

Katz, Botch, and the other newcomers were thus left alone to literally transform Automated Simulations into a new company. Automated had for some time now been branding many of their games with the label of “Epyx,” arrived at because their first choice, “Epic,” was already taken by a record label. No matter; “Epyx” was a better name anyway, proof that even a blind-to-PR squirrel like Connelley could find a nut every now and again. Katz and Botch now made it the official name of the reborn company, excising all trace of the stodgy old “Automated Simulations” name. Gone also would be the nerdy old Dunjonquest line, which positively reeked of Dungeons and Dragons sessions in parents’ basements. They would instead strive to make Epyx synonymous with colorful, accessible games like Jumpman, aimed straight at the heart of the mass market. The old slogan of “Computer Games Thinkers Play” now became “Strategy Games for the Action-Game Player,” and they hired Chiat Day, Apple’s PR firm and the hottest such agency in Silicon Valley, to remake Epyx’s image entirely.

Epyx

Jumpman itself made a good start toward that goal. It was a huge hit, especially once ported to the Commodore 64. One of the first games to really take proper advantage of the 64’s audiovisual capabilities, it hit that platform like a nova at mid-year, topping the sales charts for months and probably becoming the bestselling single Commodore 64 game of 1983. It alone was enough to return Epyx to profitability. Unsurprisingly given commercial returns like that, from now on Epyx would develop first and most for the 64. They also hired Glover to work in-house. Before the end of the year he had already delivered a cartridge-based pseudo-sequel, Jumpman Junior, to reach ultra-low-end systems without a disk drive.

But now Katz had a problem. Other than Glover, he lacked the technical staff to make the Jumpmans of the future. Most of them had left with Connelley — and anyway games like their old Dunjonquests were exactly what the new Epyx didn’t want to be making. Then Starpath caught Katz’s eye.

Back in 1981, two former Atari engineers, Bob Brown and Craig Nelson, had founded Arcadia, Inc., eventually to be renamed Starpath after the release of the Arcadia 2001, an ill-conceived and short-lived games console from Emerson Radio Corporation. Drawing from friends, family, and former colleagues, Brown and Nelson put together a crack team of hardware and software hackers to make their mark in the Atari VCS market. Their flagship product was the marvelously Rube Goldberg-esque Supercharger, which plugged into the VCS’s cartridge port and added 6 K of memory (which may not sound like much until you remember that the VCS shipped with all of 128 bytes), new graphics routines in ROM, and a cable to connect the console to a cassette player. Starpath developed and released half a dozen games on cassette for use with the Supercharger, most of them apparently quite impressive indeed. But problems dogged Starpath. The company lived in constant fear of legal action by Atari, whom Brown and Nelson had not left on particularly good terms, in response to their unauthorized expansion. It did eventually become clear that Starpath had little to fear from Atari, but for the worst possible reason: the videogame market was collapsing, and Atari had far bigger problems than little Starpath. By late 1983 Starpath was floundering. Katz swooped, buying the entire company for a song and moving them lock, stock, and barrel from Santa Clara, California, into Epyx’s headquarters in nearby Sunnyvale.

Katz had no interest in any of Starpath’s extant products for a dead Atari VCS market. No, he wanted the programming talent and creative flair that had led to the Supercharger and its games in the first place. If they could do work like that on the Atari VCS, imagine what they could do with a Commodore 64. The Starpath folks would prove to be the final, most essential piece in his remaking of Epyx.

One of Starpath’s programmers, Dennis Caswell, had been playing around with ideas for a platforming action-adventure game before the acquisition. Indeed, he was already at work trying to animate the running man who would be its star. It was decided to let Caswell, who had three Supercharger games under his belt, run with his idea on the Commodore 64. He says his elation at the platform change was so great that “I unplugged my [Atari] 2600 and threw it out of my office and into the hall.” Working essentially alone, Caswell crafted one of the iconic Commodore 64 games and one of the bestselling titles in the history of Epyx: Impossible Mission.

Starpath had also been working on a decathlon simulation. In fact, it was far enough along to be basically playable. They discussed porting it to the 64, but the capabilities of that machine quickly led them to think about something more than just a simulation of track and field. Why not use the luxury of 64 K of memory and disk-based storage to simulate a broader cross-section of Summer Olympic events? With the 1984 Summer Olympics coming to Los Angeles, it seemed the perfect game for the zeitgeist, with exactly the sort of mass-market appeal Katz wanted from his new titles. He thought it a brilliant idea, and even went so far as to approach the Olympic Committee about making it an officially licensed product. He found, however, that Atari had long before sewn up the rights, back when they had been the fastest growing company in America. Epyx therefore decided to do everything possible to associate the game with the Olympics without outright declaring it to be an official Olympics simulation. They pushed the envelope pretty far: the game would be called Summer Games, would begin with an opening ceremony and a runner lighting a flame to the strains of “Bugler’s Dream,” would offer medals, would (as its advertising copy proclaimed) let you “go for the gold!” representing the country of your choice. Such legal boundary-pushing became something of a habit; witness Impossible Mission, which plainly hoped to benefit from an association with Mission: Impossible. (This in spite of the fact that Scott Adams had already been forced by the lawyers to change the name of his third adventure from Mission Impossible to Secret Mission.) In the case of Summer Games, Epyx likely got away with it because Atari was in no financial shape to press the issue and the Olympic Committee, never the most progressive institution, was barely aware of home-computer games’ existence. To this day many people are shocked to realize that Summer Games is not actually an official Olympics game. It all speaks to Katz’s determination to create games that felt up-to-date and relevant to the times. Yes, sometimes that could backfire, leading to trying-way-too-hard titles like Break Dance. Much of the time, however, it was commercial gold.

The original design brief for Summer Games called for ten events. The team also very much wished to include head-to-head, real-time competition wherever the nature of the sport being simulated allowed it. Beyond that, they would pretty much make it up on the fly; even the events themselves were largely chosen in the moment. The Starpath programmers’ talents were augmented by Randy Glover of Jumpman fame and Epyx’s first full-time artist, Erin Murphy. They were all under the gun from the start, for Katz wanted them to have something ready to show at the 1984 Winter CES, barely six weeks away when the project was officially green-lit. They worked through the holidays to deliver. Epyx arrived at CES with a very impressive albeit non-interactive opening-ceremonies sequence, fairly playable 4 x 400-meter relay and 100-meter dash races (both partially adapted from Starpath’s old decathlon project), and a diving event. At the show they learned that they had more competition in the (pseudo-)Olympics genre beyond Atari. HESWare, an aggressive up-and-comer not that dissimilar to Epyx who were about to sign Leonard Nimoy as their spokesman, showed HES Games. The prospect pushed Epyx to make sure Summer Games both met its planned pre-Summer Olympics release date and was as good as they could make it. To help with the former, the original plan for ten events was reduced to eight, principally via the sacrifice of weight lifting (fans of which sport would have to wait until 1986’s World Games to get their due). To help with the latter, more resources and personnel were poured into the project.

Even as this happened, attrition, a constant at Epyx, also became a concern. Katz’s new Epyx could be a rewarding place, but also an unrelentingly intense and competitive one, full of mental athletes convinced they were the smartest people in the room and all too happy to demonstrate it at their rivals’ expense. The spirit of competition extended beyond working hours; hundreds of dollars changed hands weekly in epic games of poker. Even some of Epyx’s brightest stars eventually found the company’s testosterone- and brainpower-fueled culture too much to take. Thus Starpath co-founder Bob Brown, finding Starpath’s new masters not to his liking, left quite soon after the acquisition, and Randy Glover, who had been assigned to the swimming events, abruptly left not long after CES. The swimming events were taken up by Stephen Landrum, the biggest single contributor to the project as a whole, who also did the opening ceremonies and the diving and pole-vaulting events.

It had been decided early on that Summer Games would let you compete as the representative of any of a variety of nations, complete with flags and national anthems to play during the medal ceremonies. Since it obviously would not be possible to include all of the 140 countries who would participate in the real Olympics, Epyx was left with the question of which ones should make the cut. Beyond the big, obvious powerhouses of the United States, the Soviet Union, and China, commercial considerations once again reigned supreme here. Katz had begun signing deals with foreign distributors, pushing hard to get Epyx’s games into the vibrant British and steadily emerging Western European software markets. Epyx reasoned that players in these countries would want the opportunity to represent their own nation. Thus relative Summer-Olympic non-factors like Norway and Denmark were included in the game, while potent teams from parts of the world that didn’t buy computer games, like East Germany, Romania, and Yugoslavia, were omitted. Most of the countries included had never been visited by anyone at Epyx. They sourced the flag designs from a world atlas, and called consulates and sales connections in Europe to drum up sheet music for the various anthems. Many of those anthems had never been heard by anyone working on the game; if some sound a bit “off” in tone or tempo, perhaps that’s the reason. For the coup de grâce, Epyx couldn’t resist including their own company as one of the “nations,” complete with a national anthem that was actually the Jumpman theme.

Summer Games was nearing the final crunch time on May 8, 1984, when the Soviet Union initiated a boycott of the Los Angeles Games in a rather petty quid pro quo for the West’s boycott of the 1980 Moscow Games. (The people who were really hurt by both gestures were not the governments of the boycottees but a generation of athletes on both sides of the political divide, who lost what was for many literally a once-in-a-lifetime opportunity to compete against the true best of their peers on the biggest stage their sports could offer them.) Epyx quickly decided to leave the Soviet Union in their version of the Olympics. After the game’s release, they reached out a bit cheekily to the Soviets in real life. Botch:

We sent the Russian [read: Soviet] embassy (in Washington, D.C.) several copies of Summer Games for the Commodore 64. An enclosed letter stated since they would not be competing in the regular Olympics, at least they could participate in our version of the Games. This package was eventually returned to us with a thank-you note, because they only had access to Atari home computers. Our marketing people quickly replaced the Commodore software with Atari material and sent it back. I always wondered if they enjoyed the game, because we never heard from them again.

Epyx’s bigger concern was the same as that of everyone involved with the Los Angeles Games, whether directly or tangentially: what commercial impact would the boycott have? It seemed it must inevitably tarnish the Games’ luster somewhat. In the case of both Summer Games and the Olympic Games themselves, the impact would turn out to be less than expected. The latter has gone down in history as the most financially successful Olympics of modern times, while Summer Games would become — and this probably comes as anything but a spoiler to most of you — one of the bestselling computer games of the year, and the first entry of the bestselling series in the history of the Commodore 64.

Katz was determined to get Summer Games out in June, to beat HES Games to the market and to derive maximum advantage from the pre-Olympics media buildup. The team worked frantically to finish the final two events (gymnastics and skeet shooting) and swat bugs. They worked all but straight through the final 72 hours. Disks went into production right on schedule, the morning after the code they contained had been finalized.

Summer Games

Summer Games went on to sell in the hundreds of thousands across North America and Europe, thoroughly overshadowing the less impressive Olympian efforts of HESWare and Atari, the latter of whose games were at any rate only available on their own faltering lines of game consoles and home computers. It would be ported to a variety of platforms, although it would always remain at its best on the Commodore 64. Together with Impossible Mission and a racing game called Pitstop II, developed by the indefatigable Landrum and Caswell, both also huge worldwide smashes, Summer Games completed the remaking of Epyx’s image and made them a worldwide commercial powerhouse. Conceptually simple for the most part and without much dependence on text, Epyx’s games were ideally suited to do well in non-English-speaking countries. Combined with Katz’s aggressive distribution push, this was key to making Epyx one of the first big entertainment-software publishers that could be said to be truly international. With so many potential customers to serve in emerging new markets and several new hits in addition to the still popular Jumpman, sales soared in 1984, Epyx’s earnings growing almost exponentially as the months passed.

We’ll continue the story of Epyx later, but for now I’m not quite done with Summer Games. Next time I’d like to do something I haven’t done in a while: dig into the technology a bit and explain how some of the magic that wowed so many back in 1984 actually works. It will also give us a chance to get to know the Commodore 64, a computer whose importance to gaming during the middle years of the 1980s can hardly be overstated, just a little bit better.

(The bulk of this article is drawn from two lengthy retrospectives published in the July 1988 and August 1989 issues of Commodore Magazine. The pictures of Randy Glover come from the April 1984 K-Power.)

A Computer for Every Home?

On January 13, 1984, Commodore held their first board of directors meeting of the year. It should have been a relaxed, happy occasion, a time to make plans for the new year but also one last chance to reflect on a stellar 1983, a year in which they had sold more computers than any two of their rivals combined and truly joined the big boys of corporate America by reaching a billion dollars in gross sales. During the last quarter of 1983 alone they had ridden a spectacular Christmas buying season to more than $50 million in profits. Commodore had won the Home Computer Wars convincingly, driving rival Texas Instruments to unconditional surrender. To make the triumph even sweeter, rival Apple had publicly announced the goal of selling a billion dollars’ worth of their own computers that year, only to fall just short thanks to the failure of the Lisa. Atari, meanwhile, had imploded in the wake of the videogame crash, losing more than $500 million and laying off more than 2000 workers. Commodore had just the previous summer moved into a sprawling new 585,000-square-foot, two-story headquarters in West Chester, Pennsylvania, that befitted their new stature; some of the manufacturing spaces and warehouses in the place were so large that Commodore veterans insist today that they had their own weather. Yes, it should have been a happy time at Commodore. But instead there was doubt and trepidation in the air as executives filed into the boardroom on that Friday the 13th.

A day or two before, Jack Tramiel had had a heated argument with Irving Gould, Commodore’s largest shareholder and the man who controlled his purse strings, in the company’s private suite above their exhibit at the 1984 Winter Consumer Electronics Show. That in itself wasn’t unusual; these two crusty old bulldogs had had an adversarial relationship for almost two decades now. This time, however, observers remarked that Gould was shouting as much as Tramiel. That was unusual; Gould normally sat impassively until Tramiel exhausted himself, then quietly told him which demands he was and wasn’t willing to meet. When Tramiel stormed red-faced out of the meeting and sped away in the new sports car he’d just gotten for his 55th birthday, it was clear that this was not just the usual squabbling. Now observers outside the board-of-directors meeting, which was being chaired as usual by Gould, saw him depart halfway through in a similar huff. He would never darken Commodore’s doors again.

No one who was inside that boardroom has ever revealed exactly what transpired there. With Gould and Tramiel both now dead and the other former board members either dead or aged, it’s unlikely that anyone ever will. On the face of it, it seems hard to imagine. What could cause these two men who had managed to stay together through the toughest of times, during which Commodore had more than once teetered on the edge of bankruptcy, to irrevocably split now, when their company had just enjoyed the best year in its history? We can only speculate.

Commodore had ceased truly being Tramiel’s company in 1966, when Gould swooped in to bail him out from the Atlantic Acceptance scandal of the previous year. Tramiel, however, never quite got the memo. He continued to run the company like a sole proprietor to whatever extent Gould would let him. Tramiel micro-managed to an astonishing degree. He did not, for instance, believe in budgets, considering them a “license to steal,” a guarantee that the responsible manager, knowing he had X million available, would always spend at least X million. Instead he demanded that every expenditure of greater than $1000 be approved personally by him, with the result that much of the company ground to a halt any time he took a holiday. Even as Tramiel enjoyed his best year ever in business, Gould and others in the financial community were beginning to ask the very reasonable question of whether this was really a sustainable way to run a billion-dollar company.

Still, the specific cause of Tramiel’s departure seems likely to have involved his sons. Tramiel valued family above all else, and, like a typical small businessman, dreamed of leaving “his” company to his three sons. Whether by coincidence or something else, it even worked out that each son had an area of expertise that would be critical to running a company like Commodore. Sam, the eldest, had trained in business management at York University, while Gary, the youngest, was a financial analyst with a degree from Menlo College and experience as a stockbroker at Merrill Lynch. Leonard, the middle child, was the intellectual and the gearhead; he was finishing a PhD in astrophysics at Columbia, and was by all accounts quite an accomplished hardware and software hacker. Sam and Gary already worked for Commodore, while Leonard planned to start as soon as he finished his PhD in a few more months. Various witnesses have claimed that Tramiel the elder now wished to begin more actively grooming this three-headed monster to take more and more of his responsibilities, and someday to take his place. Feeling nothing good could come out of such blatant nepotism inside a publicly traded corporation that was trying to put its somewhat seedy history behind it, Gould refused absolutely to countenance such a plan. Given Tramiel’s devotion to his family and his attitude toward Commodore as his personal fiefdom, it does make a degree of sense that this particular rejection might have been more than he could stomach.

In any case, Tramiel was gone, and Gould, who had made his fortune in the unglamorous world of warehousing and shipping and was reportedly both a bit jealous of Tramiel’s high profile in an exciting, emerging industry and a bit embarrassed by his gruff, untutored ways, didn’t seem particularly distraught about it. The man he brought in to replace him could hardly have been more different. Marshall F. Smith was a blandly feckless veteran of boardrooms and country clubs who had spent his career in the steel industry. It’s hard to grasp just why Gould latched onto Smith of all people. Perhaps he was following the lead of Apple, who the previous year had brought in their own leader from outside the computer industry, John Sculley. Sculley, however, understood consumer marketing, having cut his teeth at Pepsi, where he was the mastermind behind the Pepsi Challenge, still one of the most iconic and effective advertising campaigns in the long history of the Cola Wars. The anonymous world of Big Steel offered no comparable experience. Smith’s appointment was the first of a long string of well-nigh incomprehensible mistakes Gould would make over the next decade. Engineers who were initially thrilled to have proper funding and actual budgets at last were soon watching with growing concern as Smith puttered about with a growing management bureaucracy and let the company drift without direction. Many were soon muttering that it was often better to make a decision — even the wrong one — than to just let things hang. Whatever else you could say about Jack Tramiel, he never lacked the courage of his convictions.

Commodore’s first significant new models, which reached stores at last in October of 1984, more than two years after the Commodore 64, hardly did much to inspire confidence in the new regime. Nothing about the Commodore 16 and the Plus/4 made any sense at all. The 16 was an ultra-low-end model with just 16 K of memory, long after the time for such a beast had passed. The trend in even inexpensive 8-bit computers was onward, toward the next magic number of 128 K, not backward to the late 1970s.

The Commodore Plus/4

As for the Plus/4, which like the 64 was built around a variant of the 6502 CPU and had the same 64 K of memory but was nevertheless incompatible… well, it was the proverbial riddle wrapped in a mystery inside an enigma. It was billed as a more “serious” machine than the 64, a computer for “home and business applications” rather than gaming, and priced to match at about $300, more than $100 more than the 64. It featured four applications built right into its ROM (thus the machine’s name): a file manager, a word processor, a spreadsheet, and a graphing program. All were pathetically subpar even by the standards of Commodore 64 applications, hardly the gold standard in business computing. The Plus/4 lacked the 64’s sprites and SID sound chip, which made a degree of sense; for a dismaying number of years yet a lack of audiovisual capability would be taken as a signifier of serious intent in computing. But why did it offer more colors, 128 as opposed to the 64’s 16? And as an allegedly more serious computer, why didn’t it offer the 80-column display absolutely essential for comfortable word processing and other typical productive tasks? And as a more serious (and expensive) computer, why did it have a rubbery keyboard almost as awful to type on as the IBM PCjr’s Chiclet model? And would all those serious, more productive buyers really be doing a lot of BASIC programming? If not, why was one of the main selling points a much better BASIC than the bare-bones edition found in the 64? Info, a magazine that would soon build a reputation for saying the things about Commodore’s bizarre decisions that nobody else would, gave the Plus/4 a withering review:

The biggest problem with the Plus/4 is the fundamental concept: an 8-bit, 64 K, 40-column desktop personal computer. Commodore already makes the best 8-bit, 64 K, 40-column desktop personal computer you can buy, with literally thousands of products supporting it! Why should consumers want a “new” machine with no significant advances, several new limitations, and virtually no third-party product support? And why would a company with no competition in the under-$500 category bring out an incompatible [machine] that can’t compete with anybody’s machine except their own? It just doesn’t compute!

Info ran a wonderfully snarky contest in the same issue, giving away the Plus/4 they’d just reviewed. After all, it was “sure to become a collector’s item!” Even the more staid Compute!’s Gazette managed to flummox a poor Commodore representative with a single question: “Why buy a 264 [a pre-release name for the Plus/4] instead of a 64 that has a word processor and, say, a Simon’s BASIC? It would be the equivalent of the 264 for less money.” Commodore happily claimed that the Plus/4 had enough utility built right in for the “average small business” (maybe they meant one of the vast majority that fail within a year or two anyway), but in reality it seemed like it had been cobbled together from spare parts that Commodore happened to have lying around. In fact, that’s not far from what happened — and Tramiel actually bears as much responsibility for the whole fiasco as the clueless Marshall Smith.

Tramiel, you’ll remember, had driven away the heart of his engineering team in his usual hail of recriminations and lawsuits shortly after they had created the 64 for him. He did eventually find more talented young engineers, notably Bil Herd and Dave Haynie. (Commodore always preferred their engineers young and inexperienced because that way they didn’t have to pay them much — a strategy that sometimes backfired but was sometimes perversely successful, netting them brilliant, unconventional minds who would have been overlooked by other companies.) When Herd arrived at Commodore in early 1983, engineers had been tinkering for some time with a new video and audio chip, the TED (short for Text Editing Device). With engineering straitened as ever by Tramiel’s aversion to spending money, the 23-year-old Herd soon found himself leading a project to make the TED the heart of a new computer, despite the fact that it was in some ways a step back, lacking the sprites of the 64’s VIC-II chip and the marvelous sound capabilities of its SID chip. Marketing came up with the dubious idea of including applications in ROM, which by all accounts delighted Tramiel.

Tramiel, who at some fundamental level still thought of the computers he now sold like the calculators he once had, failed to grasp that the whole value of a computer is the ability to do lots of different things with it, to have lots and lots of options its designers may never have anticipated, all through the magic of software. Locking applications into ROM, making them impossible to replace or update, was kind of missing the point of building a computer in the first place. Failing to understand that a computer is only as good to consumers as the quality and variety of its available software, Tramiel also saw no problem with making the new machine incompatible with the 64. It seems to have come as a complete surprise to him when the machine was announced at that fateful Winter CES and everyone’s first question was whether they could use it to run the Commodore 64 software they already had.

After Tramiel’s abrupt departure, Commodore pushed ahead with the 16 and Plus/4 in the muddled way that would be their wont for the rest of the company’s life, despite a skeptical press and utterly indifferent consumers. It all made so little sense that some have darkly hinted at a conspiracy hatched by Tramiel amongst his remaining loyalists at Commodore to get the company to waste resources, time, and credibility on these obvious losers. (Tramiel recruited a substantial number of said loyalists to join him after he purchased Atari and got back in the home-computer game — exactly the sort of thing for which he so often sued others. But that’s a story for a later article.) Incredibly, given the cobbled-together nature of the machines, it took nine more months after that CES to finally get the 16 and Plus/4 into production and watch them duly flop. Again, such a glacial pace would prove to be a consistent trait of the post-Tramiel Commodore.

By the time they did appear at last, the poor, benighted 16 and Plus/4 had more working against them than just their own failings, considerable as those may have been. The year as a whole was marked by failures in the home-computer segment of the market. Atari was reeling. Coleco was taking massive losses on their tardy entry into the home-computing field, the Adam. And of course I’ve already told you about the IBM PCjr.

Even Apple, who had enjoyed a splashy, successful launch of their new higher-end Macintosh (another story for a later date), had a somewhat disappointing new model amongst their bread-and-butter Apple II line. The “c” in the Apple IIc’s name stood for “compact,” and it was indeed a much smaller version of Steve Wozniak’s old evergreen design. Like the Macintosh, it was a closed system designed for the end user who just wanted to get work (or play) done, not for the hackers who had adored the earlier editions of the II with their big cases and heaps of inviting expansion slots. The idea was that you would get everything you, the ordinary user, really needed built right in: all of the fundamental interface cards, a disk drive, a full 128 K of memory (as much as the Macintosh), etc. All you would really need to add to have a nice home-office setup was a monitor and a printer.

The Apple IIc

But the IIc was not envisioned just as a more practical machine: as the only II model after the first in which Steve Jobs played an important role, it evinced all of his famous obsession with design. Indeed, much of the external look and sensibility that we associate with Apple today begins as much here as with the just slightly older — and, truth be told, just slightly clunkier-looking — first Macintosh model. The Apple IIc was the first product of what would turn into a longstanding partnership with the German firm Frog Design. It marks the debut of what Apple referred to as the “Snow White” design language — slim, modern, sleek, and, yes, white. Everything about the IIc, including the packaging and the glossy manuals inside, oozed the same chic elegance.

Apple introduced the IIc at a lavish party and exhibition, dubbed “Apple II Forever,” held in San Francisco’s Moscone Center in April of 1984, just three months after a similar shindig to launch the Macintosh. The name was chosen to mollify restless Apple II owners who feared — rightly so, as it would turn out; even at “Apple II Forever” Jobs made time for a presentation on “The First 100 Days of Macintosh” — that Sculley, Jobs, and their associates had little further interest in them. Geniuses that they have always been for burnishing their own myths, Apple built a museum right there in the conference center, its centerpiece a replica of the garage where it had all begun. The IIc unveiling itself was an audiovisual extravaganza featuring three huge projection screens for the music video Apple had commissioned for the occasion. The most dramatic and theatrical moment came when Sculley held the tiny machine above him onstage for the first time. As the crowd strained to see, he asked if they’d like a closer look. Then the house lights suddenly came up and every fifth person in the audience stood up with an Apple IIc of her own to show and pass around.

Apple confidently predicted that they would soon be selling 100,000 IIcs every month on the strength of the launch buzz and a $15 million advertising campaign. In actuality the machine averaged just 100,000 sales per year over its four years in Apple’s product catalogs. The old, ugly IIe outsold its fairer sibling handily. This left Apple in a huge bind for a while, for they had all but stopped production of the IIe in anticipation of the IIc’s success while wildly overproducing IIcs for a rush that never materialized. Thus for some time stores were glutted with the IIcs that consumers didn’t want, while the IIes that they did want were nowhere to be found. (It’s interesting to consider that during each machine’s first year on the market the PCjr, forever tarred with the label of outright flop, almost certainly sold more units than the IIc, which has escaped that label. Narratives can be funny things.)

It remains even today somewhat unclear why the world never embraced the IIc as it had the three Apple II models that preceded it. There’s some evidence to suggest that consumers, not yet conditioned to expect each new generation of computing technology to be both smaller and more powerful than the previous, took the IIc’s small size to be a sign that it was not as serious or powerful as the IIe. Apple was actually aware of this danger before the IIc debuted. Thus the advertising campaign worked hard to explain that the IIc was more powerful than its size would imply, with the tagline, “Announcing a technological breakthrough of incredible proportions.” Yet it’s doubtful whether this message really got through. In addition, the IIc was, like the PCjr, an expensive proposition for the home-computer buyer: almost $1300, $300 more than a basic IIe. For that price you got twice the memory of the IIe as well as various other IIe add-on options built right in, but the value of all this may have been difficult for the novice buyer, the IIc’s main target, to grasp. She may just have seen that she was being asked to pay more for a smaller and thus presumably less capable machine, and gone with the bigger, more serious-looking IIe (if anything from Apple).

Then again, maybe the IIc was just born under a bad sign. As I’ve already noted, nobody was having much luck with their new home computers in 1984, almost regardless of their individual strengths and weaknesses.

But why was this trend so universal? That’s what people inside the industry and computer evangelists outside it were asking themselves with increasing urgency as the year wore on. As 1984 drew toward a close, the inertia began to affect even the most established warhorses, the Commodore 64 and the Apple IIe. Both Commodore and Apple posted disappointing Christmas numbers, down at least 20% from the year before, and poor Commodore, now effectively a one-product company completely reliant on continuing sales of the 64, sank back well below that magic billion-dollar threshold again. In the grand scheme of things the Commodore 64 was still a ridiculously successful machine, by far the bestselling computer in the world and the preeminent gaming platform of its era. Yet there increasingly seemed to be something wrong with the home-computer revolution as a whole.

Commodore 64 startup screen

The fact was that a backlash had been steadily building almost from the moment that the spectacular Christmas 1983 buying season had ended. Consumers had begun to say, and not without considerable justification, that home computers promised far more than they delivered. Watching all those bright, happy faces in television and print advertising, people had bought computers expecting them to do the things the computers in those advertisements were doing. As Commodore’s advertising put it, “If you’re not pleased with what’s on your TV set tonight, simply turn on your Commodore 64.” Yet what did you get when you turned on your 64 — after you figured out how to connect it to your TV in the first place, that is? No bright fun, just something about 38,911 somethings, a READY prompt, and a cryptically blinking cursor. Everything about television was easy; everything about computers was hard. Computers had been sold to consumers like any other piece of consumer electronics, but they were not like any other piece of consumer electronics. For the vast majority of people — those who had no intrinsic fascination with the technology itself, who merely wanted to do the sorts of things those families on TV were doing — they were stubborn, frustrating, well-nigh intractable things. Ordinary consumers were dutifully buying computers, but computers were at some fundamental level not yet ready for ordinary consumers.
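For those who never saw one in action, the 64’s power-on screen read in its entirety as follows; the 38,911 in question were the bytes of the machine’s memory left free for BASIC programs after the system had claimed its share:

    **** COMMODORE 64 BASIC V2 ****
    64K RAM SYSTEM  38911 BASIC BYTES FREE
    READY.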

The computer industry was still unable to really answer the question which had dogged and thwarted it ever since Radio Shack had run the first ads showing a happy housewife sorting her recipes on a TRS-80 perched on the kitchen table: why do I, the ordinary man or woman with children to feed and a job to keep, need one? Commodore had cemented the industry’s go-to rhetoric with the help of William Shatner in their VIC-20 advertising campaign that first carved out a real market segment for home computers. You needed a computer for productivity tasks and for your children’s future, “Johnny can’t read BASIC” having replaced “Johnny can’t read” as the marker of a neglectful parent. Entertainment was relegated to an asterisk at the end: “Plays great games too!”

Yet, honestly, how productive could you really be with even the Commodore 64, much less the 5 K VIC-20? Some people did manage to do productive things with their 64s, but most of those who did forgot or decided not to ask themselves a simple question: is doing this on the computer really easier than the alternative? The answer was almost always no. Hobbyists chose to do things on the computer because it was cool, not because it was practical. Never mind if it took far more effort to keep one’s address book on the Commodore 64, what with its slow disk drive and quirky, unrefined software, than it would have to just have a paper card file. Never mind if it was much riskier as well, prone to deletion by an errant keystroke or a misbehaving disk drive. It was cooler, and that was all that mattered — to a technology buff. Most other people found it easier to address their Christmas cards by hand than to try to feed envelopes through a tractor-fed dot-matrix printer that made enough noise to wake the neighbors.

Perhaps the one truly compelling productive use of a machine like the Commodore 64 in the home was as a word processor. Kids today can’t imagine how, back in the era of typewriters, students once despaired when their teachers told them that a report had to be typed, can’t conceive how difficult it was to get anything on paper in typewritten form when every mistake made by untutored fingers meant trying to decide between pulling out the Liquid Paper or just starting all over again. But even word processing on the 64 was made so painful by the 40-column screen and manifold other compromises that there was room to debate whether the cure was worse than the disease. Specialized hardware-based word processors became hugely popular during this era for just this reason. These single-function, all-in-one devices were much more pleasant to use than a Commodore 64 equipped with a $30 program, and cheaper than buying a whole computer system, especially if you went with a higher-priced and thus more productively useful model like the Apple II.

The idea that every child in America needed to learn to program, lest she be left behind to flip burgers while her friends had brilliant careers, was also absurd on the face of it. It was akin to declaring during the days of the Model T that every citizen needed to learn to strip down and rebuild one of these newfangled automobiles. Basic computer literacy was important (and remains so today); BASIC literacy was not. What a child really needed to know could largely be taught in school. Parents needn’t have fretted if Junior preferred reading, listening to music, playing sports, or practicing origami to learning the vagaries of PEEKs and POKEs in BASIC 2.0. There would be time enough for computing when computing and Junior had both grown up a bit.

So, everything had changed yet nothing had changed since the halcyon days of the trinity of 1977. Computers were transforming the face and in some cases the very nature of business, yet there remained just two compelling reasons to have one in the home: 1) for the sheer joy of hacking or 2) for playing games. Lots more computers were now being used for the latter than the former, thanks to the vastly more and vastly better games that were now available. But for many folks games just weren’t a compelling enough reason to own one. The Puritan ethic that makes people feel guilty about their pleasures was as strong in America then as it remains today. It certainly didn’t help that the media had been filled for several years now with hand-wringing about the effect videogames were having on the psyches of youngsters. (This prompted many computer-game publishers of this period to work hard, albeit likely with limited success, to label their computer games as something different, something more cerebral and rewarding and even, dare we say it, educational than their simplistic videogame cousins.)

But, perhaps most of all, computers still remained quite expensive when you really dug into everything you needed for a workable system. Yes, you could get a Commodore 64 for less than $200 by the Christmas of 1983. But then you needed a disk drive ($220) if you wanted to do, well, much of anything with it; a monitor ($220) if you wanted a nice picture and didn’t want to tie up the family television all the time; a printer ($290) for word processing, if you wanted to take that fraught plunge; a modem ($60) to go online. It didn’t take long until you were approaching four digits, and that’s without even entering into a discussion of software. There was thus a certain note of false advertising in that sub-$200 Commodore 64. And because these machines were being sold through mass merchandisers rather than dealers, there was no one who really knew better, who could help buyers to put a proper system together at the point of sale. Consumers, conditioned by pretty much everything else that was sold to them to expect a product that worked out of the box, were often baffled and frustrated when they realized they had bought an expensive doorstop. Many of the computers sold during that Christmas of 1983 were turned on a few times only, then consigned to the back of the closet or attic to gather dust. The bad taste they put in many people’s mouths would take years to go away. Meanwhile the more complete, useful machines, like the Apple IIc and the PCjr, were still more expensive than a complete Commodore 64 system — and the games on them weren’t as good to boot. Hackers and passionate gamers (or, perhaps more commonly, their generous parents) were willing to pay the price. Curious novices largely were not. Faced with no really good all-purpose options, many — most, actually — soon decided home computers just weren’t worth it. The real home-computer revolution, as it turned out, was still almost ten years away. About 15% of American homes had computers — at least ostensibly; many of them were, as just mentioned, buried in closets — by January 1, 1985, but that figure would rise with agonizing slowness for the rest of the decade. People could still live perfectly happy lives fully plugged into the cultural discourse around them and raise healthy, productive children in the process without owning a computer. Only much later, with the arrival of the World Wide Web and computers equipped with more intuitive graphical user interfaces for accessing it, would that change.

Which is not to say that the software and information industries that had exploded in and around the home-computer revolution during 1982 and 1983 died just like that. Many of their prominent members, however, did, as the financial gambles they had taken in anticipation of the home-computer revolution came back to haunt them. We’ve just seen how Sierra nearly went under during this period. Muse Software and Scott Adams’s Adventure International, to name two other old friends from this blog, weren’t so lucky; both folded in 1985. Electronic Arts survived, but steered their rhetoric and choice of titles somewhat away from Trip Hawkins’s original vision of “consumer software,” tilting instead toward the hardcore with proven genres like the CRPG and the adventure game.

Magazines were even harder hit. By early 1984 there were more than 300 professionally published computing periodicals of one sort or another, many of them just founded during the boom of the previous year. Well over half of these died during 1984 and 1985. Mixed in with the dead Johnny-come-latelys were some cherished veteran voices, among them pioneers Creative Computing (1974), SoftSide (1978), and Softalk (1980). Softalk’s demise, after exactly four years and 48 issues of sometimes superb people-focused journalism, came as a particular blow to the Apple II community; Apple historian Steven Weyhrich names this moment as nothing less than the end of the “golden age” of the Apple II. Those magazines that survived often did so in dramatically shrunken form. Compute!, for instance, went from 392 pages in December of 1983 to 160 ten months later.

Yet it wasn’t all doom and gloom. Paradoxically, some software publishers still did quite well. Infocom, for example, had the best single year in their history in 1984 in terms of unit sales, selling almost 750,000 games. It seemed that, with more options than ever before, software buyers were becoming much more discerning. Those publishers like Infocom who could offer them fresh, quality products showing a distinctive sensibility could do very well. Those who could not, like Adventure International with their tired old two-word parsers and simplistic engines, suffered the consequences. That real or implied asterisk (“Plays great games too!”) at the end of the advertising copy remained the great guilty secret of the remaining home-computer industry, the real reason computers were in homes at all. Thankfully, the best games were getting ever more complex and compelling; otherwise the industry might have been in even more trouble than it actually was.

Indeed, with a staggering number of machines already out there and heaps still to be sold for years to come, the golden age for Commodore 64 users was just beginning. This year of chaos and uncertainty was the year that the 64 really came into its own as a games machine, as programmers came to understand how to make it sing. Companies who found these keyboard maestros would be able to make millions from them. The home-computer revolution may not have quite panned out as anticipated and the parent company may have looked increasingly clueless, but for gamers the Commodore 64 stood alone with its combination of audiovisual capability, its large and ever growing catalog of games, and its low price. What with game consoles effectively dead in the wake of Atari’s crash and burn, all the action was right here.

In that spirit, we’ll look next time at the strange transformation that led one of our stodgiest old friends from earlier articles to become the hip purveyor of some of the slickest games that would ever grace the 64.

(The indispensable resources on Commodore’s history remain Brian Bagnall’s On the Edge and its revised edition, Commodore: A Company on the Edge. Frank Rose’s West of Eden is the best chronicle I know of this period of Apple’s history. The editorial pages and columnists in Compute! and Compute!’s Gazette provided a great unfolding account of a chaotic year in home computing as it happened. Particular props must go to Fred D’Ignazio for pointing out all of the problems with the standard rhetoric of the home-computer revolution in Compute!‘s May 1984 issue — but he does lose points for naming the PCjr as the answer to all these woes in the next issue.)

The Unmaking and Remaking of Sierra On-Line

King's Quest

What happened for Ken and Roberta Williams in less than three years would have gone to anyone’s head. As the 1980s dawned, their lives were utterly ordinary. Ken was a business programmer putting in long hours every day in Los Angeles, Roberta his pretty, quiet, vaguely dissatisfied stay-at-home wife. Six months later she was a published game designer (to the extent that description meant anything in 1980), and the couple was sitting at their kitchen table opening the mail in disbelief as orders poured in for their little homemade adventure game. A year later, Ken was head of a burgeoning software house in their dream setting, nestled in the heart of the California Redwoods, and Roberta was his star designer. A year after that, they and the company they had built were software superstars. Glossy magazines and television shows begged for access and interviews; entertainment moguls flew them to New York to wine and dine them at the 21 Club; venture capitalists lined up to offer money and advice, telling them they were at the forefront of the next big thing in media; big corporations offered to buy their whole operation, with starting offers of $20 million or more. Big franchises approached to talk about licensing deals: Jim Henson Associates, Disney, the Family Circus comic strip. For Ken, two of whose greatest heroes were Jim Henson and Walt Disney, such offers were flabbergasting. Late in 1982 IBM, by at least some measures the biggest, most powerful company in the world, humbly came knocking at the Williamses’ door to ask if they’d be willing to work with them to develop software for their new home computer.

Yes, it would have gone to anyone’s head. Ken said yes to just about everyone, with the exception only of the outright buyout offers; he was having far too much fun to entertain them. The pundits, advisers, and investors that surrounded Ken were all telling him that the new low-cost home computers were the wave of the future, destined to replace the old Atari VCS game console and its competitors in the hearts and minds of consumers. This was the new gravy train, and the key to riding it was to get lots and lots of product out there to feed customers hungry for games for their new Commodore VIC-20s, Texas Instruments 99/4As, and Coleco Adams. Don’t stress too much about any given title, they said; just get lots of them out there. Simpler games were actually better, because then you could port them more quickly from platform to platform and pack them onto cartridges for all those ultra-low-end users without even a cassette drive. Ken, with these words ringing in his ears, dutifully made plans to push out 100 separate products in 1983 alone. He amassed a fleet of programmers to churn out action games which could be easily ported from platform to platform. Sierra spelled out this new approach in their “strategy outline” for 1983:

We believe the home-computer market to be so explosive that “title saturation” is impossible. The number of new machines competing for the Apple/Atari segment in 1983 will create a perpetually new market hungry for winning 1982 titles. We will exploit this opportunity.

Mr. Cool advertisement
VIC-20 advertisements

Housing his growing fleet of salaried, workaday programmers — Ken had decided that dealing with artistically temperamental programmers like John Harris of Jawbreaker fame just wasn’t worth the trouble, that programming really shouldn’t be considered a creative endeavor at all — was soon becoming a problem. Growing technical, clerical, marketing, and warehouse staffs were also pushing the company’s total head count rapidly toward 100. Thus when the developer who owned Sierra’s office facilities offered to build a brand-new building to house the company, a lovely place which perfectly suited the company’s image (if not, increasingly, its reality) as a clan of computer artisans living in the woods, Ken happily acquiesced, accepting rent in the vicinity of $25,000 per month.

The Sierra “redwood” building, custom-built for them in 1982

The new offices weren’t the only building contract Ken signed around this time. Figuring that if they were going to be entertainment moguls they needed to live the part, Ken and Roberta hired an architect to design a sprawling 10,000-square-foot, $800,000 house — huge money in this rural area — on the Fresno River, complete with racquetball and volleyball courts, full-length wet bar, and a mini-arcade with all the latest games.

But by the time Ken and Roberta moved on Labor Day weekend, 1983, the fantasy of their lavish housewarming party, which included a professional comedy troupe brought in from San Francisco for the occasion, was undercut by some slowly dawning realities. Sierra’s first big partnership with Big Media, on the Dark Crystal game, had been a major artistic and commercial disappointment, done in by the tired old Hi-Res Adventure engine that powered it and a rote design by a Roberta Williams who seemed determined not to grow past what she had done for Mystery House. Their one real hit of the year, meanwhile, had not been any of the titles from Ken’s new programmers, but rather John Harris’s loving, officially licensed port of the arcade game Frogger, a port done so well that some said it surpassed its inspiration. Alas, Frogger was the last game Harris did for Sierra; he had left some time before, having signed on with Synapse Software, whom he considered more quality-oriented. It was already beginning to dawn on them at that party that they might actually make less this year than they had the last even as the new building and growing staff had increased their expenses enormously. Soon after, things really started to go south.

Much of the software that Sierra was now producing was on cartridges, which were both more expensive to produce than disks or tapes and took much longer to duplicate. With much of the industry following Sierra’s plan of churning out new games practically by the dozen, production capacity at the relatively limited number of facilities capable of making cartridges was at a premium. Sierra was forced to place huge orders in June or July for the games they hoped to be selling huge numbers of come Christmas. But a funny thing happened during the six months in between: the market for the VIC-20, the TI 99/4A, and the Coleco Adam, the machines for which most of these cartridges were produced, collapsed. Jack Tramiel, you see, had won the Home Computer Wars of 1983 by then, driving TI right out of the market. In the process, he had just about killed his own VIC-20 as well; the price of the vastly more desirable and capable Commodore 64 had dropped so low that there was little point in buying a VIC-20 instead. As for the Adam… well, it never had a chance; by the time it arrived the war was largely over and the victor already determined. The Commodore 64 rocketed out of that Christmas the new center of the gaming universe, a position it would hold for the next several years. Yet all Sierra had to sell Commodore 64 owners were a few simple games ported from the VIC-20. And they had tens of thousands of cartridges, millions of dollars of inventory which they couldn’t move even for ten cents on the dollar, sitting in warehouses. Meanwhile their shiny licensing deals were also turning out to be of little benefit to the bottom line. Sierra felt that they were doing all the work on these and all the profits — what little there sometimes was — were going to the licensees. As 1984 ground on, it became clear that the company was in the most dire of straits, unable even to make the rent payments on their fancy new office building.

The only thing to do was to start cutting. In a matter of days the company shed the extra skin it had built up, going from 100 employees to an absolutely essential core of about 20. A desperate Ken went to Sierra’s landlord and offered him a 10% share in the company if he would just forgive them the rent for a few months, while they got back on their feet. Figuring that 10% of a dead company was worth less than the rent he might be able to get out of them now, he said no thanks. In the end Ken was able to negotiate only to give back some of the building for other tenants. He and Roberta and their closest associates paid some of the remaining rent for a while using second mortgages and personal credit cards. It looked like this dream they had been living was about to end less than four years after it had begun, that soon they might end up right back where they had started in the suburbs of Los Angeles. They might have packed it in but for one remaining hope: that contract they had signed with IBM back in the halcyon days.

Sierra’s relationship with IBM actually went back even further than that contract. IBM first partnered with Sierra during the run-up to the original IBM PC’s launch in 1981, when they hired them to port The Wizard and the Princess, one of the biggest Apple II titles of that year, to their new machine. Sierra first experienced the legendary IBM secrecy then. Prototypes would arrive in X-ray-resistant lead chests sealed with solder, and were expected to be stored and used in windowless rooms that were to be kept locked at all times. Despite being a relatively minor part of the PC’s launch, Sierra, and Ken in particular, got on well with IBM. For all the party-hearty persona Ken could put on (as well described in Hackers and elsewhere on this blog), he had spent his previous life working for big technology companies like IBM. He understood how they worked, knew what it meant to shake down a new computer system and find the bugs and flaws while also obeying the rules of corporate hierarchy. IBM likely found him a refreshing change from both the un-technical MBAs and the technically masterful but socially unsophisticated hackers that were most of his peers. At any rate, they came back to Sierra soon after initiating the PCjr project.

IBM flew Ken and Jeff Stephenson, the man who was quickly assuming Ken’s role as hacker in chief at Sierra as Ken got more and more absorbed with the business side, out to their offices in Boca Raton, Florida. After the NDAs and other legal niceties that were part and parcel of dealing with IBM, they explained what the PCjr was to be and asked them to pitch some software that might make a good fit. Ken and Jeff made a number of proposals that were accepted, including HomeWord, an easy-to-use, casual word processor with an early graphical user interface of sorts which Ken and Jeff were already working on; it would wind up IBM’s official word processor for the PCjr. But the most important proposal, the biggest in the history of Sierra On-Line and one which would change adventure gaming forever, was made up on the fly, drawn up on the back of a napkin during a pause in the proceedings.

Sierra was still known most of all for their Hi-Res Adventure line of illustrated adventure games. Unsurprisingly, IBM very much wanted something along those lines for the PCjr. But they had some specific requests for changes from Sierra’s traditional approach, which if nothing else proved that not everyone at IBM was as blissfully ignorant of gaming as legend would have it. They asked for a game that would be replayable, that would be more dynamic and complex in its world modeling, sort of like Ultima and Wizardry (adventures and CRPGs were not yet clearly defined separate genres at this point). They specifically asked that puzzles have multiple solutions, that there be many different possible paths through the game.

Ken and Jeff sensed that they really wanted Sierra to push themselves, to get beyond the tried-and-true Hi-Res Adventure model. And with good reason: as the sales for The Dark Crystal were about to show, Sierra desperately needed to raise their game if they wanted to keep their hand in adventures at all. Next to the games that Infocom was putting out, the Hi-Res Adventure games were painfully primitive. Yet how should they try to compete? Most other publishers, witnessing Infocom’s success with pure text, were beginning to shift their emphasis back to the parsers and the writing, de-emphasizing their pictures or removing them entirely. Infocom, in other words, was replacing Sierra as the model to be emulated. Ken instinctively sensed that this was not the right bandwagon for Sierra to leap aboard, much as they respected the technical accomplishment in Infocom’s games. They were movie people rather than book people; as Ken later said, Sierra had a “mass-market” sensibility which contrasted with Infocom’s “cerebral” approach. Rather than try to ape Infocom like other publishers, why not zig while everyone else zagged, double down on graphics while de-emphasizing text? Besides, one of the main selling features of the PCjr was to be its bright 16-color graphics. Shouldn’t its showcase adventure take advantage of them?

King's Quest

When IBM joined them again in the conference room, Ken and Jeff made their pitch for a new type of adventure game. Most of the screen would be given over to the graphics, like in the Hi-Res Adventures, but the interactivity would now also extend to this part of the display. The player’s avatar would be visible onscreen, with the player able to move him around within each room using a joystick or the arrow keys. The player would still have to type non-movement commands, but now positioning within each room would play an important role: you would have to move right up next to that old tree stump to peer inside, walk up to the kindly forest elf to talk to him, etc. Some text would still have to remain to explain some of what happened, but much of the experience would be entirely visual, more movie than book. Action sequences requiring precise timing and coordination could be introduced. The system also promised to introduce the kind of dynamism that IBM desired in other ways. Other characters and creatures could wander the world, to be dodged, fought, or befriended. What we would today call emergent behavior might arise: the player might hide behind a handy tree when the wicked witch suddenly popped onto the scene. It would be a showstopper, conforming to Ken’s ten-foot rule for software marketing while also introducing whole new tactical layers that had never been seen in adventure games before. IBM signed on happily.

The reaction in Oakhurst was not quite so enthusiastic. Some felt that Ken and Jeff had promised IBM the moon, that this was simply a leap too far. Perhaps remembering Sierra’s last two adventure games, both of which had gone through long, painful development cycles for little commercial reward, they pointedly suggested that Ken go back to IBM and explain that Sierra had bitten off more than they could chew, cut the proposal down dramatically to something more realistically achievable, and try to get IBM to accept it. Ken, realizing that any such action would destroy his credibility with IBM forever, absolutely refused. He pointed out that they had 128 K of memory to work with for this project, a huge figure in comparison to the 48 K they’d had for the Hi-Res Adventure games. He found a critical ally in Roberta, the person who would have to actually design for the system. She simply asked questions until she felt she understood the system and what it would and would not be capable of, digested IBM’s desires for a more dynamic game than was her previous wont, then went to work. Eventually the grumbling mostly ceased and the rest of the staff followed her example.

What with Ken having a company to run, the heavy lifting of turning the proposal into a game engine largely fell to Jeff Stephenson. Just like the Hi-Res Adventure engine, this one was designed from the start to be reusable and extendible. It was initially known as the Game Adaptation Language, or GAL. Ken, however, loathed the cutesiness of that acronym, and it was eventually renamed the Adventure Game Interpreter, or AGI. (I’ll refer to the system as AGI from here on for the sake of consistency.) Soon the trucks bearing the familiar lead-lined crates began arriving in Oakhurst again, and development began in earnest on both the engine and the game it would run. The team chosen for the task consisted of Roberta and about half a dozen programmers and artists. The PCjr projects as a whole, which included the adventure game, HomeWord, and several other pieces of software, were given a top-secret code name: Project Siesta. Still, it’s hard to keep anything a secret in a town as small as Oakhurst. Word quickly spread: “The big fucking company is in town again.”

Some of the process of developing the first AGI game, eventually to be named King’s Quest, was not that far removed from the days of Hi-Res Adventure. The artists still drew each scene on paper using colored pencils. These drawings were then traced using a graphics tablet connected to a computer, where they were stored using the vector-graphics techniques Ken had developed back in the days of Mystery House and The Wizard and the Princess. (When playing King’s Quest on older, slower hardware you can see each new room being drawn line by line. Fascinatingly, what you are actually seeing there are the motions of the stylus as guided by the person who first traced the image all those years ago. Early versions of King’s Quest let you see the process more clearly via an undocumented “slow draw” mode that can be activated by pressing Control-V.) Thanks to this evergreen technique, the image of each room occupies only 0.5 to 2.5 K. The same data also tells the interpreter where Sir Graham,[1] the game’s protagonist, can and cannot walk. Boundaries, such as the castle walls you see in the screenshot above, were traced with a special flag activated, and thus incorporated right into the image itself.
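
For the technically curious, here’s a quick sketch in Python of how such a stored-stroke picture format works in principle. To be clear, this is my own illustration of the general technique, not Sierra’s actual code or data format; the Stroke fields and the draw_line callback are assumptions of mine.

from dataclasses import dataclass

@dataclass
class Stroke:
    color: int          # palette index, 0 to 15 for 16-color graphics
    points: list        # the (x, y) vertices the stylus passed through
    is_boundary: bool   # flagged strokes double as "can't walk here" data

def replay(strokes, draw_line):
    """Redraw a room by replaying its stored strokes in their original order."""
    for stroke in strokes:
        for start, end in zip(stroke.points, stroke.points[1:]):
            draw_line(start, end, stroke.color)

Because the strokes replay in the order they were traced, a slow machine effectively re-performs the original artist’s pen movements every time it draws the room.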

Perhaps the trickiest problem that Jeff Stephenson had to wrestle with stems from the fact that we view each room in the game from a three-quarters overhead perspective. The game thus needs to account for a z-axis in addition to the x- and y-axes in order to maintain the illusion of depth. Each object in each room is therefore given something Jeff called its “priority,” essentially its position on the z-axis. An object’s priority can range from 1 to 15, and increases as it gets closer to the “back” of the room. In drawing a scene, the interpreter draws objects of lower priority after those of higher priority. Say that a tree is positioned on the screen at priority 9. If Graham moves vertically, “deeper” into the screen, to, say, priority 11, then moves horizontally “behind” the tree, the tree will conceal him as expected. Up to four moving characters can be in a single room at once, the interpreter constantly adjusting the onscreen image to account for their movements.
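
This is in essence the classic painter’s algorithm. Here’s a tiny Python sketch of the compositing logic as just described, again my own reconstruction rather than anything from Sierra’s source: priority increases toward the back of the room, so drawing back to front means drawing higher priorities first and letting lower ones overpaint them.

def draw_room(objects, blit):
    # objects is a list of (priority, sprite) pairs; blit paints one sprite.
    # Sort by priority, highest (deepest) first, so that nearer objects are
    # painted last and therefore end up on top.
    for priority, sprite in sorted(objects, key=lambda o: o[0], reverse=True):
        blit(sprite)

# The tree at priority 9 is drawn after Graham at priority 11, so its
# pixels overpaint his: he vanishes "behind" the tree.
draw_room([(9, "tree"), (11, "graham")], blit=print)  # prints graham, then tree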

[Screenshot: King's Quest]

The game logic is described using a simple scripting language which is once again descended from the system Ken had developed for the Hi-Res Adventure line. Let’s take a look at one small piece of the scene shown above. In addition to our alter ego Graham, it shows a goat — “object” number 14 — who wanders back and forth in his corral, which in turn spans two rooms, numbers 10 and 11; the room shown above, the leftmost, is room 10. The goat continues to wander unless and until he is tempted to join Graham by a scrumptious-looking carrot. Here’s how the goat’s logic in room 10 is described in AGI:

IF HAS-GOAT 0 AND OBJHIT-EDGE 14 AND EDGE-OBJ-HIT 1 AND GOAT-GONE 0 AND SHOW-CARROT 0 THEN ASSIGN GOAT-ROOM 11, ERASE 10

So, and without getting too lost in the weeds here, if we do not “have” the goat and are not showing him the carrot, and the goat has hit the edge of the screen in his wanderings, remove (“erase”) him from room 10 and put him in room 11. Room 10 alone has 180 such lines of script to describe all of its interactive possibilities. Like most software, an AGI game is more complex than it looks. This is true from the standpoint of both the engine programmers and the scripters. In the context of its time, AGI is nothing less than a stunning technological tour de force — one which, like all the best software, looks easy.
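
To get a feel for what the interpreter is doing with a line like that, here’s a toy reconstruction in Python. The real AGI interpreter was certainly more involved than this, and the flag handling here is my own guesswork, but the condition-and-action shape of the logic is the same.

flags = {"HAS-GOAT": 0, "OBJHIT-EDGE": 14, "EDGE-OBJ-HIT": 1,
         "GOAT-GONE": 0, "SHOW-CARROT": 0, "GOAT-ROOM": 10}

def erase_object(obj, room):
    print("object", obj, "erased from room", room)

def goat_rule():
    # IF HAS-GOAT 0 AND OBJHIT-EDGE 14 AND EDGE-OBJ-HIT 1
    # AND GOAT-GONE 0 AND SHOW-CARROT 0 ...
    if (flags["HAS-GOAT"] == 0 and flags["OBJHIT-EDGE"] == 14
            and flags["EDGE-OBJ-HIT"] == 1 and flags["GOAT-GONE"] == 0
            and flags["SHOW-CARROT"] == 0):
        # ... THEN ASSIGN GOAT-ROOM 11, ERASE 10
        flags["GOAT-ROOM"] = 11    # the goat now belongs to room 11
        erase_object(14, room=10)  # and disappears from the room on display

goat_rule()  # prints: object 14 erased from room 10

Multiply that by 180 lines for room 10 alone, and you begin to appreciate the scripters’ workload.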

The technical virtuosity on display here made it rather easy for reviewers of the time to lose sight of the actual game it enabled, a painfully common phenomenon in the field of videogames. Indeed, I was anticipating reviewing King’s Quest more as a piece of technology than as an adventure game, particularly given that I frankly don’t think very highly of Roberta’s work on the Hi-Res Adventure line. I was, however, pleasantly surprised by her work here. King’s Quest’s plot is almost as basic as that of the original Adventure: the kingdom of Daventry is in some sort of vaguely defined trouble, and the aging King Edward needs you, the brave knight Sir Graham, to find three magic items that can save it. Since he conveniently has no heirs, do that and “the throne will be yours.” King’s Quest is another treasure hunt, nothing more or less.

Still, and making allowances for the newness of the technology, Roberta does a pretty good job with it. Many of the characters and situations you encounter as you roam Daventry are drawn, and not without a certain charm, from classic fairy tales: Hansel and Gretel, Jack and the Beanstalk, Rumpelstiltskin. The last of these is at the core of the one howlingly awful puzzle in the game, which starts out dodgy and just keeps layering on complications until it’s well-nigh impossible.

[Screenshot: King's Quest]

(For the record: you meet an old gnome-looking sort of fellow who gives you three chances to guess his name. If you’re familiar with your Brothers Grimm, you might divine that he’s Rumpelstiltskin given the fairy-tale characters everywhere else in the game. But, no, “That is very close but not quite right.” Okay, you do have a note you found elsewhere which says, “Sometimes it is wise to think backwards.” So, “nikstlitslepmur.” No — “You have the right idea, but your thinking is just a little bit off.” It turns out you have to write the name using a backwards alphabet.)
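
The “backwards alphabet,” for anyone who wants to see it worked out, is a simple mirror-image substitution: a becomes z, b becomes y, and so on. A few lines of Python show what the game is apparently fishing for:

def backwards(word):
    # map each letter to its mirror: a <-> z, b <-> y, c <-> x, ...
    return "".join(chr(ord("a") + ord("z") - ord(c)) for c in word)

print(backwards("rumpelstiltskin"))  # prints ifnkovhgroghprm

Good luck divining that without a walkthrough in 1984.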

But even here the IBM design brief saves Roberta from her worst instincts. There is, thank God, an alternate way to proceed without solving this puzzle, even if it does cost you some points. And most of the other puzzles are… not that bad, actually. Some are even pretty clever. That may sound like damning with faint praise, but given some of the absurdities of Time Zone it’s nevertheless praise indeed. There are a huge number of ways to go through King’s Quest, what with all of the alternate solutions on offer, and the game feels consciously, holistically designed in a way that no previous Roberta Williams game did.

King’s Quest also makes use of most of the new possibilities afforded by the AGI system. There are enemies to be dodged and eventually dispatched — the witch out of Hansel and Gretel is particularly harrowing — and tricky action sequences to be navigated. King’s Quest is, for the most part, a competent, enjoyable game even when divorced from its place in history as the first use of the revolutionary technology that powers it. It’s also reasonably solvable, at least if you aren’t too fixated on getting the maximum possible points. Realistically, it needed to be no more than a technological proof of concept to be a bestseller, but it manages to be considerably more than that. It acquits itself very well overall as the herald of a new paradigm for adventure gaming.

As development continued and Sierra’s financial position began to look more precarious, stress began to mount. Ken’s wish to find average, uncreative, but reliable programmers was perhaps amplified more than ever by some of the characters he ended up having to assign to the King’s Quest project. Whether because of its location near the old hippie meccas of northern California or just something in the water, Sierra always seemed to be filled with eccentrics despite Ken’s best efforts to run a more buttoned-down operation. One fellow was particularly noted for his acid consumption and his fascination with Fozzie Bear, and looked freakish enough that (in John Williams’s words) “when he went into a restaurant, everyone looked at him.” Another, similarly “off” programmer acted like a cross between a mad scientist and Zaphod Beeblebrox of Hitchhiker’s Guide to the Galaxy fame. Near the end of the project one developer, angry at the long hours he had been working, held a critical piece of code for ransom until Sierra paid him for dozens of hours of overtime to which he felt entitled. They agreed to pay, got the code, and promptly reneged, citing the principle of American contract law which holds that a contract is null and void if one of the parties signs under duress.

When IBM officially unveiled the PCjr and its horrid Chiclet keyboard in November of 1983, Sierra was as surprised as anyone. For all their involvement in the machine’s development, they never had access to a real production model. Ken had to go down to ComputerLand and buy his own, just like everyone else, when the PCjr shipped at last in March of 1984. His first machine didn’t remain in his possession for very long. He went to the movies on his way home, leaving it in his car, only to find it stolen when he returned. The theft must have come to seem like an omen as it became more and more clear that the PCjr, seemingly Sierra’s last hope, was flopping in the marketplace.

[Screenshot: King's Quest]

And that was doubly a shame because — and I know I may seem to be belaboring the point, but I can hardly emphasize this enough — King’s Quest was amazing in its time. Even magazines devoted to other platforms felt compelled to talk about it; it was just that revolutionary. King’s Quest, marketed under IBM’s official imprint with cover art apparently drawn by someone who had never seen the game, did sell pretty well by the standards of PCjr software, but there just weren’t enough PCjrs being sold to save Sierra. Similarly, a version for the PCjr’s bigger brother, the IBM PC, sold well by the standards of games for that platform, but the entertainment market for such a business-computing stalwart wasn’t up to much. Although the AGI system had been designed to be portable, it had also been designed to run in 128 K of memory. This locked it out of the typical unexpanded Apple IIe (64 K) and out of the biggest gaming platform in the country, the Commodore 64 (also 64 K). Sierra had exactly the right game on exactly the wrong platform. It seemed Ken had backed the wrong horse, a final bad decision that looked to have doomed his company. The situation just got more and more desperate. John Williams, Sierra’s marketing director, recalls placing media buys around this time with no idea how he was going to pay for them when the invoices came due: “This is either going to help, in which case we can deal with the cost of these and maybe negotiate payment on it — or it won’t work and we’ll be gone” anyway.

Two new machines played a big role in saving Sierra. Just a month after the PCjr finally shipped to stores, Apple announced and shipped the fourth incarnation of the Apple II, the IIc. We’ll talk a bit more about it in a future article, but for now suffice to say that the IIc was designed to be a semi-portable, closed appliance computer, in contrast to the hacker’s laboratories that had been previous Apple II models. Most critically for our purposes today, the IIc shipped with 128 K of memory. Its commercial performance would ultimately be rather lukewarm, but it did prompt many users of the older IIe model to upgrade to (at least) 128 K to match its capabilities. In time there were enough 128 K Apple IIs to justify porting the AGI interpreter to the platform.

But it was the Tandy 1000 that saved Sierra in the most immediate sense, that bought time for that critical mass of 128 K Apple IIs to accumulate. It was introduced just as 1984, the most difficult year in Sierra’s history, was winding down. In many ways it was what the PCjr should have been, offering the same graphics and sound capabilities and IBM PC compatibility in a smarter, more usable and expandable package. And it was sold in Radio Shack stores all over the country. In some areas the local Radio Shack was the only place within 200 miles to buy a computer. Sierra smartly developed a strong relationship with Radio Shack in the wake of the Tandy 1000’s announcement. Few other software publishers bothered, meaning that King’s Quest and other Sierra games stood almost alone on the shelves in many of these captive markets. The Tandy 1000, combined with the slowly increasing user base of expanded Apple IIs, gave King’s Quest the opportunity to slowly pull Sierra back from the edge of the abyss, particularly since much of the game’s $850,000 development cost had been funded by IBM. It would take time, but by the end of 1985, with King’s Quest II now already out and doing very well, the company was paying off debts and beginning to grow again.

Ken, Roberta, John, Jeff, and their closest associates had, much to their credit, stuck to their guns and not made the perfectly reasonable decision to pack it in. But they had also, as Ken well realized, gotten very, very lucky. Without the Tandy 1000 and a few other lucky breaks, Sierra could easily have gone the way of Adventure International, Muse, and other big software houses that were flying high in 1982 and dead by 1985. As he has said more recently, it had all been “fun and games” for the first few years. Now he understood how quickly things could go bad with a few wrong decisions, understood what a fragile entity Sierra really was. Most of all, he never wanted to go through another year like 1984 again. The Ken Williams who emerged from that period was, like his company, changed. From now on he would do a remarkable job of balancing ambition with caution. This capacity to change and to learn from his mistakes, much rarer than it seems it ought to be, was perhaps ultimately the most important quality he brought to Sierra. He reoriented his company to stop chasing fast bucks, focusing instead on a smaller number of quality titles for a modest number of proven platforms, and accumulated a stable of designers, programmers, and artists whom he treated with respect. They in turn did good, occasionally great work for him. Sierra Mark II, leaner, humbler, and wiser, was off and running.

(My huge thanks once again go to John Williams for contributing so many of his memories to this article. Hackers by Steven Levy was also invaluable, for what I believe will be the last time, as we’ve now moved beyond the period it covers. An article in the February 1985 Compute! breaks down the AGI system in unusual detail for a contemporary source. If you want to know more about AGI’s technical side, it has since been documented in exhaustive detail. If you’d just like to play King’s Quest, it’s available in a pack with King’s Quest II and III at Good Old Games.)

Footnotes

1 In the very early versions of King’s Quest, Sir Graham’s name was spelled as “Grahame.”