
Elite (or, The Universe on 32 K Per Day)

BBC Micro Elite

Sometimes great works go unappreciated during their time. Other times their time knows exactly what they’re on about. The latter was the good fortune of Elite, Ian Bell and David Braben’s epic game of space combat, trading, and exploration. Arriving at a confused and confusing time in the British games industry, Elite caused a rush of excitement the likes of which had never been seen before even in an industry that seemed to live and die on hype, becoming a bestseller several times over despite being initially released on a platform, the BBC Micro, that was not generally considered much of a gaming machine. Bell and Braben became recognizable stars, their names tripping off the tongues of a generation of British gamers the way that those of Lennon and McCartney had their parents’. It was about as close as the industry would ever get to Trip Hawkins’s dream of game designers as the rock stars of the 1980s. As for the game they created… well, that’s gone down into history as just possibly the most remembered and respected single computer game of the 1980s. But we’re beginning with the ending, which isn’t our usual way around here. Let’s go back to the beginning and see how it all began.

Bell and Braben first met one another during the autumn of 1982, when both arrived at Cambridge University as first-year undergraduates. Bell was to read math, Braben physics. More importantly, both were avid hackers. Bell brought a BBC Micro to university with him, Braben an example of that machine’s predecessor, the Atom, which he had expanded and soldered on and generally hacked at enough to make Dr. Frankenstein proud. Bell had real professional programming experience, at least of a sort: he’d gotten his version of Reversi published by a tiny company called Program Power, and would soon see an original action game, Freefall, published by Acornsoft, software arm of the company that made the computers on his and Braben’s desks. Braben had a passion for 3D graphics, and some code that could draw and rotate wireframe spaceships. The two bonded quickly.

Not that they became precisely bosom buddies. As their later story would demonstrate to anyone’s satisfaction, they were very different personalities. If I may strain an analogy just one more time, Bell was the John Lennon of the pair, pessimistic, introverted, and perhaps just a little bit tortured, while Braben was the Paul McCartney, an optimistic charmer with one eye on the market to go with one eye on his art. If not for their passion for Acorn computers, they would have likely had little to say to one another. Both, however, had programming talent to burn, along with a less obvious but at least as important instinct for visionary game design.

But then, in the era of Elite even more so than today, technological innovation and design innovation were often inextricably linked, with the latter most often flowing from the former. Thus the design that would become Elite really did stem directly from those 3D spaceships Braben had rotating on his Atom’s screen when he arrived at Cambridge. To understand what made those spaceships so different, and so fraught with potential, we should look to the state of game graphics in general circa 1982.

Defender (left) and Pac-Man (right)

Almost all action games of 1982 show their world from either directly overhead or sideways (like Defender in the picture to the left) or some odd hybrid of the two that doesn’t quite make sense in the real world (like Pac-Man in the picture to the right). They employ a third-person perspective; you see and control an onscreen avatar from a distance, rather than viewing the world through her eyes. She, her enemies, and perhaps some other elements like laser fire move over a relatively static background image. This approach makes life much easier for programmers in at least a couple of ways. Updating big chunks of screen is very expensive in terms of the computing power available to early PCs and stand-up arcade games. Therefore many of them implemented hardware sprites: little movable chunks of graphics that exist separately from the rest of the screen inside the computer’s memory, and which the video hardware overlays onto it, at no cost to the CPU, only on the physical monitor screen. A game like Defender or Pac-Man is an ideal fit for such technology; I trust it won’t be difficult to figure out which parts of the screens above are implemented as sprites and which as background graphics. (In the early days all of the work could even be left to sprites: a few early games, such as Boot Hill, consist of nothing but sprites, sometimes projected over a painted background image.)

There’s also another, more subtle advantage to the traditional arcade-game perspective. If you think about it for a moment, you’ll realize that the worlds shown on the screens above don’t correspond to any recognizable version of our reality, even if we postulate that it could contain invading aliens or munching heads being pursued through a maze of food pellets by ghosts. These worlds are strictly 2D; they lack any notion of depth. Pac-Man and his friends are living in a computerized version of Edwin Abbott’s Flatland; if we were to see this world through his perspective, it would be a very strange one indeed. Similarly, your spaceship in Defender can go up and down and left and right, but not in and out. This is very convenient for the programmer because the computer screen also happens to be flat, possessed of an X- and a Y-dimension but no Z-dimension. Thus the coordinates of any object in this flat world being simulated correspond nicely to its coordinates on the physical screen.

But what if you aren’t satisfied with a Flatland-esque world shown from a locked vertical or horizontal perspective? What if you want to immerse your player in your world good and proper, and to make it one that corresponds to our own of three dimensions while you’re at it? Well, now your job just got a whole lot more difficult. As it happened, however, that was exactly what Bell and Braben were soon trying to do. The crux of the problem (and of a huge body of 3D graphics theory, not to mention the specialized hardware that is probably part of the computer you’re using to read this, and for which, if you’re a hardcore gamer, you may have paid hundreds of dollars) is disarmingly simple: how to translate the X, Y, and Z of a world that lives inside the computer to the X and Y of the computer screen. The starting point must be the rules of visual perspective, well understood by artists since at least the Renaissance. But that well-trodden path opens into a thicket of complications when applied to the computer. Lacking as it does an artist’s intuitive understanding of the real world, a computer has to be laboriously instructed in how to avoid drawing objects that are behind other objects on top of them, how to figure out which surfaces of an object are visible and which are not, and so on. Just to make the challenges even greater, sprites aren’t of any real use for 3D graphics: the entire screen is necessarily changing all the time when moving a first-person perspective through a 3D world.
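
To make the problem concrete, here’s a minimal sketch of the general idea in Python. This is the textbook perspective divide, not Bell and Braben’s actual routine, which naturally had to be done in carefully scaled 8-bit integer arithmetic rather than floating point; the screen dimensions and scale factor here are arbitrary.

```python
def project(x, y, z, screen_w=320, screen_h=256, scale=160):
    """Map a point in viewer space (Z pointing into the screen) onto
    2D screen coordinates using a simple perspective divide."""
    if z <= 0:
        return None                        # behind the viewer: nothing to draw
    sx = screen_w / 2 + scale * x / z      # farther objects shrink toward the center
    sy = screen_h / 2 - scale * y / z      # screen Y grows downward
    return sx, sy
```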

Bell and Braben were hardly the first to enter into this territory. Indeed, the field of 3D graphics isn’t all that much older than the field of computer graphics itself. Academic researchers during the 1960s and especially the 1970s laid down much of the work that still grounds the field today. One minor contributor to this growing body of work was a graphics researcher and aviation enthusiast named Bruce Artwick, who finished a Master’s degree at the University of Illinois (home of PLATO) in 1976. For his thesis project, he combined his two interests. “A Versatile Computer-Generated Dynamic Flight Display” described a flight simulator featuring a first-person, out-the-cockpit view of a 3D world. In 1980, Artwick with his new company SubLogic brought to market the aptly titled Flight Simulator for the Apple II and TRS-80. Running in as little as 16 K of memory, it marked microcomputer gamers’ first encounter with the format that now dominates the industry: interactive, animated 3D graphics. The Flight Simulator line, whether sold under the imprint of SubLogic or Microsoft, went on to become a computing institution spanning some three decades.

SubLogic Flight Simulator on the Apple II (1980)

Groundbreaking as they were, however, the early versions of Flight Simulator were also, as their name would imply, much more simulator than game. They provided no story, no goals, no sense of progression — just an empty world to fly through. Yes, they did include a mode called “British Ace 3-D Aerial Battle,” which transformed your little Cessna into a World War I biplane and let you fly around trying to shoot other planes out of the sky, but, well, let’s just say that it was always clear when playing it that Artwick’s real priorities lay elsewhere. Mostly you were expected to make your own fun refining your piloting technique and, of course, marveling that this 3D world could exist at all on a 16 K 8-bit microcomputer.

Battlezone

A more traditionally gamelike application of 3D came to arcades that same year in the form of Atari’s Battlezone. In it you control a tank in battle against other tanks. You view the action from a first-person perspective, through a screen made to resemble the periscope of a real tank. Battlezone eventually made it to home computers and consoles as well, albeit not until 1983. While their awareness of Flight Simulator is questionable (it was an American product made for American platforms in a very bifurcated computing world), Bell and Braben were aware of and had played Battlezone in the arcades. It was the impetus for Braben’s rotating 3D spaceships and for the combat game Bell and Braben would soon be designing around them.

BBC Micro Elite

They were determined to bring 3D to a 2 MHz 8-bit computer with 32 K of memory, and to do it in the context of a real game with real things to do. At least they didn’t have to bemoan the uselessness of sprites to this new paradigm: having been created with educational and “practical” uses in mind rather than gaming, the BBC Micro didn’t have any anyway. Programming, like politics, being the art of the possible, they would have to make compromises if they were to have a prayer. Braben had already made the wise choice to set his 3D demo in space. Space is full of, well, space. It’s almost entirely empty, thus dramatically reducing the amount of stuff their game would have to draw. One other obvious decision was to perform only the first part of the full two-part rendering process, drawing in the outlines of objects in their 3D world but not going back and filling in their surfaces, an even more complicated and expensive process. (As the screens above illustrate, Artwick and Atari had already made the same compromise in their own initial implementations of 3D.)

Elite ship chart

The pair ruthlessly simplified Braben’s original spaceship models to have as few lines as possible, just enough to make of each a recognizable shape. This turned out to be wise for another reason: complex designs shown in wireframe tend to turn into a confusing mishmash of lines. To simplify rendering, all objects were also made convex, meaning that any given line will only pass in and out of the object once; as Braben himself put it in a talk at a recent Game Developers Conference, a block of cheddar cheese is convex but a block of Swiss is not. They also favored symmetrical ships — ships for which one side was a mirror image of the other, and thus could be rendered by simply negating calculations for the side that had just been drawn. Braben estimates that this step alone sped up rendering by some 40%, while actually representing no large aesthetic sacrifice at all; most vehicles in our world are also symmetrical.
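
For the technically curious, the saving from symmetry looks roughly like this in Python: when rotating a vertex and its mirror image, the arithmetic involving Y and Z can be computed once and shared, with only the X terms changing sign. Consider it an illustration of the principle rather than a transcription of their code.

```python
def rotate_mirrored_pair(vertex, R):
    """Rotate (x, y, z) and its mirror image (-x, y, z) by the 3x3 matrix R,
    sharing the y- and z-arithmetic between the two sides."""
    x, y, z = vertex
    shared = [R[r][1] * y + R[r][2] * z for r in range(3)]   # identical for both sides
    flipped = [R[r][0] * x for r in range(3)]                # merely changes sign for the mirror
    starboard = tuple(s + f for s, f in zip(shared, flipped))
    port = tuple(s - f for s, f in zip(shared, flipped))
    return starboard, port
```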

Another area of concern was your control of your own spaceship, the one through whose cockpit you would be viewing this 3D universe. A spaceship, like an airplane, can change its orientation in six ways, being able to yaw, pitch, or roll in either direction. Yet a joystick can be moved in only four cardinal directions — perfect for a 2D world but problematic for their 3D world. Bell and Braben soon realized, however, that being in space saved them. With no ground, and thus no real notion of up and down with which to contend, turns could be accomplished by simply rolling to the desired orientation and pitching up or down; no need for a yaw control at all. While they took full advantage of the good parts of being in space, they also wisely decided not to try to make the game a remotely realistic simulation of spaceflight. Like Star Wars, their game would be one of dogfights in space, with ships inexplicably subject to an aerodynamic drag that should have been left planetside. Anything else would just feel too disorienting, they judged. Most people would prefer to be Luke Skywalker rather than David Bowman anyway.
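
A quick way to convince yourself that roll plus pitch really does suffice: to point the nose at any target, roll until the target lies in the ship’s vertical plane, then pitch onto it. Here’s that geometry in Python, with axis and sign conventions of my own choosing; Elite itself applies small incremental rotations every frame rather than solving for angles like this.

```python
import math

def roll_and_pitch_to_face(dx, dy, dz):
    """Angles (in radians) that point the ship's nose (+Z forward) at the
    direction (dx, dy, dz), expressed in ship coordinates: +X right, +Y up."""
    roll = math.atan2(dx, dy)                    # roll until the target sits in the Y-Z plane
    pitch = math.atan2(math.hypot(dx, dy), dz)   # then pitch the nose up onto it
    return roll, pitch
```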

So, yes, this would be a game of space combat. That was always a given. But what should it be beyond that? How should that combat be structured, framed? With a workable 3D engine running at last after some months of concerted effort, it was time to ask these questions seriously. One alternative would be to make a traditional arcade-style game, complete with three lives, a score, and ever-escalating waves of enemy ships to gun down. To make, in other words, Battlezone with spaceships. Certainly what they already had was more than impressive enough to sell lots of copies.

Instead, Bell and Braben made their next visionary decision, to make their game something much more than just an arcade-style shooter. They would embed the shooting within a long-form experience that would give it a context, a purpose beyond high-score bragging rights. This was not, as effervescent popular histories of Elite’s birth have often implied, completely unprecedented. Long-form experiences were not hard to find in computer games years before Bell and Braben — in adventures, in CRPGs, in strategy and war games. It was, however, rather more unusual to see this approach combined with action elements. Taken on their own, the action elements of Bell and Braben’s game were groundbreaking enough to go down as an important moment in gaming history. By refusing to stop there, they would ensure that their game would break ground in multiple directions, and go down as not just important but one of the most important ever.

The inspiration came from tabletop RPGs, a pastime both Bell and Braben indulged in from time to time, although, perhaps tellingly, usually not together. They liked the way an RPG campaign could span many, many sessions, could turn into an ongoing long-form narrative. And they liked the process of building up a character from a low-level nothing to a veritable god over weeks, months, or years. Of course, your “character” in their game was really your spaceship. Fair enough; your goal would be to upgrade that with ever better weapons and defenses that not coincidentally bore a strong resemblance to those in Bell’s favorite RPG: Traveller, the first popular tabletop RPG to replace swords and sorcery with rockets and rayguns. From here the rest of the design seemed to unspool almost of its own accord.

BBC Micro Elite

They needed a mechanism for upgrading the ship, something more interesting than just adding the next piece to the ship automatically every time a certain score threshold was reached. The natural choice was money; every option would have a cost, letting players prioritize and truly make their spaceships their own.

Okay, but how to earn money? Drawing again from Traveller (a game whose imprint would be all over the finished Elite not just in mechanics but in its overall feel), you could be a trader plying the spaceways, buying low in one system in the hopes of selling high in another — a whole new strategic dimension.

But then how would that involve combat? Well, the ships attacking you could be pirates. This would also go a long way to explain why they were so chaotic and kind of random in their behavior, an inevitable result of limited memory and horsepower to devote to their artificial intelligence. Pirates, after all, were chaotic and kind of random by their very nature.

But actually landing on all those trading planets obviously wasn’t going to be workable; avoiding those complications was the reason for setting the game in space in the first place. No problem; you could just dock at space stations in orbit around them. Bell and Braben came up with a new challenge to make this more interesting: in a bit inspired by 2001: A Space Odyssey, you would have to carefully guide your spaceship into the rotating station’s docking bay at the end of every voyage. Of course, over time this could get tedious as well as frustrating (a botched approach generally means instant death). No problem; for a mere 1000 credits, you could buy a docking computer to do it for you. Other non-combat-oriented ship upgrades were also added to the catalog, like a fuel scoop to gather fuel by skimming the surface of a sun instead of buying it at a station.

If those spaceships attacking you really were pirates, thought Bell and Braben, the authorities would probably be quite pleased with you for shooting them down. Why not put bounties on them, so you could make your living as a bounty hunter if you got bored with trading? Now the possibilities really started rolling. If you could shoot pirates for money, you could also attack peaceful traders — become a pirate yourself, in other words, if you felt you could outduel the police Vipers that would attack you from time to time once your reputation became known. They came up with an alternative use for the fuel scoop: use it to scoop up the cargo of ships you’d destroyed to sell on the stations. The fuel scoop also became key to yet another way of making money: buy a special mining laser, break up asteroids with it, and scoop up the alloys they contained to sell stationside. If only they’d had more than 32 K of memory, they could have gone on like this forever.

But 32 K was all they had, and that was a constant challenge to their growing ambitions. For this grand game of trading to work, there had to be a big, varied galaxy to explore. There should be planets with a variety of economies and governments, from safe, established democracies for the conservative, peaceful trader to visit to anarchies home to hordes of pirates for the brave or foolhardy looking to make a big score. They came up with a scheme to let them pack all of the vital information about a star system with a single inhabited planet — its location, its economy, its type of government, its technology level, its population, its dominant species, its GDP, its size, even its name and a bit of flavor text — into just six bytes. Even so, a modest galaxy of 100 star systems would still require 600 bytes that they just couldn’t seem to find. Now came their most storied stroke of inspiration.

In 1202 an Italian mathematician named Fibonacci described a simple construct that became known as the Fibonacci sequence. In its classic form, you begin with two numbers, either 1 and 1 or 0 and 1. To get the third number in the sequence, you add the first two together. You then add the second and third number together to get the fourth. Etc., etc. A common and very useful variation is to drop all but the least significant digit of each number that is generated. It’s also common to begin the sequence not with 1 and 1 or 0 and 1 but some other, arbitrary pair. So, a sequence that begins with 2 and 7 would look like this:

2 7 9 6 5 1 6 7 3 0 3 3 6 9 5 4 9 3 …

The sequence appears random, but is actually entirely predictable for any given starting pair. This variation, however, is only a starting point. You can apply any rules you care to specify to a sequence of numbers with entirely predictable results, as long as you are consistent about it. Bell and Braben realized that they could seed their galaxy with any sequence of six bytes they wished to represent the starting system. Then they could manipulate those numbers in a predetermined way to generate the next; manipulate those to generate the next; etc. They decided that 256 systems was a good size for their galaxy. They needed just those initial six bytes to “store” all 256 planets. In addition to the memory savings, this method of generating their galaxy also saved Bell and Braben many hours spent designing it from scratch. Indeed, growing new galaxies from different starting seeds soon became a game of its own for them. They went through many iterations before finding the one that made it into the final game. Some they had to throw out right away for obvious reasons, such as the one with a system called “Arse” and the ones that had unreachable systems, lying beyond the player’s ship’s seven-light-year range of any other star. Others just didn’t feel right.
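
In Python the principle looks something like the sketch below. The mixing rule is a stand-in of my own invention, not the exact “twist” Bell and Braben used, but the property that matters is the same: the whole galaxy unfolds, identically every time, from a handful of bytes.

```python
def next_seed(seed):
    """Advance a seed of three 16-bit words (six bytes), Fibonacci-style:
    shift the words down and append their sum modulo 65536.
    An illustrative mixing rule, not Elite's exact one."""
    w0, w1, w2 = seed
    return (w1, w2, (w0 + w1 + w2) & 0xFFFF)

def generate_galaxy(seed=(0x1234, 0x5678, 0x9ABC), count=256):
    """Unfold a fixed number of star systems from one starting seed; in the
    real game every attribute of a system is read off its seed words.
    The default seed here is an arbitrary placeholder."""
    systems = []
    for _ in range(count):
        systems.append(seed)
        seed = next_seed(seed)
    return systems
```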

By late 1983, after almost a year of steady work, the basics of what would become Elite were all in place in their heads if not entirely in their code. They decided it was time to see if anyone would be interested in publishing it. Braben believed they should try to find the biggest publisher possible, one with the reach to properly support and promote this game like no other. He accordingly secured them an appointment at the London offices of Thorn EMI, the recently instantiated software division of one of the largest media conglomerates in the world. Very much a sign of this heady period in British computing, Thorn EMI had been founded in the expectation that computer games were destined to be the next big thing in media. Like their colleagues over in EMI’s music division looking for the next big hit single, they weren’t looking for deathless art or niche audiences; they were looking for big, mainstream hits. They had developed a checklist of sorts, a list of what they thought would appeal to the general public that wasn’t all that far removed from Trip Hawkins’s guidelines for American “consumer software.” Their games should be simple, intuitive, colorful, and not too demanding. Bell and Braben’s complicated game — while it was a technical wonder; anyone could see that — was none of these things. They said it was nothing for them, although Bell and Braben were welcome to come back any time to show a reworked — i.e., simplified — version. (In the end, Thorn EMI would find that technology wasn’t ready for casual consumer software, and wouldn’t be for years. The hardcore was all they had to sell to. Unwilling or unable to adapt to this reality as Hawkins’s Electronic Arts eventually did, they faded away quietly without ever managing to find the breakout mainstream hit they sought.)

Bell suggested they try Acornsoft, who had already published his game Freefall. In many ways Acornsoft should have been the logical choice from the start. Bell already had connections there, they knew the BBC Micro better than anyone, and they were located right there in Cambridge practically next door to the university proper, an institution with which they had deep and abiding links. (Regular readers will remember that it was Acornsoft and Cambridge oceanography professor Peter Killworth who provided a commercial outlet for the adventure games created on Cambridge’s Phoenix mainframe.) Yet Braben was reluctant. Always the more commercially minded of the pair, he knew that Acornsoft was hardly at the forefront of the British games industry. Their modest lineup of adventure games, educational software, and utilities had some very worthy members, yet the operation as a whole, like most software adjuncts to hardware companies, always felt like a bit of an afterthought. With their limited advertising and doughtily minimalist packaging, no Acornsoft title had ever sold more than a few tens of thousands of copies, and most never cracked 5000 — a far cry from the numbers Braben fondly imagined for their game. Acornsoft’s association with Acorn also concerned him in that it would necessarily limit the game to only Acorn computers. He and Bell weren’t hugely fond of the Commodore 64 or especially the Sinclair Spectrum, but he knew that their game would have to be ported to those more prominent gaming platforms at some point if it was to realize its commercial potential. In short, Acornsoft was… provincial.

Still, he agreed to accompany Bell to Acornsoft’s offices. It was, to say the least, a place very different from Thorn EMI’s posh digs in central London. From Francis Spufford’s Backroom Boys:

[Acornsoft] operated from one room of a warren of offices above the marketplace. You got there by sidling around the dustbins next to the Eastern Electricity showroom. Past the window display of cookers and fridge-freezers, up a steep little staircase, and into a cramped maze that would remind one employee, looking back, of a level from Doom. “Very back bedroom,” remembered David Braben, approvingly. In Acornsoft’s office they found a rat’s nest of desks and cables, and four people not much older than themselves.

Two of those four people, managing director David Johnson-Davies and chief editor Chris Jordan, would become the unsung heroes of Elite. Both got the game immediately, grasping not just its technical wizardry but also Bell and Braben’s larger vision for the whole experience. They both realized that this thing had the potential to be huge, bigger by an order of magnitude than anything Acornsoft had done before. Of course, it also represented a risk. Bell and Braben looked and acted like the couple of headstrong kids they still were. What if they flaked out? Nor was Acornsoft accustomed to issuing contracts and advances on unfinished software. Acornsoft had been conceived as an outlet for moonlighters and hobbyists, who sold them their homegrown software only once it was finished. Their normal policy was to not even look at programs that weren’t done; Bell and Braben were there at all only as a favor to Bell, a fellow with whom Acornsoft had a history and whom they liked personally. Still, Acorn as a whole was doing well; there was enough money to try something new, and this was too big a chance to pass up. They offered Bell and Braben a contract and an advance.

Now Braben made a move that would be as critical to Elite’s success as anything in the game itself. Still concerned about Acornsoft’s provinciality, he negotiated a slightly lower royalty rate in exchange for him and Bell retaining all rights to their game beyond its implementation on Acorn computers. Not quite sure what he was on about, Johnson-Davies agreed. With his share of the advance, Braben bought his own BBC Micro, retiring his hacked and abused old Atom at last.

As Bell and Braben worked to finish their game, Acornsoft provided essential playtesting while Johnson-Davies and Jordan served as an invaluable source of guidance and a certain adult wisdom. Sometimes the latter was needed to keep their ambitions in check, as when Bell and Braben burst into the Acornsoft office one day having had an epiphany. They had realized that, if all they needed to grow a galaxy was a starting seed of six numbers, they could have an infinite number of them — well, okay, about 282 trillion of them — in the game. They could let the player buy a “galactic hyperdrive” to jump between them, whereupon they would just generate a new random seed and let it rip. Jordan now showed a sharp design instinct of his own in walking them back a bit. Having more galaxies sounds like a great idea, he said, but having so many will actually spoil the illusion of a real persistent universe you’ve worked so hard to create. It will all just start to feel like what it really is: random. Nor will many of these new galaxies be pleasing places to explore, since you won’t be able to look at them and reject the ones with unreachable systems and the like. Bell and Braben agreed to settle for just eight galaxies, with a total of 2048 star systems to visit. That should be more than enough for anyone. Perhaps too many for Bell and Braben and Acornsoft’s testers: a planet Arse sneaked into one of these later galaxies and made it into the released version of the game.

Even as they gently squashed some of Bell and Braben’s more outlandish ideas, Johnson-Davies and Jordan still felt like something was missing. For all its technical and formal innovations, for all its scope of possibility, the game lacked any sort of real goal. Now, to some extent that was just the nature of the beast Bell and Braben had created. They would have dearly loved to have a real story to give context, had even planned on it at some stage (Braben says that “trading was originally going to be a very minor aspect”), but they now had to accept the fact that they weren’t going to be able to wedge some elaborate plot along with everything else into 32 K. Still, suggested Johnson-Davies and Jordan, maybe they could add something simple, something to mark progress and give bragging rights. Thus was born the system of ranks, based on the number of kills you’ve achieved. You start Harmless. After notching eight kills you become Mostly Harmless (a nod to The Hitchhiker’s Guide to the Galaxy). Each rank thereafter is exponentially more difficult to achieve, until, after some 6400 kills, you become Elite. There was the goal, one that should keep players playing a good long time.

It was also in a backhanded sort of way a political statement. Cambridge University was awash with indignation over the policies of Margaret Thatcher; a major coal miners’ strike, which would become the battlefield for Thatcher’s final vanquishing of organized labor, had the university’s liberal-arts wings all in a tumult from March of 1984. Bell and Braben bucked the university conventional wisdom to side with Thatcher. The player’s goal of becoming Elite was meant as a subtle nod toward the libertarian ideal of the self-made man, and a little poke in the eye of their leftist acquaintances. It also emphasized their view of their game as fundamentally about space combat, not trading. It gave players a compelling motivation to engage with what Bell and Braben still regarded as the most compelling part of the experience. You can make a lot of money as a peaceful, law-abiding trader who prudently runs from pirates when they show up, but you’ll never make Elite that way.

In finding an overarching goal they also found the title they’d been searching for for some time. They first planned to call the game The Elite, a name to celebrate the group that much of Cambridge was railing against. But the filenames used for the game just said “Elite.” In time, they dropped the article from the official title as well. Elite it became — shorter, punchier, and with fewer political ramifications for Acornsoft to deal with.

Similarly subtle swipes at Cambridge’s liberal-arts students, whom in the long tradition of hard-science students Bell and Braben regarded as mushy-minded prima donnas, made it into the text tables that Bell developed to describe the planets in the game. After the Fibonacci sequence had done its work, some were populated by “edible poets”; others by “carnivorous arts graduates.” Ah, youth.

Bell and Braben had disk drives on their BBC Micros. After compressing their code as much as they possibly could, they finally began to make use of the drives’ capabilities within the game. They split the game into two parts: the trading program, loaded in when you docked at a station, and the program handling travel and combat, loaded as soon as you left one. This concerned Acornsoft greatly because most BBC Micro owners still had only cassette drives, which didn’t allow such loading on the fly. What good was the game of the decade if most people couldn’t play it? So they convinced the two to fork the game three ways. One version, the definitive one with all the goodies, would indeed require a BBC Micro with a disk drive. Another, for a tape-equipped BBC Micro, would be similar but would offer a smaller variety of ships to encounter along with simplified trading and a bit less detail to planets you visited and to the overall experience. Finally, Acorn convinced them to create a third version, stripped down even more, for the BBC Micro’s little brother, the Acorn Electron, which Acorn had introduced the previous year as an attempt to compete with the cheap Sinclair Spectrum.

Bell and Braben were naturally most excited about the disk-based version, particularly when they realized they had enough space still to add a little something extra. They made a couple of hand-crafted “missions” that pop up when you’ve been playing for a while: one to hunt down and destroy a stolen prototype of a new warship, another to courier some secret documents from one end of the galaxy to the other. These gave at least a taste of the more prominent story elements they wished they had space for.

Elite's packaging

While Bell and Braben finished up the coding, Johnson-Davies and Jordan worked to give the game the packaging and the launch it deserved. Acornsoft figured they needed to do all they could to justify the price they’d chosen to charge for the thing: from £12.95 to £17.65 depending on version, well over twice the typical going rate for a hot new game. They prepared a box of goodies the likes of which had never been seen before, not just from bland little Acornsoft but from anyone in the British games industry. Only some of the more lavish American packages, like those for the Ultimas and various Infocom games, could even begin to compare, and even by their standards Elite was grandiose. To a 63-page instruction manual Johnson-Davies and Jordan added The Dark Wheel, a separate scene-setting novella they commissioned from Robert Holdstock, an author just about to come into his own with the publication of his novel Mythago Wood. And they still weren’t done. They also added a ship-identification poster, a quick-reference guide, a keyboard overlay, some stickers, and a postcard to send to Acornsoft when you achieved the rank of Competent, entitling you to a certificate of achievement (an onscreen code revealed at that point would serve as proof). When the packaging began to come off the line they realized they had miscalculated slightly: the box, although far bigger than Acornsoft’s wont, still wasn’t quite big enough. It had a noticeable bulge, as if it threatened to burst right out of its shrink wrap. This actually turned out to be a great way of advertising all the goodies inside, even if the boxes were just about impossible to repack and close again once opened.

Acornsoft stepped in and froze further development during the summer of 1984. The packaging was just about ready, and work on the game, while it would never be truly finished in the eyes of Bell and Braben, struck Acornsoft as about to reach a point of diminishing returns. And everyone was a little bit paranoid that something similar to Elite, even if it was nowhere near as good, might come out and steal their thunder. Bell and Braben grudgingly agreed that the time for release had come. But then, just as Acornsoft was about to send the master disk for duplication, Braben called Chris Jordan in a frenzy. They’d solved a niggling problem that had been bothering everyone for months, that of a “radar” scope to show where enemy ships are in relation to your own. The problem was, again, that of trying to map three dimensions onto two. Bell and Braben had done the best they could by providing two complementary scanners that had to be read in conjunction to get the full picture, but it always felt, in contrast to just about everything else about the game, kind of clunky and less than ideal. Now they had come up with a way to pack everything onto a single screen. It was beautiful. Showing a commitment few publishers then or now could match, Acornsoft agreed to take the new version of the game, which brought with it the painful task of having the manual edited and re-typeset to describe the new radar scope. Now, two years after Braben had first started playing with 3D spaceship models, they were done.

Buzz about Acornsoft’s secret “Project Bell” had been high for months. Continuing to show a promotional instinct that no one had known they had in them, Acornsoft rented for launch day Thorpe Park, a small amusement park (nowadays a much bigger one) near London. In a darkened room, with suitably portentous music playing, the world got its first glimpse of Elite — and of its two creators, who for the next few years would be the face of the young British games industry. In their picture from the launch party they look much as the British public would come to know them: Braben in the foreground, glib and personable; Bell a bit more uncertain and stereotypically nerdy and, much to Acornsoft’s occasional chagrin, more liable to go off-script.

David Braben and Ian Bell

Elite itself, needless to say, became a hit. Acorn and Acornsoft were making a big play for the home-computer market that Christmas, trying to challenge Sinclair and Commodore on their own turf, and Elite became a big part of that push. Advertising was shockingly frequent and grandiose for anyone who remembered the Acornsoft of old. The £50,000 campaign even included some television spots. Actual sales figures for the Acornsoft Elite are hard to come by, but seem to have reached the vicinity of 100,000 units, a huge number for an absurdly expensive game on platforms not particularly popular with gamers. And most of those customers seemed to play Elite with an enthusiasm bordering on the obsessive. The first person known to become Elite was one Hal Bertram, on November 3, 1984, about five weeks after the game’s release. By the end of the year he had many companions in glory, while Acornsoft was positively flooded with postcards sent in by those attaining at least Competent status; they could barely make the badges they sent back to these folks fast enough. Undeterred, they sponsored a series of monthly contests culminating in a grand showdown at the Acorn Users Show.

Still, it was clear to Braben that the really big numbers would come only when Elite came to the Speccy and the Commodore 64. The game was the talk of the industry, with owners of those more popular platforms, who had not even been aware of Acornsoft’s existence a few months before, clamoring to play it after it — along with its creators — began appearing in places like Channel 4 News.

And now we see the significance of Braben’s determination to retain the rights to their game. He heard through the grapevine about a former literary agent named Jacquie Lyons, who had recently become the first agent representing game developers in Britain. Lyons:

A friend rang up and told me about Ian Bell and David Braben. Elite had just happened and Ian and David had retained all rights other than for the BBC, which was extremely bright of them. They wanted me to represent the rest of those rights.

With virtually every publisher in Britain dying for the rights to Elite, Lyons decided that there was one foolproof way to find out who really wanted them, and to make sure her new clients got served as well as possible in the process — i.e., paid as well as possible. At the beginning of December she held an auction, which, in her own words, “caused a lot of trouble in the industry — I was told this was an appalling way to go about it.” Lyons responded that such an approach was common in the publishing world from which she hailed. And what better way to ensure that your publisher would put everything they had into a game than to make them pay as dearly as possible for it? The deep pockets of British Telecom won the day amidst a flurry of media interest. Having just entered the software market with a new imprint called Firebird and eager to make a big splash with the highest-profile game in the industry, BT paid an undisclosed but “substantial” sum — Bell and Braben each got six figures up-front — for publishing rights to Elite on all platforms other than the Acorn machines. Suddenly Bell and Braben, who had yet to receive their first royalty checks from Acornsoft, were very wealthy young men.

For their part, Acornsoft allowed Bell and Braben to move on without fighting at all to retain Elite as a desperately needed platform exclusive. Indeed, they handled Bell and Braben’s departure with almost incomprehensibly good grace, even working out agreements to allow Firebird to reuse most of the wonderful supplemental materials they had stuffed into that bursting box. Perhaps they just had bigger fish to fry. Elite, you see, was the sole bright spot in a disastrous Christmas for Acorn as a whole, one rife with miscalculations which effectively wrecked the company. A desperate Acorn was purchased by the Italian firm Olivetti in 1985, and became thereafter a very different sort of place. The Acornsoft label was retired barely six months after its brief moment in the sun, with Johnson-Davies and Jordan and all of their colleagues going on to other things.

But the game they had introduced to the world was just getting started. Bell and Braben themselves ported it to the Commodore 64. That version is not quite as fast and smooth as the BBC original — the 64’s 6502 is clocked at just 1 MHz instead of the BBC’s 2 MHz — but it takes advantage of the 64’s better graphics and its positively cavernous 64 K of memory to add, in compensation, more color and a welcome touch of whimsy to undercut its otherwise uncompromisingly dog-eat-dog world. There’s a third special mission, this one a bit of silliness drawn from the beloved Star Trek episode “The Trouble with Tribbles.” When the tribble — excuse me, “trumble” — population aboard your ship has mushroomed to the point that the little buggers start crawling around the screen in front of you, it’s laugh-out-loud funny, even if it is just about impossible to figure out how to get rid of them absent spoilers. But best of all is the new music which plays during the automated docking sequence: Johann Strauss’s “The Blue Danube,” a tribute to everyone’s favorite part of 2001: A Space Odyssey. It comes as a complete surprise (if you haven’t read an article like this, that is…) when you first flip the switch to try out your hard-won docking computer and are greeted with this unexpected note of easy beauty. Soon your travels assume an addictive rhythm: the calculus of buying and selling, followed by the tension and occasional excitement of the voyage itself, followed by the grace notes of “The Blue Danube,” when you know you’ve survived another voyage and can sit back and enjoy a few minutes of peace before starting the process over again. Life in a microcosm?

The Commodore 64 Elite established a tradition of each port being largely hand-coded all over again; this gives each its own feel. Scottish developers Torus took on the challenging task of converting Elite to the Spectrum, which is built around a Z80 rather than the 6502 microprocessor at the heart of the BBC Micro and Commodore 64. Speccy Elite arrived several months after the Commodore 64 version and about a year after the original, touching off another huge wave of sales. Amidst the usual slate of added and lost features, it added yet more special missions, for a total of five. Missions became the most obvious way for the many individual developers who worked on Elite over the years to put their own creative stamp on the game, a trend actively encouraged by Bell and Braben; “just have your own fun” with the missions was always their response to requested advice. About the same time as the Spectrum Elite arrived in Britain, Firebird brought the Commodore 64 Elite to the United States, where it — stop me if you’ve heard this before — became a huge hit, one of relatively few games of the 1980s to make a major impact in both the European and North American markets. It served to establish Firebird as an important publisher in the U.S., the first such to be based in Britain and one which would give many other British games deserved exposure in that bigger market.

The ball was now well and truly rolling. For almost a decade the existing versions just kept on selling and the ports just kept on coming: to big players of the era like the IBM PC, the Apple II, the Atari ST, the Commodore Amiga, and the Amstrad CPC as well as occasional also-rans like the Tatung Einstein. Even the Nintendo Entertainment System got a surprisingly faithful and enjoyable version in 1991. In the end Elite made it to 17 separate platforms. Hard sales numbers for these many versions in many markets are, once again, beyond my abilities to dig up with any rigor. Ian Bell has guessed in one place that it sold about 600,000 copies. However, David Braben claims that Elite surpassed 1 million copies worldwide, a number which strikes me as perfectly reasonable. Elite was almost certainly the most commercially successful born-on-a-PC game of the 1980s, and if Braben’s figures are true very likely the first to pass that magic number of 1 million copies sold at some point during the very early 1990s.

Bell and Braben’s mainstream fame proved to be almost as enduring — in September of 1991 The One magazine could still write about the latter as “the most famous developer in Britain” — but their partnership less so. The two tried for some time to make Elite II for the BBC Micro and the Commodore 64, but never got close to completing it for reasons which vary with the teller. In Bell’s version, the game was just too ambitious for the hardware; in Braben’s, Bell was more interested in enjoying his new wealth and practicing his new hobby of martial arts than buckling down to work. Braben alone finally made and released Frontier: Elite II, a hugely polarizing sequel, in 1993. The erstwhile partners then spent the rest of the decade in ugly squabbles and petty lawsuits. To the best of my knowledge, the two still refuse to speak to one another. While both agreed to give talks upon the game’s 25th anniversary at the GameCity Festival in Nottingham in 2009, they agreed to do so only if they didn’t have to share a stage together. Like most people who have studied their history, I have my opinions about who is the more difficult partner and who is more at fault. In truth, though, neither one comes out looking very good.

Bell retired quietly to the country many years ago to tinker with mathematics, martial arts, and mysticism. He hasn’t released a game since the original Elite. Braben, in contrast, has built himself a prominent career as a designer and executive in the modern games industry. If he’s no longer quite the most famous developer in Britain, he’s certainly not all that far out of the running. He recently Kickstarted a new iteration of the Elite concept called Elite: Dangerous to the tune of more than £1.5 million, proof of the game’s enduring appeal even in the contemporary gaming zeitgeist, as well as of the cachet Braben’s name still carries.

And what is the source of that appeal? As with any great game for which it all just seemed to come together somehow, that can be a difficult question to fully answer. I could talk about how it was one of the first games to show the immersive potential of even the most primitive of 3D graphics, prefiguring the direction the entire industry would go a decade later. I could talk about how it was one of the first to graft a larger context to its core action-based gameplay, giving players a reason to care beyond wanting to run up a high score. I could talk about how perfectly realized its universe is, how it absolutely nails atmosphere; its cold beauty is just that, beautiful. Those minimalist wireframe spaceships are key here. Later iterations for more advanced platforms, which fill in the spaceships with color, never felt quite like Elite to me. But then I suspect that for most folks the definitive version of Elite is the one they played first…

Maybe the most impressive thing that Elite evokes is a sense of possibility. You really do feel when you start playing, even today, even when you’ve read articles like this one and know most of its tricks, that you can go anywhere (as, given time and patience, you can), and that anything might happen there (okay, not so much). Yes, over time, especially over these jaded times, that sense fades, this Fibonacci universe starts to lose some of its verisimilitude, and it all starts to feel kind of samey. I must confess that when I played again recently for this article that point came for me long before I got anywhere close to becoming Elite. I think for the game to last longer for me I’d need some more of those story elements Bell and Braben originally hoped to include. But just the fact that that feeling is there, even for a little while, is amazing, the sort of amazing that makes Elite one of the most important computer games ever released. In addition to being a great play in its own right, it represents a fundamental building block of the virtual worlds of today and those still to come.

(In addition to being such a huge hit and such a seminal game historically, Elite comes equipped with a very compelling origin story. Together these factors have caused it to be written and talked about to a degree to which almost no other game of its era compares. Thus my challenge with this article was not so much finding information as sorting through it all and trying to decide which of various versions of events were most likely to be correct.

The lengthiest and most detailed print chronicle of all is that in the book Backroom Boys by Francis Spufford. More cursory histories have been published by Edge Online and IGN. Vintage sources used for this article include: Your Computer of December 1984; The One of January 1991 and September 1991; Micro Adventurer of January 1985; Home Computing Weekly of December 11, 1984; Personal Computing Weekly of August 23, 1984. David Braben’s talk at the 2011 Game Developers Conference was a goldmine, while Ian Bell’s home page has a lot of information in its archives. Other useful fan pages included FrontierAstro and The Acorn Elite Pages. And when you get bored with serious research, check out the Elite episode of Brits Who Made the Modern World, which in its first ten seconds credits the game with starting the British games industry and goes on to indulge in several other howlers before it’s a minute old. It makes a great example of the hilariously hyperbolic press coverage that always clings to Elite.

Finally, rather than provide a playable version of Elite here I’ll just point you once again to Ian Bell’s pages, where you’ll find versions for many, many platforms.)

 

How Things Work: Commodore 64 and Summer Games Edition

I’m always trying to convey a sense of the audacity and creativity of hackers of the early PC era, who made so much out of so little. I include amongst this group both the hardware hackers who created the machines themselves and the software hackers who took them to places even their creators never imagined. In that spirit, I thought today we’d look at how the Commodore 64’s hardware team managed to make it do some of what it could given the technical constraints under which they labored, and how the software team who created Summer Games at Epyx found ways to make it do even more than its designers had fully considered. So, much of this article is for the gearheads among you, or at least those of you who’d like to understand a bit more of what the gearheads are on about. If you’re a less technical sort, perhaps you’ll be consoled by learning about some of the softer factors that went into the Summer Games design as well. And if that’s not interesting, hey, you can still watch my wife and me (mostly me) fail horribly at various Summer Games events via the movie clips.

This is, by the way, my first attempt to make use of WordPress 3.6’s integrated video capabilities. You’ll need an up-to-date browser with good HTML 5 support to see the clips. Hopefully my site won’t choke on the bandwidth demands. We’ll see how we go.

While you’re waiting (hopefully not too long) for the videos to load, let’s consider the basic visual capabilities of the Commodore 64: a palette of 16 colors at a resolution of 320 X 200. Those capabilities are, to say the least, modest by modern standards, but they actually present a huge problem when paired with another key specification: the 64 has just 64 K of RAM. This is all there is to work with; there is no separate bank of video memory, as on a modern computer. Everything — programs, data, the contents of the screen, and miscellaneous other things like buffers for the disk drive — must draw from this pool.

Now, a modern programmer wishing to represent a 320 X 200 screen with 16 colors in memory would probably just store it as a series of pixels, with one byte devoted to each pixel, storing a value between 0 and 15 to represent that pixel’s color. This approach, known as bitmap graphics, is straightforward and eminently flexible, but there’s a problem. Consider: a 320 X 200 screen has exactly 64,000 pixels. In other words, by devoting one byte to each pixel we’ve all but filled our entire 64 K of memory with a single screen.

Let’s economize, then. Even a modern programmer, if she’s a more efficient sort, might note that we only actually need four bits to store a number between 0 and 15, and could therefore, at the cost of a somewhat more confusing layout, pack two pixels into every byte. That reduces consumption to a little under 32 K — better, but it’s still untenable to devote half of our precious memory to the screen.
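
For the curious, here’s what that packing looks like spelled out in Python rather than 6502 assembly. The arithmetic is the point of the exercise; the Commodore 64 never actually offered a full-resolution 16-color bitmap like this.

```python
WIDTH, HEIGHT = 320, 200
framebuffer = bytearray(WIDTH * HEIGHT // 2)    # 32,000 bytes: two pixels per byte

def set_pixel(x, y, color):
    """Store a 16-color pixel in four bits: even pixels in the high
    nybble of a byte, odd pixels in the low nybble."""
    i = (y * WIDTH + x) // 2
    if x & 1:
        framebuffer[i] = (framebuffer[i] & 0xF0) | (color & 0x0F)
    else:
        framebuffer[i] = (framebuffer[i] & 0x0F) | ((color & 0x0F) << 4)
```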

It’s because bitmap graphics are so demanding that only high-end machines like the Apple Lisa and Macintosh used them by default at the time of Summer Games’s release. And, notably, even those 68000-powered machines only displayed black and white, which reduced the requirement from four bits per pixel to one — a simple on-off, black-or-white toggle. Let’s consider the alternative that the 64’s designers, as well as those of many other machines, employed in various ways: character graphics.

Commodore 64 startup screen

In its default mode, the 64 subdivides its screen into a grid of character cells, each 8 X 8 pixels. Thus there are 40 of them across and 25 down, corresponding to the machine’s standard text display. Elsewhere in memory are a set of up to 256 tiles that can be copied into these cells. A default set, containing the glyph for each letter, number, and mark of punctuation in addition to symbols and simple line-drawing figures, lives in ROM. The programmer can, however, swap this set out for her own set of tiles. This system is conceptually the same as the tile-graphics system which Richard Garriott used in the Ultima games, but these tiles are smaller (only the size of a single character) and monochrome, just a set of bits in which 1 represents a pixel in the foreground color, 0 a pixel in the background color. The latter color is set globally, for the whole screen. The former is specified individually for each cell, via a table stored elsewhere in memory.

So, let’s look at what all this means in terms of memory. Each cell on the screen consumes one byte, representing the number (0 to 255) of the tile that is placed there. There are 1000 character cells on a 40 X 25 display, so that’s about 1 K consumed. We need 8 bytes to store each tile as an 8 X 8 grid of on-off pixels. If we use all 256, that’s 2 K. Finally, the color table with the foreground color for each cell fills another 1 K. We’ve just reduced 32 K to 4 K, or just 2 K if we use the default set of character glyphs in ROM. Not bad. Of course, we’ve also introduced a lot of limitations. We now have to build our display, jigsaw-puzzle style, from our collection of tiles. And each cell can only use two of our total of 16 colors, one of which can be unique to that cell but the other of which must be the same for the entire screen. For someone wishing to make a colorful game, this last restriction in particular may just be too much to accept.
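
Here’s the back-of-the-envelope arithmetic behind those figures, as a quick Python aside:

```python
screen_matrix = 40 * 25     # one byte per cell: which of up to 256 tiles to show
tile_set      = 256 * 8     # each custom tile is 8 bytes, an 8 x 8 grid of bits
color_table   = 40 * 25     # one foreground color per cell

total = screen_matrix + tile_set + color_table
print(screen_matrix, tile_set, color_table, total)   # 1000 + 2048 + 1000 = 4048 bytes, about 4 K
```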

Enter multicolor character mode. Here, we tell the 64 that we want each tile to be not monochrome but drawn in four colors. Rather than using one bit per pixel within the tile, we now use two, which allows us to represent any number from 0 to 3. One of these colors is still set individually for each cell; the other three are set globally, for the screen as a whole. And there’s another, bigger catch: because we still only devote eight bytes to each tile, we must correspondingly reduce its resolution, and that of the screen as a whole. Each tile is now 4 X 8 (horizontally elongated) pixels, the screen as a whole 160 X 200. Even so, this is easily the most widely used mode in Commodore 64 games. It’s also the mode that Scott Nelson (little brother of Starpath co-founder Craig Nelson) chose for Summer Games’s flag selection screen.

Summer Games country selection screen

But… wait, you might be saying. Surely the colorful screen shown above doesn’t always use the same three of the four colors within each tile. In fact, it doesn’t, and this introduces us to one of the keys to getting the most out of the Commodore 64: raster interrupts.

The picture on a cathode-ray-tube television or monitor is generated by an electron gun which moves across and down behind the screen, firing charged electrons at phosphors that coat the back of the screen glass. This causes them to briefly glow — so briefly, in fact, that the gun must paint the screen 60 times per second for televisions using the North American NTSC standard, or 50 times for the European PAL standard, in order to display a stable image without flicker. After painting each line of the screen from left to right, the gun must move back to the left to paint the next. This split second’s delay can be exploited by the Commodore 64 programmer. She can ask the machine to generate what’s known as a raster interrupt when the gun finishes painting a given line. She then has a few microseconds to make changes to the display configuration before the gun starts painting the next line. She can, for example, change one or more of the three supposedly fixed colors, as Scott Nelson does to generate the screen shown above.
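Conceptually, such a raster-interrupt routine amounts to a little table of “at this scanline, poke this value into that register” entries that the handler walks through once per screen paint. Here is a toy model in Python rather than 6502 code; the scanline numbers and colors are invented for illustration.

```python
# A toy model of raster-interrupt color splits: as the beam reaches each
# listed scanline, the handler changes a "global" color register, so
# different horizontal bands of the screen end up with different colors.
# All values below are invented for illustration.
splits = [(0, "blue"), (50, "red"), (100, "yellow"), (150, "green")]

current, band_start = None, 0
for line in range(200):                      # 200 visible scanlines
    for split_line, color in splits:
        if line == split_line:               # the "interrupt" fires here
            if current is not None:
                print(f"lines {band_start:3d}-{line - 1:3d}: {current}")
            current, band_start = color, line
print(f"lines {band_start:3d}-199: {current}")
```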

But let’s say we don’t want to deal with trying to create a picture using tiles. The Commodore 64 actually does also offer a bitmap mode of sorts, albeit one with restrictions of its own that allow it to reduce the memory footprint from an untenable 32 K to a more reasonable if still painful 9 K. Here an 8 K chunk of memory is allocated to the bitmap, with each bit representing the status (on or off) of a single pixel. The foreground color represented by an “on” pixel is once again determined by a 1 K color table, with the colors still organized into 8 X 8 pixel blocks. This leads to the most obvious oddity of the 64’s bitmap mode: the bitmap does not run all the way across the screen and then down, but rather down through each 8 X 8 cell (the same cells to which the foreground colors are assigned) before moving across to the next.
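That layout makes finding the byte for a given pixel slightly odd. Here is a sketch of the arithmetic, with offsets measured from the start of the bitmap; in a real program the bitmap would of course sit at some actual address in memory.

```python
# Locate pixel (x, y) in the 64's standard bitmap mode. The 8 K bitmap is
# laid out cell by cell: each character row is 40 cells of 8 bytes, and
# the 8 bytes of a cell run top to bottom within it.
def hires_pixel_location(x, y):
    assert 0 <= x < 320 and 0 <= y < 200
    byte_offset = (y // 8) * 320 + (x // 8) * 8 + (y % 8)
    bit = 7 - (x % 8)                      # leftmost pixel lives in the highest bit
    return byte_offset, bit

print(hires_pixel_location(0, 0))          # (0, 7): top-left corner
print(hires_pixel_location(319, 199))      # (7999, 0): bottom-right corner
```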

Bitmap mode on the Commodore 64

For those willing to trade resolution for colors, there is also a multicolor bitmap mode, which, like the multicolor character mode, treats each two bits as representing a single pixel of one of four possible colors. Horizontal resolution is accordingly reduced to 160 pixels. This mode is, however, more flexible than multicolor character mode in its choice of colors. Another 1 K area of memory is allocated to a pair of colors for each cell, with each pair packed into a single byte. Thus we can freely choose three of the four colors found within each cell without resorting to raster interrupts or other tricks. Total memory devoted to the display in multicolor bitmap mode amounts to 10 K.
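The bookkeeping for multicolor bitmap mode, summed up as a sketch:

```python
# Memory budget for multicolor bitmap mode as described above: the 8 K
# bitmap block plus two 1 K per-cell color tables.
bitmap_block = 8 * 1024     # 160 x 200 pixels at 2 bits each fits in the 8 K block
screen_pairs = 1024         # one byte per 8 x 8 cell: two colors packed as nybbles
color_ram    = 1024         # a third per-cell color, from the usual color table

total = bitmap_block + screen_pairs + color_ram
print(total // 1024, "K")   # 10 K
```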

That may not sound like much at first glance, but for a programmer trying to shoehorn a complex game into 64 K it’s quite a sacrifice indeed. For this reason, and because its other restrictions could make it almost as challenging to work with as character mode, bitmap mode is not used as often as you might expect in Commodore 64 games. Summer Games is, however, a partial exception, employing bitmap mode in quite a number of places. For instance, Stephen Landrum’s opening-ceremonies sequence uses a multicolor bitmap. This sequence also demonstrates another critical part of the 64’s display hardware: sprites.

Doing animation by changing the contents of screen memory is very taxing on a little 8-bit CPU like the 64’s 6502, not to mention tricky to time so that changes are not made in the middle of screen paints, which would result in ugly jerking and tearing effects. Sprites come to the rescue. Indeed, their presence or absence is a good indication of whether a given machine from this era is pretty good at playing graphically intense games (the 64, the Atari 8-bit lines) or not (the Apple II, the IBM PC). A sprite is a relatively small graphical element which is overlaid onto the physical screen, but independent of the bitmap or tile map stored in memory. It can be moved about quickly at minimal cost, just by changing a couple of registers. The display circuitry does the rest.

The 64 offers eight sprites to the programmer, each exactly 24 pixels wide by 21 tall. The image for each is stored in memory as the usual grid of on/off bits, for the modest total of 64 bytes used per sprite. An on bit represents the sprite’s color, of which each has exactly one; an off bit represents transparency, so that whatever is on the screen behind shows through. This means that the 24 X 21 pixel size is not so arbitrary as it may first appear; a smaller sprite can be displayed simply by turning off the unneeded pixels.
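The arithmetic behind that 64-byte figure, as a sketch: a sprite image actually needs 63 bytes of data, and the video chip addresses sprite images in 64-byte blocks, so each sprite “pointer” is simply a block number.

```python
# Sprite bookkeeping on the 64: a 24 x 21 single-color image is 3 bytes per
# row times 21 rows, and images are stored on 64-byte boundaries, so the
# sprite pointer is just (data address / 64) within the video chip's window.
ROWS, BYTES_PER_ROW = 21, 24 // 8
data_bytes = ROWS * BYTES_PER_ROW            # 63 bytes of actual image data
BLOCK_SIZE = 64                              # images sit on 64-byte boundaries

def sprite_data_address(pointer_value):
    return pointer_value * BLOCK_SIZE        # offset within the video chip's 16 K window

print(data_bytes)                            # 63
print(hex(sprite_data_address(13)))          # 0x340 -- block 13, a traditional home for sprite data
```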

There is also the inevitable multicolor sprite, which gives us three foreground colors to work with at the expense of half of our horizontal resolution. In this mode, the sprite is effectively just 12 X 21 pixels, but each pixel is now twice as wide as before, resulting in the same physical width on the screen. As in multicolor character mode, the second and third colors are fixed across all sprites in this mode.

A sprite can be pointed to different addresses in memory for its image between screen paints, creating the possibility of making animated sprites which cycle through a sequence of frames, page-flip style. Likewise, single- and multicolor sprites can be placed together and moved in lockstep to create larger or more complex onscreen figures. In the sequence above, the runner is made from three single-color sprites, each of which cycles through 14 frames of animation. (If you’ve played Impossible Mission, he may look familiar to you: he is in fact the same sprite as your avatar in that game, which Dennis Caswell happily shared with his colleagues.) The flames are four multicolor sprites, each with four frames of animation. And each of the eight doves is a single single-color sprite of eight animation frames.

But… again, wait. That’s far more than eight sprites in total, isn’t it? As you may have guessed, Landrum uses raster interrupts to reconfigure and thus reuse sprites as each screen paint proceeds. With the addition of such tricks the 64’s effective limit becomes not eight sprites in total but no more than eight sprites horizontally parallel with one another.
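The rule of thumb, then, is that the trick works as long as no single scanline ever needs more than eight sprites at once. Here is a quick feasibility check as a sketch, with the sprite height taken from the hardware and the vertical positions invented for illustration:

```python
# Feasibility check for sprite reuse: the hardware shows at most eight
# sprites on any one scanline, but a raster interrupt can re-aim a sprite
# once the beam has passed it. Count the worst-case overlap for a set of
# (top_line, height) sprites. Positions below are invented.
def max_sprites_per_line(sprites, screen_height=200):
    per_line = [0] * screen_height
    for top, height in sprites:
        for line in range(top, min(top + height, screen_height)):
            per_line[line] += 1
    return max(per_line)

doves  = [(10 + i * 6, 21) for i in range(8)]   # eight doves staggered near the top
flames = [(70, 21)] * 4                         # four flame sprites side by side
runner = [(120, 21)] * 3                        # three sprites forming the runner

worst = max_sprites_per_line(doves + flames + runner)
print(worst, "sprites on the busiest scanline; fine as long as this stays at 8 or below")
```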

Let’s take another example, this time one showing an actual, interactive event in action: Stephen Landrum’s pole vault. I have my usual mediocre performance in the clip that follows, but my wife Dorte kicks some ass and actually demolishes our old world record.

The screen you see here is another multicolor bitmap. The vaulter is made up of three single-color sprites, which cycle through seven frames of animation as he runs and are then changed appropriately to reflect his state after he goes airborne. The pole is three single-color sprites and the crossbar is a single multicolor sprite, as is, surprisingly and cleverly, the stationary top of the nearer (right-hand) upright. To understand this last, we have to understand the 64’s concept of sprite priority. Sprites are numbered from 0 to 7. If two sprites overlap one another, the sprite with the lower number is drawn on top of the one with the higher number. Landrum uses this property to easily create the illusion of the jumper passing behind the nearer upright as he soars through the air.

You might have noticed that the pole, the crossbar, and the upright are all quite large. This is down to yet another feature of the 64’s sprite system. It’s possible to expand a sprite vertically or horizontally or both, doubling its size (but not its resolution).

The pole vault is not quite as polished as most of the events, which may be a sign that, as one of the last events to be completed, it was a bit rushed. There’s some odd artifacting in the pole, for instance. And there’s a wonderful bug that lets you vault under the crossbar on its highest setting, creating a world record for the ages.



The two swimming events, which were started by Randy Glover but finished by Landrum following the former’s abrupt resignation, are the most complex in Summer Games. They’re largely an exercise in rhythm; you have to press the joystick button as your swimmer’s arms enter the water, then release it when they emerge. I’m awful at it, but Dorte is pretty good.



The clock at top right is formed from six single-color sprites, each swimmer from four. The rest of what you see here may begin to illustrate how crazy you can get with raster interrupts. Each paint begins with the 64 in single-color bitmap mode. This allows the text (“Ready… Set… Go!”), which is drawn and erased directly into the bitmap, to be rendered in the higher resolution. But then, just as the electron gun reaches the top of the stands, the screen is changed to a multicolor bitmap.

Glover and Landrum use a technique known as double buffering to make the scrolling as smooth as possible. There are actually two bitmaps in memory, one of which is always being displayed and the other of which is being updated by the CPU for the next step in the scroll. When the time comes, the two are swapped, as the 64’s VIC-II graphics chip is pointed to the other in the pair. Well, it’s almost that simple. Complications arise because the poor 6502 just doesn’t have time to completely redraw a screen in memory for every pixel of scroll. Luckily, it doesn’t have to. The VIC-II also has what are known as horizontal and vertical fine-scrolling registers. They allow the programmer to shift the bitmap that appears onscreen by from 1 to 7 pixels to the right (as in the swimming events) or down. Since this will create an ugly empty zone at the edges of the display for which the computer has no pixel data to display, another register lets the programmer expand the size of the border slightly to cover these cells — the width of the screen is reduced from 40 to 38 columns, or the height from 25 to 24 lines. Now it’s possible for Glover and Landrum to scroll the screen eight pixels before having to swap to the alternate bitmap, giving the CPU time to make said bitmap. Double buffering is rather unusual to find on the 64, as it’s horrendously expensive in memory. And indeed, the swimming events use virtually every last byte.
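Stripped of the hardware details, the combination of fine scrolling and double buffering boils down to a loop like the following. This is a conceptual sketch of the technique just described, not Epyx’s actual code; the buffer names and pixel count are invented.

```python
# Conceptual double-buffered fine scrolling: nudge the fine-scroll register
# one pixel per frame; after eight pixels, point the video chip at the other
# bitmap (which the CPU has meanwhile redrawn eight pixels further along)
# and reset the register.
def scroll(total_pixels):
    visible, hidden = "bitmap A", "bitmap B"
    fine_scroll = 0
    for pixel in range(1, total_pixels + 1):
        fine_scroll = (fine_scroll + 1) % 8
        # ...the CPU spends these frames redrawing `hidden` for the next swap...
        if fine_scroll == 0:
            visible, hidden = hidden, visible
            print(f"after pixel {pixel:2d}: swap buffers, now showing {visible}")

scroll(24)
```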

But that’s probably enough tech talk for today. Just for fun — and because if you got through all that you’ve earned it — let’s look at the other events in somewhat less exhaustive (exhausting?) detail.

The two running events have their origin in Starpath’s old Supercharger decathlon project, but were brought to the 64 and completed by Brian McGhie. Like virtually everyone at Epyx, he had no particular knowledge of or burning interest in Olympic sports. He therefore relied on a stack of old Sports Illustrateds to try to get the look of his runners and the stadium right. The events are very similar in appearance, but unlike the swimming events very different in execution. The 100 meter dash is a notorious joystick killer. You have to move the stick back and forth as quickly as possible — nothing more, nothing less. The 4 X 400 meter relay, by contrast, is the most cerebral of the events, a game of energy conservation and chicken. I’m unaccountably good at both, much to Dorte’s frustration.



Interestingly, the scrolling in these events is implemented in an entirely different way from that in the swimming, illustrating how very much Summer Games is really a collection of individual efforts brought together under one banner. McGhie uses a multicolor character screen, and rather than using double buffering updates the hidden border areas on the fly to… but I promised to stop with the tech talk, didn’t I?

The diving event is yet another of Landrum’s. The diver here rather disconcertingly never surfaces after entering the water, simply because Landrum ran out of time.



Skeet shooting was a joint project of John Leupp, Steve Mudry, and Randy Glover prior to his departure. They originally planned to show the shooter on the screen, as in all of the other events, but found it difficult to work out a practical way of implementing the event from that perspective. So skeet shooting received the only first-person perspective in Summer Games, and the poor shooter was left out entirely.



Finally, there’s the gymnastics event — really just a vault — by Mudry. In an example of the, shall we say, casual approach to box art that was so rife in this era, the Summer Games box shows someone doing a handstand.



If nothing else, this article has hopefully conveyed what a tricksy machine the Commodore 64 is, full of hidden capabilities and exploitable quirks. Learning to make it dance for you requires considerable time even if you have examples to follow. If you don’t… well, small wonder that its games were just beginning to come into their own in 1984, the year it had its second birthday. And Epyx and companies like it were barely scratching the surface. In a couple of years Summer Games would look downright quaint.

You can download the original Commodore 64 Summer Games and its manual from here if you like, for use in the emulator of your choice (I recommend VICE). Unlike most of the disk images floating around the Internet, this one is pristine, with the original set of world records, so you and your friends and/or family can make your own records — which is about 20% of the fun of playing Summer Games — rather than be shamed by the performances of obsessed teenagers from two or three decades ago.

We’ll continue to observe the Commodore 64 scene with interest in future articles. But next we’ll check in with a group of Atari 8-bit loyalists: the backwoods savants of Ozark Softscape.

(This article draws again from the Epyx retrospectives in the July 1988 and August 1989 issues of Commodore Magazine. Technical details of Summer Games were drawn from the Commodore 64 design case study which appeared in the March 1985 IEEE Spectrum. I also lifted the diagram showing the 64’s unusual bitmap mode from there. For what it’s worth, my favorite 64 technical reference is Mapping the Commodore 64 by Sheldon Leemon. And if I may be forgiven a blatant plug, do check out my book on the Amiga if you’re interested in the sort of technical details I’ve delved into in this post. Some of what I go into in the book actually applies equally to the 64, and I explain basic concepts, starting with what a bit and byte actually are, much more fully there.)

 
 


From Automated Simulations to Epyx

When Robert Botch joined Automated Simulations as director of marketing just as 1982 expired, it wasn’t exactly the sexiest company in the industry. They were still flogging their Dunjonquest line, which now consisted of no less than eleven sequels, spinoffs, and expansions to Temple of Apshai. More than a year after co-founder Jon Freeman had left in frustration over partner Jim Connelley’s refusal to update Automated’s technology, the entire line was still derived from the same BASIC-based engine that had first been designed to run on a 16 K TRS-80 back in 1979. It was hard for anyone to articulate why someone would choose to play a Dunjonquest game in a world that contained Ultima and Wizardry. And, indeed, Automated’s sales numbers were not looking very good, and the company had stopped making money almost from the moment that the Ultima and Wizardry series debuted. Still, that hadn’t prevented them from benefiting from the torrents of venture capital that entered the young industry in 1982, courtesy of the pundits who were billing home computers as the next big thing to succeed the game consoles. But now the investors were getting worried, wondering if this stodgy company and their somewhat pedantic approach to gaming had really been such a good risk after all. Thus Botch, whom Connelley hired under pressure to remake Automated’s image.

Botch’s first assignment was to visit the Winter Consumer Electronics Show in Las Vegas that January, with Dunjonquest titles in tow to display to the crowd on a big-screen television rented for the occasion. Botch, who knew nothing about computers or computer games, didn’t much understand the Dunjonquest concept. He could hardly be blamed, for just trying to figure out which one you could play was confusing as hell: you needed to already have Temple of Apshai to play these additional games, but needed Hellfire Warrior to play those, etc. He was therefore relieved when another employee handed him a disk containing a straightforward, standalone action/puzzle game for the Atari home-computer line called Jumpman, a sort of massively expanded version of the arcade classic Donkey Kong with thirty levels to explore. Unusually for Automated, who usually developed games in-house, its presence was the result of an unsolicited third-party submission from a hacker named Randy Glover.

Randy Glover, developer of Jumpman

Botch was such a computer novice that he couldn’t figure out how to boot the game; his colleague had to tell him to “put it in that little slot over by the computer.” But when he finally got it working he fell in love. The rest of the show turned into an extended battle of wills between Botch and Connelley. The latter, who was determined to showcase the Dunjonquest games, would “come over, yell a lot, and tell me to take the disk out. Whenever he left the room, I’d load the program in again.” The crowd seemed to agree with Botch: he left CES with a notebook full of orders for the as yet unreleased Jumpman, convinced that in it he had seen the only viable future for his new employers.

The embattled Connelley saw his power further eroded the following month, when the investors brought in Michael Katz, an unsentimental, hard-driving businessman with an eye for mainstream appeal. He had spent the past four years at Coleco, where he had masterminded the launch of some very successful handheld electronic games as well as the ColecoVision console, which had just sold more than 500,000 units in its first Christmas on the market. It was first agreed that Connelley and Katz would co-lead the company, but this was obviously impractical and untenable. In a scenario that could have easily happened to Ken Williams at Sierra if he had been less strong-willed and business-savvy, Connelley was being eased out of his own company by the monied interests he had welcomed with open arms. Seeing which way the wind was blowing, he left within months, taking a number of his loyalists with him to form a development studio he named The Connelley Group, which would release a couple of games through Automated before becoming free agents and eventually fading away quietly.

Katz, Botch, and the other newcomers were thus left alone to literally transform Automated Simulations into a new company. Automated had for some time now been branding many of their games with the label of “Epyx,” arrived at because their first choice, “Epic,” was already taken by a record label. No matter; “Epyx” was a better name anyway, proof that even a blind-to-PR squirrel like Connelley could find a nut every now and again. Katz and Botch now made it the official name of the reborn company, excising all trace of the stodgy old “Automated Simulations” name. Gone also would be the nerdy old Dunjonquest line which positively reeked of Dungeons and Dragons sessions in parents’ basements. They would instead strive to make Epyx synonymous with colorful, accessible games like Jumpman, aimed straight at the heart of the mass market. The old slogan of “Computer Games Thinkers Play” now became “Strategy Games for the Action-Game Player,” and they hired Chiat Day, Apple’s PR firm and the hottest agency in Silicon Valley, to remake Epyx’s image entirely.

Epyx

Jumpman itself made a good start toward that goal. It was a huge hit, especially once ported to the Commodore 64. One of the first games to really take proper advantage of the 64’s audiovisual capabilities, it hit that platform like a nova at mid-year, topping the sales charts for months and probably becoming the bestselling single Commodore 64 game of 1983. It alone was enough to return Epyx to profitability. Unsurprisingly given commercial returns like that, from now on Epyx would develop first and most for the 64. They also hired Glover to work in-house. Before the end of the year he had already delivered a cartridge-based pseudo-sequel, Jumpman Junior, to reach ultra-low-end systems without a disk drive.

But now Katz had a problem. Other than Glover, he lacked the technical staff to make the Jumpmans of the future. Most of them had left with Connelley — and anyway games like their old Dunjonquests were exactly what the new Epyx didn’t want to be making. Then Starpath caught Katz’s eye.

Back in 1981, two former Atari engineers, Bob Brown and Craig Nelson, had founded Arcadia, Inc., eventually to be renamed Starpath after the release of the Arcadia 2001, an ill-conceived and short-lived games console from Emerson Radio Corporation. Drawing from friends, family, and former colleagues, Brown and Nelson put together a crack team of hardware and software hackers to make their mark in the Atari VCS market. Their flagship product was the marvelously Rube Goldberg-esque Supercharger, which plugged into the VCS’s cartridge port and added 6 K of memory (which may not sound like much until you remember that the VCS shipped with all of 128 bytes), new graphics routines in ROM, and a cable to connect the console to a cassette player. Starpath developed and released half a dozen games on cassette for use with the Supercharger, most of them apparently quite impressive indeed. But problems dogged Starpath. The company lived in constant fear of legal action by Atari, whom Brown and Nelson had not left on particularly good terms, in response to their unauthorized expansion. It did eventually become clear that Starpath had little to fear from Atari, but for the worst possible reason: the videogame market was collapsing, and Atari had far bigger problems than little Starpath. By late 1983 Starpath was floundering. Katz swooped, buying the entire company for a song and moving them lock, stock, and barrel from Santa Clara, California, into Epyx’s headquarters in nearby Sunnyvale.

Katz had no interest in any of Starpath’s extant products for a dead Atari VCS market. No, he wanted the programming talent and creative flair that had led to the Supercharger and its games in the first place. If they could do work like that on the Atari VCS, imagine what they could do with a Commodore 64. The Starpath folks would prove to be the final, most essential piece in his remaking of Epyx.

One of Starpath’s programmers, Dennis Caswell, had been playing around with ideas for a platforming action-adventure game before the acquisition. Indeed, he was already at work trying to animate the running man who would be the star. It was decided to let Caswell, who had three Supercharger games under his belt, run with his idea on the Commodore 64. He says his elation at the platform change was so great that “I unplugged my [Atari] 2600 and threw it out of my office and into the hall.” Working essentially alone, Caswell crafted one of the iconic Commodore 64 games and one of the bestselling in the history of Epyx: Impossible Mission.

Starpath had also been working on a decathlon simulation. In fact, it was far enough along to be basically playable. They discussed porting it to the 64, but the capabilities of that machine quickly led them to think about something more than just a simulation of track and field. Why not use the luxury of 64 K of memory and disk-based storage to simulate a broader cross-section of Summer Olympic events? With the 1984 Summer Olympics coming to Los Angeles, it seemed the perfect game for the zeitgeist, with exactly the sort of mass-market appeal Katz wanted from his new titles. He thought it a brilliant idea, and even went so far as to approach the Olympic Committee about making it an officially licensed product. He found, however, that Atari had long before sewn up the rights, back when they had been the fastest growing company in America. Epyx therefore decided to do everything possible to associate the game with the Olympics without outright declaring it to be an official Olympics simulation. They pushed the envelope pretty far: the game would be called Summer Games, would begin with an opening ceremony and a runner lighting a flame to the strains of “Bugler’s Dream,” would offer medals, would (as its advertising copy proclaimed) let you “go for the gold!” representing the country of your choice. Such legal boundary-pushing became something of a habit; witness Impossible Mission, which plainly hoped to benefit from an association with Mission: Impossible. (This in spite of the fact that Scott Adams had already been forced by the lawyers to change the name of his third adventure from Mission Impossible to Secret Mission.) In the case of Summer Games, Epyx likely got away with it because Atari was in no financial shape to press the issue and the Olympic Committee, never the most progressive institution, was barely aware of home-computer games’ existence. To this day many people are shocked to realize that Summer Games is not actually an official Olympics game. It all speaks to Katz’s determination to create games that felt up-to-date and relevant to the times. Yes, sometimes that could backfire, leading to trying-way-too-hard titles like Break Dance. Much of the time, however, it was commercial gold.

The original design brief for Summer Games called for ten events. The team also very much wished to include head-to-head, real-time competition wherever the nature of the sport being simulated allowed it. Beyond that, they would pretty much make it up on the fly; even the events themselves were largely chosen in the moment. The Starpath programmers’ talents were augmented by Randy Glover of Jumpman fame and Epyx’s first full-time artist, Erin Murphy. They were all under the gun from the start, for Katz wanted them to have something ready to show at the 1984 Winter CES, barely six weeks away when the project was officially green-lit. They worked through the holidays to deliver. Epyx arrived at CES with a very impressive albeit non-interactive opening-ceremonies sequence, fairly playable 4 X 400-meter relay and 100-meter dash races (both partially adapted from Starpath’s old decathlon project), and a diving event. At the show they learned that they had more competition in the (pseudo-)Olympics genre beyond Atari. HESWare, an aggressive up-and-comer not that dissimilar to Epyx who were about to sign Leonard Nimoy as their spokesman, showed HES Games. The prospect pushed Epyx to make sure Summer Games both met its planned pre-Summer Olympics release date and was as good as they could make it. To help with the former, the original plan for ten events was reduced to eight, principally via the sacrifice of weight lifting (fans of which sport would have to wait until 1986’s World Games to get their due). To help with the latter, more resources and personnel were poured into the project.

Even as this happened, attrition, a constant at Epyx, also became a concern. Katz’s new Epyx could be a rewarding place, but also an unrelentingly intense and competitive one, full of mathematical athletes convinced they were the smartest people in the room and all too happy to demonstrate it at their rivals’ expense. The spirit of competition extended beyond working hours; hundreds of dollars changed hands weekly in epic games of poker. Even some of Epyx’s brightest stars eventually found the company’s testosterone- and brainpower-fueled culture too much to take. Thus Starpath co-founder Bob Brown, finding Starpath’s new masters not to his liking, left quite soon after the acquisition, and Randy Glover, who had been assigned to the swimming events, abruptly left not long after CES. The swimming events were taken up by Stephen Landrum, the biggest single contributor to the project as a whole, who also did the opening ceremonies and the diving and pole-vaulting events.

It had been decided early on that Summer Games would let you compete as the representative of any of a variety of nations, complete with flags and national anthems to play during the medal ceremonies. Since it obviously would not be possible to include all of the 140 countries who would participate in the real Olympics, Epyx was left with the question of which ones should make the cut. Beyond the big, obvious powerhouses of the United States, the Soviet Union, and China, commercial considerations once again reigned supreme here. Katz had begun signing deals with foreign distributors, pushing hard to get Epyx’s games into the vibrant British and steadily emerging Western European software markets. Epyx reasoned that players in these countries would want the opportunity to represent their own nation. Thus relative Summer-Olympic non-factors like Norway and Denmark were included in the game, while potent teams from parts of the world that didn’t buy computer games, like East Germany, Romania, and Yugoslavia, were omitted. Most of the countries included had never been visited by anyone at Epyx. They sourced the flag designs from a world atlas, and called consulates and sales connections in Europe to drum up sheet music for the various anthems. Many of those anthems had never been heard by anyone working on the game; if some sound a bit “off” in tone or tempo, perhaps that’s the reason. For the coup de grâce, Epyx couldn’t resist including their own company as one of the “nations,” complete with a national anthem that was actually the Jumpman theme.

Summer Games was nearing the final crunch time on May 8, 1984, when the Soviet Union initiated a boycott of the Los Angeles Games in a rather petty quid pro quo for the West’s boycott of the 1980 Moscow Games. (The people who were really hurt by both gestures were not the governments of the boycotees but a generation of athletes on both sides of the political divide, who lost what was for many literally a once-in-a-lifetime opportunity to compete against the true best of their peers on the biggest stage their sports could offer them.) Epyx quickly decided to leave the Soviet Union in their version of the Olympics. After the game’s release, they reached out a bit cheekily to the Soviets in real life. Botch:

We sent the Russian [read: Soviet] embassy (in Washington, D.C.) several copies of Summer Games for the Commodore 64. An enclosed letter stated since they would not be competing in the regular Olympics, at least they could participate in our version of the Games. This package was eventually returned to us with a thank-you note, because they only had access to Atari home computers. Our marketing people quickly replaced the Commodore software with Atari material and sent it back. I always wondered if they enjoyed the game, because we never heard from them again.

Epyx’s bigger concern was the same as that of everyone involved with the Los Angeles Games, whether directly or tangentially: what commercial impact would the boycott have? It seemed it must inevitably tarnish the Games’ luster somewhat. In the case of both Summer Games and the Olympic Games themselves, the impact would turn out to be less than expected. The latter has gone down in history as the most financially successful Olympics of modern times, while Summer Games would become — and this probably comes as anything but a spoiler to most of you — one of the bestselling computer games of the year, and the first entry of the bestselling series in the history of the Commodore 64.

Katz was determined to get Summer Games out in June, to beat HES Games to the market and to derive maximum advantage from the pre-Olympics media buildup. The team worked frantically to finish the final two events (gymnastics and skeet shooting) and swat bugs. They worked all but straight through the final 72 hours. Disks went into production right on schedule, the morning after the code they contained had been finalized.

Summer Games

Summer Games went on to sell in the hundreds of thousands across North America and Europe, thoroughly overshadowing the less impressive Olympian efforts of HESWare and Atari, the latter of whose games were at any rate only available on their own faltering lines of game consoles and home computers. It would be ported to a variety of platforms, although it would always remain at its best on the Commodore 64. Together with Impossible Mission and a racing game developed by the indefatigable Landrum and Caswell called Pitstop II, both also huge worldwide smashes, Summer Games completed the remaking of Epyx’s image and made of them a worldwide commercial powerhouse. Being for the most part conceptually simple and without much dependence on text, Epyx’s games were ideally suited to do well in non-English-speaking countries. Combined with Katz’s aggressive distributional push, this was key to making Epyx one of the first big entertainment-software publishers that could be said to be truly international. With so many potential customers to serve in emerging new markets and several new hits in addition to the still popular Jumpman, sales in 1984 soared, with Epyx enjoying almost exponential growth in earnings as the months passed.

We’ll continue the story of Epyx later, but for now I’m not quite done with Summer Games. Next time I’d like to do something I haven’t done in a while: dig into the technology a bit and explain how some of the magic that wowed so many back in 1984 actually works. It will also give us a chance to get to know the Commodore 64, a computer whose importance to gaming during the middle years of the 1980s can hardly be overstated, just a little bit better.

(The bulk of this article is drawn from two lengthy retrospectives published in the July 1988 and August 1989 issues of Commodore Magazine. The picture of Randy Glover comes from the April 1984 K-Power.)

 
 


A Computer for Every Home?

On January 13, 1984, Commodore held their first board of directors meeting of the year. It should have been a relaxed, happy occasion, a time to make plans for the new year but also one last chance to reflect on a stellar 1983, a year in which they had sold more computers than any two of their rivals combined and truly joined the big boys of corporate America by reaching a billion dollars in gross sales. During the last quarter of 1983 alone they had ridden a spectacular Christmas buying season to more than $50 million in profits. Commodore had won the Home Computer Wars convincingly, driving rival Texas Instruments to unconditional surrender. To make the triumph even sweeter, rival Apple had publicly announced the goal of selling a billion dollars worth of their own computers that year, only to fall just short thanks to the failure of the Lisa. Atari, meanwhile, had imploded in the wake of the videogame crash, losing more than $500 million and laying off more than 2000 workers. Commodore had just the previous summer moved into a sprawling new 585,000 square-foot, two story headquarters in West Chester, Pennsylvania that befitted their new stature. (Some of the manufacturing spaces and warehouses in the place were so large that Commodore veterans insist today that they had their own weather.) Yes, it should have been a happy time at Commodore. But instead there was doubt and trepidation in the air as executives filed into the boardroom on that Friday the 13th.

A day or two before, Jack Tramiel had had a heated argument with Irving Gould, Commodore’s largest shareholder and the man who controlled his purse strings, in the company’s private suite above their exhibit at the 1984 Winter Consumer Electronics Show. That in itself wasn’t unusual; these two corrupt old bulldogs had had an adversarial relationship for almost two decades now. This time, however, observers remarked that Gould was shouting as much as Tramiel. That was unusual; Gould normally sat impassively until Tramiel exhausted himself, then quietly told him which demands he was and wasn’t willing to meet. When Tramiel stormed red-faced out of the meeting and sped away in the new sports car he’d just gotten for his 55th birthday, it was clear that this was not just the usual squabbling. Now observers outside the board-of-directors meeting, which was being chaired as usual by Gould, saw him depart halfway through in a similar huff. He would never darken Commodore’s doors again.

No one who was inside that boardroom has ever revealed exactly what transpired there. With Gould and Tramiel both now dead and the other former board members either dead or aged, it’s unlikely that anyone ever will. On the face of it, it seems hard to imagine. What could cause these two men who had managed to stay together through the toughest of times, during which Commodore had more than once teetered on the edge of bankruptcy, to irrevocably split now, when their company had just enjoyed the best year in its history? We can only speculate.

Commodore had ceased truly being Tramiel’s company in 1966, when Gould swooped in to bail him out from the Atlantic Acceptance scandal of the previous year. Tramiel, however, never quite got the memo. He continued to run the company like a sole proprietor to whatever extent that Gould would let him. Tramiel micro-managed to an astonishing degree. He did not, for instance, believe in budgets, considering them a “license to steal,” a guarantee that the responsible manager, knowing he had X million available, would always spend at least X million. Instead he demanded that every expenditure of greater than $1000 be approved personally by him, with the result that much of the company ground to a halt any time he took a holiday. Even as Tramiel enjoyed his best year ever in business, Gould and others in the financial community were beginning to ask the very reasonable question of whether this was really a sustainable way to run a billion-dollar company.

Still, the specific cause of Tramiel’s departure seems likely to have involved his sons. Tramiel valued family above all else, and, like a typical small businessman, dreamed of leaving “his” company to his three sons. Whether by coincidence or something else, it even worked out that each son had an area of expertise that would be critical to running a company like Commodore. Sam, the eldest, had trained in business management at York University, while Gary, the youngest, was a financial analyst with a degree from Manlow Park College and experience as a stockbroker at Merrill Lynch. Leonard, the middle child, was the intellectual and the gearhead; he was finishing a PhD in astrophysics at Columbia, and was by all accounts quite an accomplished hardware and software hacker. Sam and Gary already worked for Commodore, while Leonard planned to start as soon as he finished his PhD in a few more months. Various witnesses have claimed that Tramiel the elder now wished to begin more actively grooming this three-headed monster to take more and more of his responsibilities, and someday to take his place. Feeling nothing good could come out of such blatant nepotism inside a publicly traded corporation that was trying to put its somewhat seedy history behind it, Gould refused absolutely to countenance such a plan. Given Tramiel’s devotion to his family and his attitude toward Commodore as his personal fiefdom, it does make a degree of sense that this particular rejection might have been more than he could stomach.

In any case, Tramiel was gone, and Gould, who had made his fortune in the unglamorous world of warehousing and shipping and was reportedly both a bit jealous of Tramiel’s high profile in an exciting, emerging industry and a bit embarrassed by his gruff, untutored ways, didn’t seem particularly distraught about it. The man he brought in to replace him could hardly have been more different. Marshall F. Smith was a blandly feckless veteran of boardrooms and country clubs who had spent his career in the steel industry. It’s hard to grasp just why Gould latched onto Smith of all people. Perhaps he was following the lead of Apple, who the previous year had brought in their own leader from outside the computer industry, John Sculley. Sculley, however, understood consumer marketing, having cut his teeth at Pepsi, where he was the mastermind behind the Pepsi Challenge, still one of the most iconic and effective advertising campaigns in the long history of the Cola Wars. The anonymous world of Big Steel offered no comparable experience. Smith’s appointment was the first of a long string of well-nigh incomprehensible mistakes Gould would make over the next decade. Engineers that were initially thrilled to have proper funding and actual budgets at last were soon watching with growing concern as Smith puttered about with a growing management bureaucracy and let the company drift without direction. Many were soon muttering that it’s often better to make a decision — even the wrong decision — than to just let things hang. Whatever else you could say about Jack Tramiel, he never lacked the courage of his convictions.

Commodore’s first significant new models, which reached stores at last in October of 1984, more than two years after the Commodore 64, hardly did much to inspire confidence in the new regime. Nothing about the Commodore 16 and the Plus/4 made any sense at all. The 16 was an ultra-low-end model with just 16 K of memory, long after the time for such a beast had passed. The trend in even inexpensive 8-bit computers was onward, toward the next magic number of 128 K, not backward to the late 1970s.

The Commodore Plus/4

As for the Plus/4, which like the 64 was built around a variant of the 6502 CPU and had the same 64 K of memory but was nevertheless incompatible… well, it was the proverbial riddle wrapped in a mystery inside an enigma. It was billed as a more “serious” machine than the 64, a computer for “home and business applications” rather than gaming, and priced to match at about $300, more than $100 more than the 64. It featured four applications built right into its ROM (thus the machine’s name): a file manager, a word processor, a spreadsheet, and a graphing program. All were pathetically sub-par even by the standards of Commodore 64 applications, hardly the gold standard in business computing. The Plus/4 lacked the 64’s sprites and SID sound chip, which made a degree of sense; for a dismaying number of years yet a lack of audiovisual capability would be taken as a signifier of serious intent in computing. But why did it offer more colors, 128 as opposed to the 64’s 16? And as an allegedly more serious computer, why didn’t it offer the 80-column display absolutely essential for comfortable word processing and other typical productive tasks? And as a more serious (and expensive) computer, why did it have a rubbery keyboard almost as awful to type on as the IBM PCjr’s Chiclet model? And would all those serious, more productive buyers really be doing a lot of BASIC programming? If not, why was one of the main selling points a much better BASIC than the bare-bones edition found in the 64? Info, a magazine that would soon build a reputation for saying the things about Commodore’s bizarre decisions that nobody else would, gave the Plus/4 a withering review:

The biggest problem with the Plus/4 is the fundamental concept: an 8-bit, 64 K, 40-column desktop personal computer. Commodore already makes the best 8-bit, 64 K, 40-column desktop personal computer you can buy, with literally thousands of products supporting it! Why should consumers want a “new” machine with no significant advances, several new limitations, and virtually no third-party product support? And why would a company with no competition in the under-$500 category bring out an incompatible [machine] that can’t compete with anybody’s machine except their own? It just doesn’t compute!

Info ran a wonderfully snarky contest in the same issue, giving away the Plus/4 they’d just reviewed. After all, it was “sure to become a collector’s item!” Even the more staid Compute!’s Gazette managed to flummox a poor Commodore representative with a single question: “Why buy a 264 [a pre-release name for the Plus/4] instead of a 64 that has a word processor and, say, a Simon’s BASIC? It would be the equivalent of the 264 for less money.” Commodore happily claimed that the Plus/4 had enough utility built right in for the “average small business” (maybe they meant one of the vast majority that fail within a year or two anyway), but in reality it seemed like it had been cobbled together from spare parts that Commodore happened to have lying around. In fact, that’s not far from what happened — and Tramiel actually bears as much responsibility for the whole fiasco as the clueless Marshall Smith.

Tramiel, you’ll remember, had driven away the heart of his engineering team in his usual hail of recriminations and lawsuits shortly after they had created the 64 for him. He did eventually find more talented young engineers, notably Bil Herd and Dave Haynie. (Commodore always preferred their engineers young and inexperienced because that way they didn’t have to pay them much — a strategy that sometimes backfired but was sometimes perversely successful, netting them brilliant, unconventional minds who would have been overlooked by other companies.) When Herd arrived at Commodore in early 1983, engineers had been tinkering for some time with a new video and audio chip, the TED (short for Text Display). With engineering straitened as ever by Tramiel’s aversion to spending money, the 23-year-old Herd soon found himself leading a project to make the TED the heart of a new computer, despite the fact that it was in some ways a step back, lacking the sprites of the 64’s VIC-II chip and the marvelous sound capabilities of its SID chip. Marketing came up with the dubious idea of including applications in ROM, which by all accounts delighted Tramiel.

Tramiel, who at some fundamental level still thought of the computers he now sold like the calculators he once had, failed to grasp that the whole value of a computer is the ability to do lots of different things with it, to have lots and lots of options its designers may never have anticipated, all through the magic of software. Locking applications into ROM, making them impossible to replace or update, was kind of missing the point of building a computer in the first place. Failing to understand that a computer is only as good to consumers as the quality and variety of its available software, Tramiel also saw no problem with making the new machine incompatible with the 64. It seems to have come as a complete surprise to him when the machine was announced at that fateful Winter CES and everyone’s first question was whether they could use it to run the Commodore 64 software they already had.

After Tramiel’s abrupt departure, Commodore pushed ahead with the 16 and Plus/4 in the muddled way that would be their wont for the rest of the company’s life, despite a skeptical press and utterly indifferent consumers. It all made so little sense that some have darkly hinted at a conspiracy hatched by Tramiel amongst his remaining loyalists at Commodore to get the company to waste resources, time, and credibility on these obvious losers. (Tramiel recruited a substantial number of said loyalists to join him after he purchased Atari and got back in the home-computer game — exactly the sort of thing for which he so often sued others. But that’s a story for a later article.) Incredibly, given the cobbled-together nature of the machine, it took nine more months after that CES to finally get the 16 and Plus/4 into production and watch them duly flop. Again, such a glacial pace would prove to be a consistent trait of the post-Tramiel Commodore.

By the time they did appear at last, the poor, benighted 16 and Plus/4 had more working against them than just their own failings, considerable as those may have been. The year as a whole was marked by failures in the home-computer segment of the market. Atari was reeling. Coleco was taking massive losses on their tardy entry into the home-computing field, the Adam. And of course I’ve already told you about the IBM PCjr.

Even Apple, who had enjoyed a splashy, successful launch of their new higher-end Macintosh (another story for a later date), had a somewhat disappointing new model amongst their bread-and-butter Apple II line. The “c” in the Apple IIc’s name stood for “compact,” and it was indeed a much smaller version of Steve Wozniak’s old evergreen design. Like the Macintosh, it was a closed system designed for the end user who just wanted to get work (or play) done, not for the hackers who had adored the earlier editions of the II with their big cases and heaps of inviting expansion slots. The idea was that you would get everything you, the ordinary user, really needed built right in: all of the fundamental interface cards, a disk drive, a full 128 K of memory (as much as the Macintosh), etc. All you would really need to add to have a nice home-office setup was a monitor and a printer.

The Apple IIc

But the IIc was not envisioned just as a more practical machine: as the only II model after the first with which Steve Jobs played an important role, it evinced all of his famous obsession with design. Indeed, much of the external look and sensibility that we associate with Apple today begins as much here as with the just slightly older — and, truth be told, just slightly clunkier-looking — first Macintosh model. The Apple IIc was the first product of what would turn into a longstanding partnership with the German firm Frog Design. It marks the debut of what Apple referred to as the “Snow White” design language — slim, modern, sleek, and, yes, white. Everything about the IIc, including the packaging and the glossy manuals inside, oozed the same chic elegance.

Apple introduced the IIc at a lavish party and exhibition in San Francisco’s Moscone Center in April of 1984, just three months after a similar shindig to launch the Macintosh. The event’s name, “Apple II Forever,” was chosen to mollify restless Apple II owners who feared — rightly so, as it would turn out; even at “Apple II Forever” Jobs made time for a presentation on “The First 100 Days of Macintosh” — that Sculley, Jobs, and their associates had little further interest in them. Geniuses that they have always been for burnishing their own myths, Apple built a museum right there in the conference center, its centerpiece a replica of the garage where it had all begun. The IIc unveiling itself was an audiovisual extravaganza featuring three huge projection screens for the music video Apple had commissioned for the occasion. The most dramatic and theatrical moment came when Sculley held the tiny machine above him onstage for the first time. As the crowd strained to see, he asked if they’d like a closer look. Then the house lights suddenly came up and every fifth person in the audience stood up with an Apple IIc of her own to show and pass around.

Apple confidently predicted that they would soon be selling 100,000 IIcs every month on the strength of the launch buzz and a $15 million advertising campaign. In actuality the machine averaged just 100,000 sales per year over its four years in Apple’s product catalogs. The old, ugly IIe outsold its fairer sibling handily. This left Apple in a huge bind for a while, for they had all but stopped production of the IIe in anticipation of the IIc’s success while wildly overproducing IIcs for a rush that never materialized. Thus for some time stores were glutted with the IIcs that consumers didn’t want, while the IIes that they did want were nowhere to be found. (It’s interesting to consider that the PCjr almost certainly sold more units than the IIc, which has never been tarred with the label of outright flop, during each machine’s first year on the market. Narratives can be funny things.)

It remains even today somewhat unclear why the world never embraced the IIc as it had the three Apple II models that preceded it. There’s some evidence to suggest that consumers, not yet conditioned to expect each new generation of computing technology to be both smaller and more powerful than the previous, took the IIc’s small size to be a sign that it was not as serious or powerful as the IIe. Apple was actually aware of this danger before the IIc debuted. Thus the advertising campaign worked hard to explain that the IIc was more powerful than its size would imply, with the tagline, “Announcing a technological breakthrough of incredible proportions.” Yet it’s doubtful whether this message really got through. In addition, the IIc was, like the PCjr, an expensive proposition for the home-computer buyer: almost $1300, $300 more than a basic IIe. For that price you got twice the memory of the IIe as well as various other IIe add-on options built right in, but the value of all this may have been difficult for the novice buyer, the IIc’s main target, to grasp. She may just have seen that she was being asked to pay more for a smaller and thus presumably less capable machine, and gone with the bigger, more serious-looking IIe (if anything from Apple).

Then again, maybe the IIc was just born under a bad sign. As I’ve already noted, nobody was having much luck with their new home computers in 1984, almost regardless of their individual strengths and weaknesses.

But why was this trend so universal? That’s what people inside the industry and computer evangelists outside it were asking themselves with increasing urgency as the year wore on. As 1984 drew toward a close, the inertia began to affect even the most established warhorses, the Commodore 64 and the Apple IIe. Both Commodore and Apple posted disappointing Christmas numbers, down at least 20% from the year before, and poor Commodore, now effectively a one-product company completely reliant on continuing sales of the 64, sunk back well below that magic billion-dollar threshold again. In the grand scheme of things the Commodore 64 was still a ridiculously successful machine, by far the bestselling computer in the world and the preeminent gaming platform of its era. Yet there increasingly seemed to be something wrong with the home-computer revolution as a whole.

Commodore 64 startup screen

The fact was that a backlash had been steadily building almost from the moment that the spectacular Christmas 1983 buying season had ended. Consumers had begun to say, and not without considerable justification, that home computers promised far more than they delivered. Watching all those bright, happy faces in television and print advertising, people had bought computers expecting them to do the things that the computers there were doing. As Commodore’s advertising put it, “If you’re not pleased with what’s on your TV set tonight, simply turn on your Commodore 64.” Yet what did you get when you turned on your 64 — after you figured out how to connect it to your TV in the first place, that is? No bright fun, just something about 38,911 somethings, a READY prompt, and a cryptically blinking cursor. Everything about television was easy; everything about computers was hard. Computers had been sold to consumers like any other piece of consumer electronics, but they were not like any other piece of consumer electronics. For the vast majority of people — those who had no intrinsic fascination with the technology itself, who merely wanted to do the sorts of things those families on TV were doing — they were stubborn, frustrating, well-nigh intractable things. Ordinary consumers were dutifully buying computers, but computers were at some fundamental level not yet ready for ordinary consumers.

The computer industry was still unable to really answer the question which had dogged and thwarted it ever since Radio Shack had run the first ads showing a happy housewife sorting her recipes on a TRS-80 perched on the kitchen table: why do I, the ordinary man or woman with children to feed and a job to keep, need one? Commodore had cemented the industry’s go-to rhetoric with the help of William Shatner in their VIC-20 advertising campaign that first carved out a real market segment for home computers. You needed a computer for productivity tasks and for your children’s future, “Johnny can’t read BASIC” having replaced “Johnny can’t read” as the marker of a neglectful parent. Entertainment was relegated to an asterisk at the end: “Plays great games too!”

Yet, honestly, how productive could you really be with even the Commodore 64, much less the 5 K VIC-20? Some people did manage to do productive things with their 64s, but most of those who did forgot or decided not to ask themselves a simple question: is doing this on the computer really easier than the alternative? The answer was almost always no. Hobbyists chose to do things on the computer because it was cool, not because it was practical. Never mind if it took far more effort to keep one’s address book on the Commodore 64, what with its slow disk drive and quirky, unrefined software, than it would have to just have a paper card file. Never mind if it was much riskier as well, prone to deletion by an errant key swipe or a misbehaving disk drive. It was cooler, and that was all that mattered — to a technology buff. Most other people found it easier to address their Christmas cards by hand than to try to feed envelopes through a tractor-fed dot-matrix printer that made enough noise to wake the neighbors.

Perhaps the one potentially compelling productive use of a machine like the Commodore 64 in the home was as a word processor. Kids today can’t imagine how, back in the era of typewriters, students despaired when their teachers told them that a report had to be typed, can’t conceive how difficult it was to get anything on paper in typewritten form when every mistake made by untutored fingers meant trying to decide between pulling out the Liquid Paper or just starting all over again. But even word processing on the 64 was made so painful by the 40-column screen and manifold other compromises that there was room to debate whether the cure was worse than the disease. Specialized hardware-based word processors became hugely popular during this era for just this reason. These single-function, all-in-one devices were much more pleasant to use than a Commodore 64 equipped with a $30 program, and cheaper than buying a whole computer system, especially if you went with a higher-priced and thus more productively useful model like the Apple II.

The idea that every child in America needed to learn to program, lest she be left behind to flip burgers while her friends had brilliant careers, was also absurd on the face of it. It was akin to declaring during the days of the Model T that every citizen needed to learn to strip down and rebuild one of these newfangled automobiles. Basic computer literacy was important (and remains so today); BASIC literacy was not. What a child really needed to know could largely be taught in school. Parents needn’t have fretted if Junior preferred reading, listening to music, playing sports, or practicing origami to learning the vagaries of PEEKs and POKEs in BASIC 2.0. There would be time enough for computing when computing and Junior had both grown up a bit.
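For anyone who never spent an afternoon with the 64’s user’s guide: PEEK and POKE were BASIC 2.0’s blunt instruments for reading and writing raw memory addresses, and for a great many kids “learning to program” never got much further than typing in tricks like the little sketch below, which writes to the VIC-II’s well-documented border and background color registers at 53280 and 53281. It’s purely illustrative, the sort of thing the type-in magazines printed by the page, not a quote from any particular program.

```basic
10 REM POKE WRITES A BYTE STRAIGHT INTO MEMORY; PEEK READS ONE BACK
20 POKE 53280,0             : REM VIC-II BORDER COLOR REGISTER -> BLACK
30 POKE 53281,6             : REM VIC-II BACKGROUND COLOR REGISTER -> BLUE
40 PRINT PEEK(53281) AND 15 : REM THE LOW FOUR BITS READ BACK AS 6
```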

So, everything had changed yet nothing had changed since the halcyon days of the trinity of 1977. Computers were transforming the face and in some cases the very nature of business, yet there remained just two compelling reasons to have one in the home: 1) for the sheer joy of hacking or 2) for playing games. Lots more computers were now being used for the latter than the former, thanks to the vastly more and vastly better games that were now available. But for many folks games just weren’t a compelling enough reason to own one. The Puritan ethic that makes people feel guilty about their pleasures was as strong in America then as it remains today. It certainly didn’t help that the media had been filled for several years now with hand-wringing about the effect videogames were having on the psyches of youngsters. (This prompted many software publishers of this period to work hard, albeit likely with limited success, to label their computer games as something different, something more cerebral and rewarding and even, dare we say it, educational than their simplistic videogame cousins.)

But, perhaps most of all, computers still remained quite expensive when you really dug into everything you needed for a workable system. Yes, you could get a Commodore 64 for less than $200 by the Christmas of 1983. But then you needed a disk drive ($220) if you wanted to do, well, much of anything with it; a monitor ($220) if you wanted a nice picture and didn’t want to tie up the family television all the time; a printer ($290) for word processing, if you wanted to take that fraught plunge; a modem ($60) to go online. It didn’t take long until you were approaching four digits, and that’s without even entering into a discussion of software. There was thus a certain note of false advertising in that sub-$200 Commodore 64. And because these machines were being sold through mass merchandisers rather than dealers, there was no one who really knew better, who could help buyers to put a proper system together at the point of sale. Consumers, conditioned by pretty much every other product ever sold to them, had no reason to expect the 64 on its own to be close to useless, and were often baffled and frustrated when they realized they had bought an expensive doorstop. Many of the computers sold during that Christmas of 1983 were turned on only a few times, then consigned to the back of the closet or attic to gather dust. The bad taste they put in many people’s mouths would take years to go away. Meanwhile the more complete, useful machines, like the Apple IIc and the PCjr, were still more expensive than a complete Commodore 64 system — and the games on them weren’t as good to boot. Hackers and passionate gamers (or, perhaps more commonly, their generous parents) were willing to pay the price. Curious novices largely were not. Faced with no really good all-purpose options, many — most, actually — soon decided home computers just weren’t worth it. The real home-computer revolution, as it turned out, was still almost ten years away. About 15% of American homes had computers — at least ostensibly; many of them were, as just mentioned, buried in closets — by January 1, 1985, but that figure would rise with agonizing slowness for the rest of the decade. People could still live perfectly happy lives fully plugged into the cultural discourse around them and raise healthy, productive children in the process without owning a computer. Only much later, with the arrival of the World Wide Web and computers equipped with more intuitive graphical user interfaces for accessing it, would that change.

Which is not to say that the software and information industries that had exploded in and around the home-computer revolution during 1982 and 1983 died just like that. Many of their prominent members, however, did, as the financial gambles they had taken in anticipation of the home-computer revolution came back to haunt them. We’ve just seen how Sierra nearly went under during this period. Muse Software and Scott Adams’s Adventure International, to name two other old friends from this blog, weren’t so lucky; both folded in 1985. Electronic Arts survived, but steered their rhetoric and choice of titles somewhat away from Trip Hawkins’s original vision of “consumer software” and toward titles aimed more at the hardcore, in proven genres like the CRPG and the adventure game.

Magazines were even harder hit. By early 1984 there were more than 300 professionally published computing periodicals of one sort or another, many of them just founded during the boom of the previous year. Well over half of these died during 1984 and 1985. Mixed in with the dead Johnny-come-latelys were some cherished veteran voices, among them pioneers Creative Computing (1974), SoftSide (1978), and Softalk (1980). The latter’s demise, after exactly four years and 48 issues of sometimes superb people-focused journalism, came as a particular blow to the Apple II community; Apple historian Steven Weyhrich names this moment as nothing less than the end of the “golden age” of the Apple II. Those magazines that survived often did so in dramatically shrunken form. Compute!, for instance, went from 392 pages in December of 1983 to 160 ten months later.

Yet it wasn’t all doom and gloom. Paradoxically, some software publishers still did quite well. Infocom, for example, had the best single year in their history in 1984 in terms of unit sales, selling almost 750,000 games. It seemed that, with more options than ever before, software buyers were becoming much more discerning. Those publishers, like Infocom, who could offer them fresh, quality products showing a distinctive sensibility could do very well. Those who could not, like Adventure International with their tired old two-word parsers and simplistic engines, suffered the consequences. That real or implied asterisk (“Plays great games too!”) at the end of the advertising copy remained the great guilty secret of what was left of the home-computer industry, the real reason computers were in homes at all. Thankfully, the best games were getting ever more complex and compelling; otherwise the industry might have been in even more trouble than it actually was.

Indeed, with a staggering number of machines already out there and heaps still to be sold for years to come, the golden age for Commodore 64 users was just beginning. This year of chaos and uncertainty was the year that the 64 really came into its own as a games machine, as programmers came to understand how to make it sing. Companies that found these keyboard maestros would be able to make millions from them. The home-computer revolution may not have quite panned out as anticipated and the parent company may have looked increasingly clueless, but for gamers the Commodore 64 stood alone with its combination of audiovisual capability, a large and ever-growing catalog of games, and a low price. What with game consoles effectively dead in the wake of Atari’s crash and burn, all the action was right here.

In that spirit, we’ll look next time at the strange transformation that led one of our stodgiest old friends from earlier articles to become the hip purveyor of some of the slickest games that would ever grace the 64.

(The indispensable resources on Commodore’s history remain Brian Bagnall’s On the Edge and its revised edition, Commodore: A Company on the Edge. Frank Rose’s West of Eden is the best chronicle I know of this period of Apple’s history. The editorial pages and columnists in Compute! and Compute!’s Gazette provided a great unfolding account of a chaotic year in home computing as it happened. Particular props must go to Fred D’Ignazio for pointing out all of the problems with the standard rhetoric of the home-computer revolution in Compute!’s May 1984 issue — but he does lose points for naming the PCjr as the answer to all these woes in the next issue.)

 


Business is War

In the 64 Commodore had their potentially world-beating home computer. Now they needed to sell it. Fortunately, Jack Tramiel still had on hand Kit Spencer, the British mastermind behind the PET’s success in his home country and the VIC-20’s in the United States. The pitchman for the latter campaign, William Shatner, was no longer an option to help sell the 64. His contract had run out just as the new machine was released, and his asking price for another go-round had increased beyond what Tramiel was willing to pay in the wake of his hit television series T.J. Hooker and the movie Star Trek II: The Wrath of Khan. Spencer decided to forgo a pitchman entirely in favor of a more direct approach that would hammer on the competition while endlessly repeating those two all-important numbers: 64 K and less than $600. He queued up a major advertising blitz in both print and television for the 1982 Christmas season, the second and final time in their history that Commodore would mount such a concentrated, smart promotional effort.

Effective as it was, the campaign had none of the creativity or easy grace of the best advertising from Apple or IBM. The ads simply regurgitated those two critical numbers over and over in a somewhat numbing fashion, while comparing them with the memory size and price of one or more unfortunate competitors. Surprisingly, there was little mention of the unique graphics and sound capabilities that in the long run would define the Commodore 64 as a platform. It almost seems as if Commodore themselves did not entirely understand the capabilities of the chips that Al Charpentier and Bob Yannes had created for them. Still, Spencer showed no fear of punching above his weight. In addition to the 64’s obvious competitors in the low-end market, he happily went after much more expensive, more business-oriented machines like the Apple II and the IBM PC. Indeed, here those two critical numbers, at least when taken without any further context, favored the 64 even more markedly. The computer industry had never before seen advertising this nakedly aggressive, this determined to name names and call out the competition on their (alleged) failings. It would win Commodore few friends inside the industry. But Tramiel didn’t care; the ads were effective, and that was the important thing.

Commodore took their shots at the likes of Apple and IBM, but the real goal had become ownership of the rapidly emerging low-end — read, “home computer” — market. Tramiel’s competition there came from the game consoles and from the two other computer makers mounting a serious mass-market play for the same consumers: Atari and Texas Instruments. For the lower end of the low-end (if you will) Commodore had the VIC-20; for the higher end, the 64.

Atari 5200

Atari’s big new product for Christmas 1982 was the 5200, a new console based on the same chipset as their computer designs. (Those chips had originally been designed for a successor to the VCS, but were rerouted into full-fledged computers when sales of the current VCS just kept increasing. Thus the process finally came full circle, albeit three years later than expected.) The 5200 was something of a stopgap, a rather panicked product from a company whose management had long since lost interest in engineering innovations. It actually marked Atari’s first major new hardware release in three years. Research and development, you see, had shrunk to virtually nil under the stewardship of CEO Ray Kassar, a former titan of the textile industry who held videogames and his customers in something perilously close to contempt. Despite being based on the same hardware, the 5200 was inexplicably incompatible with cartridges for the existing Atari home computers. Those games that were available at launch were underwhelming, and the 5200 was a major disappointment. Only the VCS — now retroactively renamed the 2600 to account for the new 5200 — continued to sell in good if steadily shrinking quantities. Aside from the 2600 and 5200, Atari had only its two three-year-old computers, the chintzy, little-loved 400 and the impressive but also more expensive 800 with only 48 K of memory. With the latter selling for upwards of $600 and both machines halfheartedly (at best) promoted, the big battles of the conflict that the press would soon dub the “Home Computer Wars” would be fought between TI and Commodore. It would be a disappointing Christmas for Atari, and one which foretold bigger problems soon to come.

Put more personally — and very personal it would quickly become — the Home Computer Wars would be fought between Jack Tramiel and the youthful head of TI’s consumer-products division, William J. Turner. The opening salvo was unleashed shortly before the 64’s introduction by, surprisingly, TI rather than Commodore. At that time the TI-99/4A was selling for about $300, the VIC-20 for $240 to $250. In a move they would eventually come to regret, TI suddenly announced a $100 rebate on the TI-99/4A, bringing the final price of the machine to considerably less than that of the inferior VIC-20. With TI having provided him his Pearl Harbor, Jack Tramiel went to war. On the very same day that Turner had opened hostilities, Tramiel slashed the wholesale price of the VIC-20, bringing the typical retail price down into the neighborhood of $175. Despite this move, consumers chose the TI-99/4A by about a three-to-one margin that Christmas, obviously judging its superior hardware worth an extra $25 and the delayed gratification of waiting for a rebate check. Some fun advertising featuring Bill Cosby didn’t hurt a bit either, while Commodore’s own contract with William Shatner was now history, leaving little advertising presence for the VIC-20 to complement the big push Spencer was making with the 64. TI sold more than half a million computers in just a few months. Round One: TI.

Of course, the 64 also did very well, although at almost $600 it sold in nowhere near the quantities it eventually would. In those days, computers were sold through two channels. One was the network of dedicated dealers who had helped to build the industry from the beginning, a group that included chains like Computerland and MicroAge as well as plenty of independent shops. A more recent outlet was the so-called mass merchandisers — discounters like K-Mart and Toys ’R’ Us that lived by stacking ’em deep and selling ’em cheap, with none of the knowledge and support to be found at the dealers. Commodore and TI had been the first to begin selling their computers through mass merchandisers. Here Tramiel and Turner shared the same vision, seeing these low-end computers as consumer electronics rather than tools for hobbyists or businessmen — a marked departure from the attitude of, say, Apple. It really wasn’t possible for a computer to be successful in both distribution models. As soon as it was released to the merchandisers, the game was up for the dealers, as customers would happily come to them to get all of their questions answered, then go make the actual purchase at the big, splashy store around the corner. Commodore’s dealers had had a hard time of it for years, suffering through the limited success of the PET line in the American market only to see Commodore pass its first major sales success there, the VIC-20, to the mass merchandisers. They were understandably desperate to have the 64. Cheap as it was for its capabilities, it still represented much more of an investment than the VIC-20. Surely buyers would want to take advantage of the expertise of a real dealer. Tramiel agreed, or at least claimed to. But then, just as the Christmas season was closing, he suddenly started shipping the 64 to the mass merchandisers as well. Dealers wondering what had happened were left with only the parable of the scorpion and the frog for solace. What could Jack say? It was just his nature. By the following spring the street price of a Commodore 64 had dropped below $400, and it could be found on the shelves of every K-Mart and Toys ’R’ Us in the country.

With the Commodore 64 joining the VIC-20 in the trenches, Christmas 1982 was looking like only the opening skirmish. 1983 was the year when the Home Computer Wars would peak. This was also the year of the Great Videogame Crash, when the market for Atari 2600 hardware and software went into free fall. In one year’s time Atari went from being the darling of Wall Street to a potentially deadly anchor — hemorrhaging millions of dollars and complete with a disgraced CEO under investigation for insider trading — for a Warner Communications that was suddenly desperate to get rid of it before it pulled the whole corporation down. Just as some had been predicting the previous year, home computers moved in to fill some of the vacuum left by the 2600’s sudden collapse.

Atari 1200XL

In a desperate attempt to field a counterargument to the 64, Atari rushed into production early in 1983 their first new computer since introducing the 400 and 800 more than three years before. Thanks to a bank-switching scheme similar to that of the 64, the Atari 1200XL matched that machine’s 64 K of memory. Unfortunately, it was in almost every other respect a disaster. Atari made the 1200XL a “closed box” design, with none of the expansion possibilities that had made the 800 a favorite of hackers. They used new video hardware that was supposed to be better than the old, but instead yielded a fuzzier display on most monitors and televisions. Worst of all, the changes made to accommodate the extra memory made the new machine incompatible with a whole swathe of software written for the older machines, including many of the games that drove home-computer sales. An apocryphal story has sales of the Atari 800 dramatically increasing in the wake of the 1200XL’s release, as potential buyers who had been sitting on the fence rushed to buy the older machine out of fear it would soon be cancelled and leave them no option but the white elephant that was the 1200XL.

Whatever the truth of such stories, sales for the Atari computer line as a whole continued to lag far behind those of Commodore and TI, and far behind what would be needed to keep Atari a viable concern in this new world order. Huge as Atari (briefly) was, they had no chip-making facilities of their own. Instead, their products were full of chips from MOS, among others. Not only were both their console and computer lines built around the 6502, but MOS manufactured many of the game cartridges for the 2600 and 5200. Thus even when Commodore lost by seeing a potential customer choose an Atari over one of their own machines, they still won in the sense that the Atari machine was built using some of their chips — chips for which Atari had to pay them.

Atari would largely be collateral damage in the Home Computer Wars. As I remarked before, however, it was personal between Tramiel and TI. You may remember that almost ten years before these events Commodore had been a thriving maker of calculators and digital watches. TI, along with several Japanese companies, had entered those markets with devices built entirely from their own chips, which allowed them to dramatically undercut Commodore’s prices and very nearly force them out of business. Only the acquisition of MOS Technologies and the PET had saved Commodore. Now Tramiel, who never forgot a slight, much less a full-on assault, could smell payback. Thanks to MOS, Commodore were now also able to make for themselves virtually all of the chips found in the VIC-20 and the 64, with the exception only of the memory chips. TI’s recent actions would seem to indicate that they thought they could drive Commodore out of the computer market just as they had driven them out of the watch and calculator markets. But this time, with both companies almost fully vertically integrated, things would be different. Bill Turner’s colossal mistake was to build his promotional campaign for the TI-99/4A entirely around price, failing to note that it was not just much cheaper than the 64 but also much more capable than the VIC-20. As it was, no matter how low Turner went, Tramiel could always go lower, because the VIC-20 was a much simpler, cheaper design to manufacture. If the Home Computer Wars were going to be all about the price tag, Turner was destined to lose.

The TI-99/4A had another huge weakness, one ironically connected with what TI touted as its biggest strength outside of its price: its reliance on “Solid State Software,” or cartridges. Producing cartridges for sale required vastly more resources than did distributing software on cassettes or floppy disks, and at any rate TI was determined to strangle any nascent independent software market for their machine in favor of cornering this lucrative revenue stream for their own line of cartridges. They closely guarded the secrets of the machine’s design, and threatened any third-party developers who managed to write something for the platform with lawsuits if they failed to go through TI’s own licensing program. Those who entered said program would be rewarded with a handsome 10 percent of their software’s profits. Thus the TI-99/4A lacked the variety of software — by which I mainly mean games, the guilty pleasure that really drove the home-computer market — that existed for the VIC-20 and, soon, the 64. Although this wasn’t such an obvious concern for ordinary consumers, the TI-99/4A was also largely bereft of the do-it-yourself hacker spirit that marked most of the early computing platforms. (Radio Shack was already paying similarly dearly for policies on their TRS-80 line that were nowhere near as draconian as those of TI.) This meant far less innovation, far less interesting stuff to do with the TI-99/4A.

Early in 1983, Commodore slashed the wholesale price of the VIC-20 yet again; soon it was available for $139 at K-Mart. TI’s cuts in response brought the street price of the TI-99/4A down to about $150. But now TI found to their horror that the tables were turned: they sat at the break-even point, yet Commodore was able to cut the price of the VIC-20 yet further, while also pummeling them from above with the powerful 64, whose price was plunging even more quickly than that of the VIC-20. TI was reduced to using the TI-99/4A as a loss leader. They would just break even on the computer, but would hopefully make their profits on the cartridges they also sold for it. That can be a good strategy in the right situation; for instance, in our own time it’s helped Amazon remake the face of publishing in a matter of a few years with their Kindle e-readers. But it’s dependent on having stuff that people want to buy from you after you sell them the loss leader. TI did not; the software they had to sell was mostly unimpressive in both quality and variety compared to that available for the developer-friendly Commodore machines. And the price of those Commodore machines just kept dropping, putting TI deeper and deeper into a hole as they struggled to match each new cut. Soon just breaking even on each TI-99/4A was only a beautiful memory.

By September the price of a 64 at a big-box discount store was less than $200, the VIC-20 about $80. Bill Turner had already been let go in disgrace. Now a desperate TI was selling the TI-99/4A at far below their own cost to make it, even as Commodore was continuing to make a modest profit on every unit sold thanks to continuous efforts to reduce production costs. At last, on October 28, 1983, TI announced that it was pulling out of the PC market altogether, having lost a stunning half a billion dollars on the venture thus far in 1983 and gutted their share price in the process. The TI-99/4A had gone from world beater to fiasco in barely nine months; Turner from visionary to scapegoat in less. As a parting shot, TI dumped the rest of their huge unsold inventory of TI-99/4As onto the market, where at a street price of $50 or less they managed to cause a final bit of chaos for everyone left competing in the space.

But this kamikaze measure was the worst damage they could still do. Jack Tramiel had his revenge. He had beaten Bill Turner, paid him back with interest for 1982. More importantly, he had beaten his old nemesis TI, delivering an embarrassment and a financial ache from which it would take them a long time to recover. With the battlefield all but cleared, 1983 turned into the first of two Christmases of the Commodore 64. By year’s end sales were ticking merrily past the 2-million-unit mark; they would get even stronger during 1984. Even with all the discounting, North American sales revenue on Commodore’s hardware for 1983 more than doubled from that of 1982. A few non-contenders like the Coleco Adam and second-stringers like Atari’s persistent computer line aside, the Home Computer Wars were over. When their MOS chip-making division and their worldwide sales were taken into account, Commodore was now bigger than Apple, bigger than anyone left standing in the PC market with the exception only of IBM and Radio Shack, both of whose PC divisions accounted for only a small part of their total revenue. The 64 had also surpassed the Apple II as simply the computer to own if you really liked games, filling the gap left by the imploded Atari VCS market and, increasingly as its price dropped, taking over the low-end home-computer market previously owned by the VIC-20 and TI-99/4A. Thanks to the Commodore 64, computer games were going big time. Love the platform and its parent company or hate them (and plenty did the latter, not least due to Tramiel’s instinct for the double cross that showed itself in more anecdotes than I can possibly relate on this blog), everybody in entertainment software had to reckon with them. Thanks largely to Commodore and TI’s price war, computer use exploded in the United States between 1982 and 1984. In late 1982, Compute!, a magazine pitched to the ordinary consumer with a low-cost home computer, had a monthly circulation of 100,000. Eighteen months later it was over 500,000. The idea of 500,000 people who not only owned PCs but were serious enough about them to buy a magazine dedicated to the subject would have sounded absurd at the time that the Commodore 64 was launched. And Compute! was just one piece of an exploding ecosystem.

Yet even at this, the supreme pinnacle of Tramiel’s long career in business, there was a whiff of the Pyrrhic in the air as the battlefield cleared. The 64 had barely made it out the door before five of its six principal engineers, the men who had put together such a brilliant little machine on such a shoestring, left Commodore. Among them were both Al Charpentier, designer of its VIC-II graphics chip, and Bob Yannes, designer of its SID sound chip. The problems had begun when Tramiel refused to pay the team the bonuses they had expected upon completing the 64; his justification was that putting the machine together had taken them six months rather than the requested three. They got worse when Tramiel refused to let them start working on a higher-end follow-up to the 64 that would offer 80-column text, a better disk system and a better BASIC, and could directly challenge the likes of the Apple II and IBM PC. And they reached a breaking point when Tramiel decided not to give them pay raises when review time came, even though some of the junior engineers, like Yannes, were barely making a subsistence living.

The five engineers left to start a company of their own. For a first project, they contracted with Atari to produce My First Computer, a product which would, via a membrane keyboard and a BASIC implementation on cartridge, turn the aged VCS into a real, if extremely limited, computer for children to learn with. Tramiel, who wielded lawyers like cudgels and seemed to regard his employees as indentured servants at best, buried the fledgling start-up in lawsuits. By the time they managed to dig themselves out, the VCS was a distant memory. Perhaps for the best in the long run: three of the engineers, including Charpentier and Yannes, formed Ensoniq to pursue Yannes’s love of electronic music. They established a stellar reputation for their synthesizers and samplers and eventually for a line of sound cards for computers which were for years the choice of the discriminating audiophile. Commodore, meanwhile, was left wondering just who was going to craft the follow-up to the 64, just as they had wondered how they would replace Chuck Peddle after Tramiel drove him away in a similar hail of legal action.

Tramiel also inexplicably soured on Kit Spencer, mastermind of both the VIC-20’s and the 64’s public roll-outs, although he only sidelined him into all but meaningless bureaucratic roles rather than firing and/or suing him. Commodore’s advertising would never again be remotely as effective as it had been during the Spencer era. And in a move that attracted little notice at the time, Tramiel cut ties with Commodore’s few remaining dealers in late 1983. From now on the company would live or die with the mass merchandisers. For better or worse, Commodore was, at least in North America, now every bit a mass-market consumer-electronics company. The name “Commodore Business Machines” was truly a misnomer now, as the remnants of the business-oriented line that had begun with the original PET were left to languish and die. In later years, when they tried to build a proper support network for a more expensive machine called the Amiga, their actions of 1982 and 1983 would come back to haunt them. Few dealers would have any desire to get in bed with them again.

In January of 1984 things would get even stranger for this company that never could seem to win for long before a sort of institutionalized entropy pulled them sideways again. But we’ll save that story for later. Next time we’ll look at what Apple was doing in the midst of all this chaos.

(I highly recommend Joseph Nocera’s article in the April 1984 Texas Monthly for a look at the Home Computer Wars from the losers’ perspective.)

 