
My Eamon Problem

Fair warning — this post is going to be a bit meta. It has two purposes. The first is easily dispensed with: to tell you that I’ve revised my earlier posts on the history of Eamon to reflect what I believe to be a more supportable chronology which does not have the system appearing until late 1979. The rest of what follows describes briefly how I came to my conclusions. This is all rather inside baseball, but those of you thinking of growing up to become digital antiquarians yourselves might be interested in this slice of my poor detail-obsessed life.

Traditional histories have given Eamon a release date of 1980, presumably because the first published article about the system, a piece written by Don Brown himself for Recreational Computing, dates from the summer of that year. I initially saw no reason to doubt the traditional chronology. But then I made contact with John Nelson, founder of the National Eamon Users Club. He dropped a bomb on me by saying he had first played Eamon in 1978, and that at that time there were already four additional scenarios available. As the guy who probably did more for Eamon than anyone else, including its creator, Nelson was a hard fellow to doubt. So I wrote those posts based largely on his chronology, even though I never could manage to feel really confident in it. Ever since, those posts have remained the ones I’m least happy about. My dissatisfaction was such that I recently started rummaging through all of the early Eamon disks again, looking for something that would let me pin a definite date onto at least one of them, and thereby begin to build a chronology. As it happened, I found what I was looking for, and that in turn prompted me to revise the earlier articles and write this post. Before I tell you what I found, however, let me first state some of the misgivings that sent me looking in the first place.

The Apple II actually had two versions of the BASIC language. The original machine had in its ROM a very stripped-down version of the language, one that had been put together quickly by Steve Wozniak himself. This version was soon dubbed “Integer BASIC” because it had no support for floating-point (i.e., decimal) numbers, only integers. Because floating-point numbers are very important to certain types of applications, Apple quickly realized the need for a better, more complete implementation of BASIC. They bought one from Microsoft and spent considerable effort customizing it for the Apple II. They dubbed it Applesoft BASIC upon its release in January of 1978. Applesoft was initially not widely used, however, both because its earliest incarnation was quite buggy and because it was housed on tape or disk rather than in ROM, meaning the user had to load it into RAM to use it. With most machines still equipped with only 16 K of memory in these early days, Applesoft, which consumed 10 K by itself, was impractical for most users. It only really caught on from May of 1979, when Apple began shipping the II Plus with Applesoft in ROM; to run an Integer BASIC program on the II Plus, one had to load that language in from disk.

Yet Eamon is written in Applesoft BASIC. And there’s something else: the standard Eamon needs pretty much all of a 48 K Apple II’s memory. (The master disk did originally contain a special, stripped-down version of the program for 32 K machines.) It’s doubtful that it would even be possible to load Applesoft from disk and still have room for Eamon. Even if it were, a 48 K machine would have been an unusually powerful one for 1978. After the 48 K Apple II Plus began shipping, however, the larger memory quite quickly became an expected standard.

And there’s the text-adventure chronology problem. Scott Adams first released Adventureland and Pirate Adventure during the second half of 1978 for the TRS-80. These games did not appear on the Apple II until early the following year, where they represented the first text adventures available for that platform. To have developed Eamon in 1978, Brown would have had to either: 1) be aware enough of the TRS-80 world that he played Adams’s games and decided to implement a similar parser-based interface on the Apple II; 2) have played Crowther and Woods’s Adventure or one of the other games it spawned on a big institutional computer; or 3) have come up with the concept of the text-adventure interface on his own, from scratch. None of these is impossible, but none seems hugely likely either. Depending on when in 1978 Eamon was released, an early Eamon even creates the somewhat earthshaking possibility that it may have been Brown, not Scott Adams, who first brought the text adventure to the microcomputer. Again, this just doesn’t feel right to me.

And then there’s that Recreational Computing article itself. In it Brown writes, “I know of five additional adventure diskettes.” Nelson, on the other hand, believes that “about 20” adventures were available by 1980. He suggested to me that Brown was perhaps referring to adventures that he himself had not written, but it’s very hard for me to read this sense into the paragraph in question. Nelson’s other suggestion, that the article had just lain on the shelf for many months before being printed, seems equally a stretch. If everything else pointed to an earlier chronology, I could accept such reasoning, but in combination with the other questions it becomes a good deal harder.

And then I found what I was looking for. Eamon #3, The Cave of the Mind, was the first not to be written by Brown himself, being from Jim Jacobson and Red Varnum. At the beginning of one of its programs is a REM statement with an actual date: January 30, 1980. This was enough to tip me back over to something much closer to the traditional chronology, with Brown developing the system in the latter half of 1979 in the wake of the Apple II Plus’s release. Sure, it’s possible that the date in the code of Cave represents a revision date rather than a date of completion or release, even though it doesn’t say this. But weighed together with all the other evidence, I feel pretty confident a later date for Eamon is more likely than an earlier one.

None of this is meant to criticize John Nelson, who generously shared his memories with me. It’s just that 30 years is a long time. It’s also possible that Nelson might have played an earlier proto-Eamon, presumably written in Integer BASIC for an Apple II with much less memory, which Brown expanded at a later date into the Eamon we know today. Yet unless some real documentary evidence surfaces, or Brown suddenly starts talking, that remains only speculation.

So, the current Eamon articles still represent something of a best guess, and as such I’m still not entirely happy with them. But I think it’s a better guess than the one I made the first time around. Barring more new data, that will have to do.

 
 


Castle Wolfenstein

One night circa early 1981, Silas Warner of Muse Software dropped by a local 7-Eleven store, where he saw an arcade game called Berzerk.

Berzerk essentially played like an interactive version of the programming game Warner had just finished writing on the Apple II, Robot War. The player controlled a “humanoid” who looked more than a little like a robot himself, battling an array of other robots each equipped with their own armaments and personalities. But most impressively, Berzerk talked. The enemy robots shouted out science-fiction cliches like “Intruder alert!” and, Dalek style, single-word imperatives like “Attack!,” “Kill!,” and “Destroy!” Warner was entranced, especially considering that one of Muse’s flagship products was Warner’s own The Voice, an Apple II voice-synthesis system. Still, he’d had enough of robots for a while.

Then one night the old World War II flick The Guns of Navarone came on the television. The most successful film of 1961, it’s the story of a tiny group of Allied commandos who make their way across a (fictional) Greek island to destroy a vital German gun installation. Like most films of its ilk, it can be good escapist fun if you’re in the right frame of mind, even if most of its plot is forehead-slappingly silly. After seeing Navarone, Warner started thinking about whether it might be possible to replace robots with Nazis. One nice thing about filmic Nazis, after all, is that they tend to be as aggressively stupid as videogame robots, marching blithely into trap after ambush after deception while periodically shouting out “Achtung!,” “Jawohl!,” and “Sieg Heil!” in lieu of Berzerk‘s “Attack!,” “Kill!,” and “Destroy!” (One imagines that the Greeks in the movie, when not engaging in ethnically appropriate song and dance or seducing our heroes with their dewy-eyed, heroic-resistance-fighter gazes, must be wondering just how the hell they managed to get themselves conquered by this bunch of clowns.) Other elements of the movie also held potential. The heroes spend much of the latter half disguised in German uniforms, sneaking about until someone figures out the ruse and the killing has to start again. What a game mechanic!

So, from the odd couple of Berzerk and The Guns of Navarone was born Castle Wolfenstein.

Given Wolfenstein‘s position in the history of ludic narrative, it’s appropriate that it should have resulted from the pairing of an arcade game with a work of fiction. Wolfenstein was the first game to unify the two strands of computer gaming I described in my previous post, combining a real story and fictional context with action mechanics best carried out with a joystick or set of paddles. Yet this gameplay also demanded considerable thought, even strategizing, for success. In the console world, Warren Robinett had attempted a similar fusion a couple of years earlier with the Atari VCS game Adventure, which was directly inspired by Crowther and Woods’s game of the same name. Still, the VCS was horribly ill-suited to the endeavor. Because it couldn’t display text at all, Adventure couldn’t set the scene like Wolfenstein did when the player first started a game. The following is mouthed by a dying cellmate in the castle/fortress in which you are being held prisoner:

“WELCOME TO CASTLE WOLFENSTEIN, MATE! THE NAZIS BROUGHT YOU HERE TO GET INFORMATION OUT OF YOU BEFORE THEY KILL YOU. THAT’S WHAT THIS PLACE IS FOR – IF YOU LISTEN YOU CAN HEAR THE SCREAMS. THEY’VE ALREADY WORKED ME OVER AND I’LL NEVER GET OUT ALIVE, BUT MAYBE YOU CAN WITH THIS GUN. I GOT IT OFF A DEAD GUARD BEFORE THEY CAUGHT ME. IT’S STANDARD ISSUE – EACH CLIP HOLDS 10 BULLETS, AND IT’S FULLY LOADED.

“BE CAREFUL, MATE, BECAUSE EVERY ROOM IN THE CASTLE IS GUARDED. THE REGULAR GUARDS CAN’T LEAVE THEIR POSTS WITHOUT ORDERS, BUT WATCH OUT FOR THE SS STORMTROOPERS. THEY’RE THE ONES IN THE BULLETPROOF VESTS AND THEY’RE LIKE BLOODY HOUNDS. ONCE THEY’VE PICKED UP YOUR TRAIL THEY WON’T STOP CHASING YOU UNTIL YOU KILL THEM AND YOU ALMOST NEED A GRENADE TO DO THAT.

“CASTLE WOLFENSTEIN IS FULL OF SUPPLIES TOO. I KNOW ONE CHAP WHO FOUND A WHOLE GERMAN UNIFORM AND ALMOST SNEAKED OUT PAST THE GUARDS. HE MIGHT HAVE MADE IT IF HE HADN’T SHOT SOME POOR SOD AND GOT THE SS ON HIS TRAIL. IF YOU CAN’T UNLOCK A SUPPLY CHEST, TRY SHOOTING IT OPEN. NOW I WOULDN’T GO SHOOTING AT CHESTS FULL OF EXPLOSIVES…

“ONE MORE THING. THE BATTLE PLANS FOR OPERATION RHEINGOLD ARE HIDDEN SOMEWHERE IN THE CASTLE. I’M SURE YOU KNOW WHAT IT WOULD MEAN TO THE ALLIED HIGH COMMAND IF WE COULD GET OUR HANDS ON THOSE…

“THEY’RE COMING FOR ME! GOOD LUCK!

“AIIIIEEEEEEE….”

Once into the game proper the text dries up, but there are still elements that make it feel like some facsimile of a real situation rather than an exercise in abstract arcade mechanics. The “verbs” available to the player are very limited in comparison to, say, even an old-school text adventure: move, aim, shoot, search a surrendered soldier or corpse, open a door or chest, throw a grenade, use a special item, take inventory. Yet the game’s commitment to simulation is such that this limited suite of actions yields a surprising impression of verisimilitude. One can, for example, use a grenade to blow up guards, but one can also use it to blast holes in walls. Such possibilities make the game a tour de force of early virtual worldbuilding; arguably no one had created a simulated world so believable on such a granular level prior to Wolfenstein.

There is even some scope for moral choice. If you catch them by surprise, guards will sometimes lift their arms in surrender, at which point you are free to kill them or leave them alive, as you will. Similarly, the game allows different approaches to its central problem of escape. One can attempt to methodically dispatch every single guard in every single room, but one can also try to dodge past them or outrun them, only killing as a last resort. Or one can find a uniform, and (in the game’s most obvious homage to The Guns of Navarone) try to just walk right out the front door that way. These qualities have led many to call Wolfenstein the first ancestor of the much later genre of stealth-based games like Metal Gear Solid and Thief. I don’t know as much about such games as I probably ought to, but I see no reason to disagree. The one limiting factor on the “sneaking” strategy is the need to find those battle plans in order to achieve full marks. To do that you have to search the various chests you come across, something which arouses the guards’ suspicion. (These may be videogame Nazis, but they aren’t, alas, quite that stupid.)

In order to make the game a replayable exercise (shades of the arcade again), the castle is randomly stocked with guards and supplies each time the player begins a new game. In addition, play progresses through a series of levels. The first time you play you are a private, and things are appropriately easier — although, it should be noted, never easy; Wolfenstein is, at least for me, a punishingly difficult game. Each time you beat the game on a given level, you increase in rank by one, and everything gets more difficult the next time around. The ultimate achievement is to become a field marshal.

In Warner’s own words, he threw “everything” Muse had on their shelf of technical goodies into Wolfenstein. For instance, we once more see here the high-res character generator Warner had also used in Robot War.

But most impressive was the inclusion of actual speech, a first for a computer game. To really appreciate how remarkable this was, you first have to understand how extraordinarily primitive the Apple II’s sound hardware actually was. The machine contained no sound synthesizer or waveform generator. A program could make sound only by directly toggling current to the speaker itself. Each time it did this, the result was an audible click. Click the speaker at the appropriate frequency, and you could create various beeps and boops, but nothing approaching the subtlety of human speech — or so went the conventional wisdom. The story of Wolfenstein‘s talking Nazis begins back in 1978, when a programmer named Bob Bishop released a pair of programs called Apple-Lis’ner and Appletalker.
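Before getting to Bishop’s trick, it may help to see that click mechanism in code form. Below is a minimal illustrative sketch, written in modern Python rather than the 6502 machine language a real Apple II program would have used, with an arbitrary sample rate; the only point it makes is that evenly spaced speaker flips add up to an audible square-wave tone, which is exactly how the machine’s beeps and boops were produced.

SAMPLE_RATE = 22050   # samples per second in this simulation (arbitrary choice)

def square_wave(frequency_hz, duration_s):
    """Simulate clicking the speaker at a steady rate: each click flips the cone
    between its two positions, and flips spaced evenly in time are heard as a
    square-wave tone at frequency_hz."""
    samples = []
    speaker_out = False                                  # current cone position
    samples_per_half_cycle = SAMPLE_RATE / (2 * frequency_hz)
    next_flip = samples_per_half_cycle
    for i in range(int(SAMPLE_RATE * duration_s)):
        if i >= next_flip:                               # time for the next click
            speaker_out = not speaker_out
            next_flip += samples_per_half_cycle
        samples.append(1.0 if speaker_out else -1.0)
    return samples

beep = square_wave(440, 0.25)   # a quarter-second beep at 440 Hz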

Every Apple II shipped with a port that allowed a user to connect to it a standard cassette drive for storage, as well as the internal hardware to convert binary data into sound for recording and vice versa. Indeed, cassettes were the most common storage medium for the first few years of the Apple II’s life. Bishop realized that, thanks to the cassette port, every Apple II effectively contained a built-in audio digitizer, a way of converting sound data into binary data. If he attached a microphone to the cassette port, he should be able to “record” his own speech and store it on the computer. He devised a simplistic 1-bit sampling algorithm: for every sample at which the level of the incoming sound was above a certain threshold, click the speaker once. The result, as played back through Appletalker, was highly distorted but often intelligible speech. Warner refined Bishop’s innovations in 1980 in The Voice. It shipped with a library of pre-sampled phonemes, allowing the user to simply enter text at the keyboard and have the computer speak it — if the program properly deduced what phoneme belonged where, of course.
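To make Bishop’s scheme concrete, here is a similar illustrative sketch of the 1-bit idea, again in Python rather than anything resembling his actual code, and with an invented threshold and invented sample data: each incoming sample is reduced to a single bit by comparing it against a threshold, and playback simply clicks the speaker for every 1 bit, so the shifting density of clicks crudely traces the original waveform.

THRESHOLD = 0.1   # hypothetical level above which a sample counts as "loud"

def digitize_1bit(samples):
    """Record: keep one bit per sample, noting only whether the level coming in
    through the cassette port was above the threshold."""
    return [1 if s > THRESHOLD else 0 for s in samples]

def playback(bits):
    """Play back: click the speaker for every 1 bit, stay silent otherwise."""
    return ["CLICK" if bit else "...." for bit in bits]

# An invented snippet of "recorded speech," already scaled to the range -1.0..1.0.
recording = [0.02, 0.15, 0.30, 0.12, -0.05, 0.22, 0.01, -0.20]
bits = digitize_1bit(recording)    # -> [0, 1, 1, 1, 0, 1, 0, 0]
print(playback(bits))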

For Wolfenstein, Warner took advantage of an association that Muse had with a local recording studio, who processed Muse’s cassette software using equalizers and the like to create tapes that Muse claimed were more robust and reliable than those of the competition. Warner: “We went down there [to the studio] one fine day, and I spent several hours on the microphone saying, ‘Achtung!'” Given the primitive technology used to create them (not to mention Warner’s, um, unusual German diction), Wolfenstein‘s assorted shouts were often all but indecipherable. Rather than hurting, however, the distortion somehow added to the nightmare quality of the scenario as a whole, increasing the tension rather than the contrary.

Castle Wolfenstein

Warner’s magnum opus as a designer and programmer, Castle Wolfenstein remained Muse’s most successful product and reliable seller from its release in September of 1981 through Muse’s eventual dissolution, not only in its original Apple II incarnation but also in ports to the Atari 400 and 800, MS-DOS, and (most notably) the Commodore 64. Muse produced a belated sequel in 1984, Beyond Castle Wolfenstein, in which the player must break into Adolf Hitler’s underground bunker to assassinate the Führer himself rather than break out of a generic Nazi fortress. However, while Warner was involved in design discussions for that game, the actual implementation was done by others. The following year, Muse suddenly collapsed, done in by a string of avoidable mistakes in a scenario all too common for the early, hacker-led software publishers. Warner stayed in the games industry for another decade after Muse, but never found quite the creative freedom and that certain spark of something that had led to Robot War and Castle Wolfenstein in his banner year of 1981. He died at the age of 54 in 2004. Wolfenstein itself, of course, lived on when id Software released Wolfenstein 3D, the precursor to the landmark Doom, in 1992.

Whether we choose to call Castle Wolfenstein the first PC action adventure or the first stealth game or something else, its biggest importance for ludic narrative is its injection of narrative elements into a gameplay framework completely divorced from the text adventures and CRPGs that had previously represented the category on computers. As such it stands at the point of origin of a trend that would over years and decades snowball to enormous — some would say ridiculous — proportions. Today stories in games are absolutely everywhere, from big-budget FPSs to casual puzzlers. With its violence and cartoon-like Nazi villains, Wolfenstein is perhaps also a harbinger of how cheap and coarse so many of those stories would be. But then again, we can’t really blame Warner for that, can we?

If you’d like to try Silas Warner’s greatest legacy for yourself, you can download the Apple II disk image and manual from here.

Next time we have some odds and ends to clean up as we begin to wrap up 1981 at last.

 
 


This Game Is Over

Before the famous Videogame Crash of 1983 there was the Videogame Crash of 1976. By that year Atari’s Pong had been in arcades for four years, along with countless ball-bouncing variants: Handball, Hockey, Pin Pong, Dr. Pong, and of course Breakout. The public was already growing bored with all of them, as well as with the equally simplistic driving and shooting games that made up the rest of arcade fare. As videogame revenues declined, pinball, the form they were supposed to have superseded, started to make a comeback. Even Atari themselves started a pinball division, as manufacturers began applying some of the techniques they’d learned in videogames to a new generation of solid-state pinball tables that rewarded players with lots of sounds, flashing lights, and high-score leaderboards. When Atari introduced its VCS home-game console in October of 1977, sales were predictably sluggish. Then, exactly one year later, Space Invaders arrived.

Developed by the Japanese company Taito and manufactured and sold in North America under license by Midway, Space Invaders had the perfect theme for a generation of kids entranced with Star Wars and Close Encounters. Its constant, frenetic action and, yes, the violence of its scenario also made it stand out markedly from comparatively placid games like Pong and Breakout. Space Invaders became the exemplar of videogames in general, the first game the general public thought of when one mentioned the form. With coin-operated arcade games suddenly experiencing a dramatic revival, sales of the Atari VCS also began to steadily increase. Thanks to a very good holiday season, sales for 1979 hit 1 million.

However, the real tipping point that would eventually result in Atari VCSs in more than 15% of U.S. homes came when Manny Gerard and Ray Kassar, Atari’s vice president and president respectively, negotiated a deal with their ostensible rivals Taito and Midway to make a version of Space Invaders for the VCS. Kassar is known today as the man who stifled innovation at Atari and mistreated his programmers so badly that the best of them decided to form their own company, Activision. Still, his marketing instinct at this moment was perfect. Kassar predicted that Space Invaders would not only be a huge hit with the VCS’s existing owners, but that it would actually sell consoles to people who wanted to play their arcade favorite at home. He was proven exactly right upon the VCS Space Invaders‘s release in January of 1980. The VCS, dragged along in the wake of the game, doubled its sales in 1980, to 2 million units.

Atari took the lesson of Space Invaders to heart. Instead of investing energy into original games with innocuously descriptive titles like Basketball, Combat, and Air Sea Battle, as they had done for the first few years of the VCS, they now concentrated on licensing all of the big arcade hits. Atari had learned an important lesson: that the quantity and quality of available software is more important to a platform than the technical specifications of the platform itself. This fact would allow the Atari VCS to dominate the console field for years despite being absurdly primitive in comparison to competition like the Intellivision and the Vectrex.

Apple was learning a similar lesson at this time in the wake of the fortuitous decision that Dan Bricklin and Bob Frankston made to first implement VisiCalc on the Apple II. Indeed, one could argue that the survivors from the early PC industry — companies like Apple and, most notably, Microsoft — were the ones that got the supreme importance of software, while those who didn’t — companies like Commodore, Radio Shack’s computer division, and eventually Atari itself — were the ones ultimately destined for the proverbial dustbin of history. Software like VisiCalc provided an answer to the question that had been tripping up computer hobbyists for years when issued from the mouths of wives, girlfriends, and parents: “But what can you really do with it?” A computer that didn’t have a good base of software, no matter how impressive its hardware, wasn’t much use to the vast majority of the public who weren’t interested in writing their own programs.

With all this in mind, let’s talk about computer games (as opposed to console games) again. We can divide entertainment software in these early years into two broad categories, only one of which I’ve so far concerned myself with in this blog. I’ve been writing about the cerebral branch of computer gaming, slow-paced works inspired by the tabletop-gaming and fiction traditions. These are the purest of computer games, in that they existed only on PCs and, indeed, would have been impossible on the game consoles of their day. They depend on a relatively large memory to hold their relatively sophisticated world models (and, increasingly, disk storage to increase the scope of possibility thanks to virtual memory); a keyboard to provide a wide range of input possibilities; and the ability to display text easily on the screen to communicate in relatively nuanced ways with their players.

The other category consists of arcade-style gameplay brought onto the PC. With the exception of the Atari 400 and 800, none of the earliest PCs were terribly suited to this style of game, lacking sprites and other fast-animation technologies and often even appropriate game controllers. Yet with the arcade craze in full bloom, these games became very, very popular. Even the Commodore PET, which lacked any bitmapped graphics mode at all, had a version of Breakout implemented entirely in “text” using the machine’s extended ASCII character set.

On a machine like the Apple II, which did have bitmapped graphics, such games were even more popular. Nasir Gebelli and Bill Budge were the kings of the Apple II action game, and as such were known by virtually every Apple II hobbyist. Even Richard Garriott, programmer of a very different sort of game, was so excited upon receiving that first call from California Pacific about Akalabeth because CP was, as everyone knew, the home of Budge. If Computer Gaming World is to be believed, it was not Zork or Temple of Apshai or Wizardry that was the bestselling Apple II game of all time in mid-1982, but rather K-Razy Shootout, a clone of the arcade game Berzerk. They may have sold in minuscule numbers compared to their console counterparts and may not have always looked or played quite as nicely, but arcade-style games were a big deal on PCs right from the start. When the Commodore VIC-20 arrived, perched as it was in some tenuous place between PC and game console, the trend only accelerated.

You may have noticed a theme in my discussion of these games in this post and in a previous post: many of these games were, um, heavily inspired by popular coin-operated arcade games. In the earliest days, when the PC-software industry was truly minuscule and copyright still a foreign concept to many programmers, many aspired to make unabashed clones of the latest arcade hits, down to the name itself. By 1980, however, this approach was being replaced by something at least a little more subtle, in which programmers duplicated the gameplay but changed the title and (sometimes, to some extent) the presentation. It should be noted that not all PC action-game programmers were cloners; Gebelli and Budge, for instance, generally wrote original games, and perhaps therein lies much of their reputation. Still, clones were more the rule than the exception, and by 1981 the PC software industry had grown enough for Atari to start to notice — and to get pissed off about it. They took out full-page advertisements in many of the big computer magazines announcing “PIRACY: THIS GAME IS OVER.”

Some companies and individuals have copied Atari games in an attempt to reap undeserved profits from games that they did not develop. Atari must protect its investment so that we can continue to invest in new and better games. Accordingly, Atari gives warning to both the intentional pirate and to the individuals simply unaware of the copyright laws that Atari registers the audiovisual works associated with its games with the Library of Congress and considers its games proprietary. Atari will protect its rights by vigorously enforcing these copyrights and by taking the appropriate action against unauthorized entities who reproduce or adapt substantial copies of Atari games, regardless of what computer or other apparatus is used in their performance.

In referring to cloning as “piracy,” Atari is conflating two very separate issues, but they aren’t doing so thoughtlessly — there’s a legal strategy at work here.

Literally from the dawn of the PC era, when Bill Gates wrote his famous “Open Letter to Hobbyists,” software piracy was recognized by many in the industry as a major problem, a problem that some even claimed could kill the whole industry before it got properly started. Gates considered his letter necessary because the very concept of commercial software was a new thing, as new as the microcomputer itself. Previously, programs had been included with hardware and support contracts taken out with companies like IBM and DEC, or traded about freely amongst students, hackers, and scientists on the big machines. In fact, it wasn’t at all clear that software even could be copyrighted. The 1909 Copyright Act that was still in effect when Gates wrote his letter in January of 1976 states that to be copyrightable a work must be “fixed in a tangible medium of expression.” One interpretation of this requirement holds that an executable computer program, since it lives only electronically within the computer’s memory, fails the tangibility test. The Copyright Act of 1976, a major amendment, failed to really clarify the situation. Astonishingly, it was only with the Computer Software Copyright Act of 1980 that it was made unambiguously clear that software was copyrightable in the same way as books and movies and that, yes, all those pirates were actually doing something illegal as well as immoral.

But there was still some confusion about exactly what aspect of a computer program was copyrightable. When we’re talking about copyright on a book, we’re obviously concerned with the printed words on the page. When we’re talking about copyright on a film, we’re concerned with the images that the viewer sees unspooling on the screen and the sounds that accompany them. A computer program, however, has both of these aspects. There’s the “literary” side, the code to be run by the computer, which in many cases takes two forms, the source code written by the programmer and the binary code that the computer actually executes after the source has been fed through an assembler or compiler. And then there’s the “filmic” side, the images that the viewer sees on the screen before her and the sounds she hears. The 1980 law defines a computer program as a “set of statements or instructions to be used directly or indirectly in a computer in order to bring about a certain result.” Thus, it would seem to extend protection to source and executable code, but not to the end experience of the user.

Such protection was not quite enough for Atari. They therefore turned to a court case of 1980, Midway vs. Dirkschneider. Dirkschneider was a small company who essentially did in hardware what many PC programmers were doing in software, stamping out unauthorized clones of games from the big boys like Atari and Midway, then selling them to arcade operators at a substantial discount on the genuine article. When they started making their own version of Galaxian, one of Midway’s most popular games, under the name Galactic Invader, Midway sued them in a Nebraska court. The judge in that case ruled in favor of the plaintiff, on the basis of a new concept that quickly became known as the “ten-foot rule”: “If a reasonable person could not, at ten feet, tell the difference between two competitive products, then there was cause to believe an infringement was occurring.”

So, in conflating pirates who illegally copied and traded software with cloners who merely copied the ideas and appearance of others’ games, implementing them using entirely original code, Atari was attempting to dramatically expand the legal protections afforded to software. The advertisement is also, of course, a masterful piece of rhetoric meant to tar said cloners with the same brush of disrepute used for the pirates, who were criticized in countless hand-wringing editorials in the exact same magazines in which Atari’s advertisement appeared. All of this grandstanding moved out of the magazines and into the courts in late 1981, via the saga of Jawbreaker.

The big arcade hit of 1981 was Pac-Man. In fact, calling Pac-Man merely “big” is considerably underestimating the matter. The game was a full-fledged craze, dwarfing the popularity of even Space Invaders. Recent studies have shown Pac-Man to still be the most recognizable videogame character in the world, which by extension makes Pac-Man easily the most famous videogame ever created. Like Space Invaders, Pac-Man was an import from Japan, created there by Namco and distributed, again like Space Invaders, by Atari’s arch-rival of the standup-arcade world, Midway. Said rivalry did not, however, prevent the companies from working out a deal to get Pac-Man onto the Atari VCS. It was to be released just in time for Christmas 1981, and promised to be the huge VCS hit of the season. Kassar and his cronies rubbed their hands in anticipation, imagining the numbers it would sell — and the number of VCSs it would also move as those who had been resistant so far finally got on the bandwagon.

Yet long before the big release day came, John Harris, Ken Williams’s star Atari 400 and 800 programmer at On-Line Systems, had already written a virtually pixel-perfect clone of the game after obsessively studying it in action at the local arcade. Ken took one look and knew he didn’t dare release it. Even leaving aside Atari’s aggressive attempts to expand the definition of software “piracy,” the Pac-Man character himself was trademarked. Releasing the game as-is risked lawsuits from multiple quarters, all much larger and richer in lawyers than On-Line Systems. The result could very well be the destruction of everything he had built. Yet, the game was just so damn good. After discussing the problem with others, Ken told Harris to go home and redo the game’s graphics to preserve the gameplay but change the theme and appearance. Harris ended up delivering a bizarre tribute to the seemingly antithetical joys of candy and good dental hygiene. Pac-Man became a set of chomping teeth; the dots Life Savers; the ghosts jawbreakers. Every time the player finished a level, an animated toothbrush came out to brush her avatar’s teeth. None of it made a lot of sense, but then the original Pac-Man made if anything even less. Ken put it out there. It actually became On-Line’s second Pac-Man clone; another one called Gobbler was already available for the Apple II.

Meanwhile Atari, just as they had promised in that advertisement, started coming down hard on Pac-Man cloners. They “persuaded” Brøderbund Software to pull Snoggle for the Apple II off the market. They “convinced” a tiny publisher called Stoneware not to even release theirs, despite having already invested money in packaging and advertising. And they started calling Ken.

The situation between On-Line and Atari was more complicated than the others. Jawbreaker ran on Atari’s own 400 and 800 computers rather than the Apple II. On the one hand, this made Atari even more eager to stamp it out of existence, because they themselves had belatedly begun releasing many of their bestselling VCS titles (a group sure to include Pac-Man) in versions for the 400 and 800. On the other hand, though, this represented an opportunity. You see, Harris had naively given away some copies of his game back when it was still an unadulterated Pac-Man. Some of these (shades of Richard Garriott’s experience with California Pacific) had made it all the way to Atari’s headquarters. Thus their goals were twofold: to stamp out Jawbreaker, but also if possible to buy this superb version of Pac-Man to release under their own imprint. Unfortunately, Harris didn’t want to sell it to them. He loved the Atari computers, but he hated the company, famous by this time for their lack of respect for the programmers and engineers who actually built their products. (This lack of respect was such that the entire visionary team that had made the 400 and 800 had left the company by the time the machines made it into stores.)

At the center of all this was Ken, the very picture of a torn man. He wasn’t the sort who accepts being pushed around, and Atari were trying to do just that, threatening him with all kinds of legal hellfire. Yet he also knew that, well, they kind of had a point; if someone did to one of his games what On-Line was doing to Pac-Man, he’d be mad as hell. Whatever the remnants of the hippie lifestyle that hung around On-Line along with the occasional telltale whiff of marijuana smoke, Ken didn’t so much dream of overthrowing the man as joining him, of building On-Line into a publisher to rival Atari. He wasn’t sure he could get there by peddling knockoffs of other people’s designs, no matter how polished they were.

Thanks largely to Ken’s ambivalence, the final outcome of all this was, as tends to happen in real life, somewhat anticlimactic. On-Line defied Atari long enough to get dragged into court for a deposition, at which Atari tried to convince the judge to grant a preliminary injunction forcing On-Line to pull Jawbreaker off the market pending a full trial. The judge applied the legal precedent of the ten-foot rule, and, surprisingly, decided that Jawbreaker looked different enough from Pac-Man to refuse Atari’s motion. You can judge for yourself: below is a screenshot of the original arcade Pac-Man paired with one of Jawbreaker.

Atari’s lawyers were reportedly stunned at the rejection, but still, Ken had no real stomach for this fight. He walked out of the courtroom far from triumphant: “If this opens the door to other programmers ripping off my software, what happened here was a bad thing.” Shortly after, he called Atari to see if they couldn’t work something out to keep Jawbreaker on the market but share the wealth.

Right on schedule, Atari’s own infamously slapdash implementation of Pac-Man appeared just in time for Christmas. It moved well over 7 million units to consumers who didn’t seem to care a bit that the ghosts flickered horribly and the colors were all wrong. The following year, On-Line and Harris developed a version of the now authorized Jawbreaker for the Atari VCS, publishing it through a company called Tigervision. It didn’t sell a fraction of what its inferior predecessor had sold, of course, but it did represent a change in the mentality of Ken and his company. Much of the fun and craziness continued, but they were also becoming a “real” company ready to play with the big boys like Atari — with all the good and bad that entails.

Similar changes were coming to the industry as a whole. Thanks to Atari’s legal muscling, blatant clones of popular arcade games dried up. The industry was now big enough to attract attention from outside its own ranks, with the result that intellectual property was starting to become a big deal. Around this time Edu-Ware got sued for its Space games that were a little bit too inspired by Game Designers’ Workshop’s Traveller tabletop RPG; they replaced them with a new series in the same spirit called Empire. Scott Adams got threatened with a lawsuit of his own over Mission Impossible Adventure, and in response changed the name to Secret Mission.

Indeed, 1981 was the year when the microcomputer industry as a whole went fully and irrevocably professional, as punctuated by soaring sales of VisiCalc and the momentous if belated arrival of IBM on the scene. That’s another story we really have to talk about, but later. Next time, we’ll see how the two broad styles of computer gaming met one another in a single game for the first time.

(My most useful sources in writing this post were an article by Al Tommervik in the January 1982 Softline and Steven Levy’s Hackers.)

 


Computers for the Masses

The company that would eventually become Commodore International was formed in 1958 as an importer and assembler of Czechoslovakian portable typewriters for Canada and the northeastern United States. Its founder was a Polish immigrant and Auschwitz survivor named Jack Tramiel. Commodore first made the news as a part of the Atlantic Acceptance scandal of 1965, in which one of Canada’s largest savings and loans suddenly and unexpectedly collapsed. When the corpse was dissected, a rotten core of financial malfeasance, much of it involving its client Commodore, was revealed. It seems that Tramiel had become friends with the head of Atlantic, one C.P. Morgan, and the two had set up some mutually beneficial financial arrangements that were not, alas, so good for Atlantic Acceptance as a whole. Additionally, it appears that Tramiel likely lied under oath and altered documents to try to obscure the trail. (The complicated details of all this are frankly beyond me; Zube dissects it all at greater length on his home page, for those with better financial minds than mine.) The Canadian courts were plainly convinced of Tramiel’s culpability in the whole sorry affair, but ultimately decided they didn’t have enough hard evidence to prosecute him. A financier named Irving Gould rescued Tramiel and his scandal-wracked company from a richly deserved oblivion. Commodore remained alive and Tramiel remained in day-to-day control, but thanks to his controlling investment Gould now had him by the short hairs.

Tramiel and Gould would spend almost two decades locked in an embrace of loathing codependency. Tramiel worked like a demon, seldom taking a day off, fueled more by pride and spite than greed. Working under his famous mantra “Business is War,” he seemed to delight in destroying not only the competition but also suppliers, retailers, and often even his own employees when they lost favor in his eyes. Gould was a more easygoing sort. He put the money Tramiel earned him to good use, maintaining three huge homes in three countries, a private yacht, a private jet, and lots of private girlfriends. His only other big passion was tax law, which he studied with great gusto in devising schemes to keep the tax liability of himself and his company as close to zero as possible. (His biggest coup in that department was his incorporation of Commodore in the Bahamas, even though they had no factories, no employees, and no product for sale there.) Some of his favorite days were those in which Tramiel would come to him needing him to release some capital from his private stash to help him actually, you know, run a proper business, with a growth strategy and research and development and all that sort of thing. Gould would toy with him a bit on those occasions, and sometimes even give him what he wanted. But usually not. Better for Tramiel to pay for it out of his operating budget; Gould needed his pocket money, after all.

Commodore’s business over the next decade changed its focus from the manufacturing of typewriters and mechanical adding machines to a new invention, the electronic calculator, with an occasional sideline in, of all things, office furniture. They also built up an impressive distribution network for their products around the world, particularly in Europe. Indeed, Europe, thanks to well-run semi-independent spinoffs in Britain and West Germany, became the company’s strongest market. Commodore remained a niche player in the U.S. calculator market, but in Europe they became almost a household name. Through it all Commodore’s U.S. operation, the branch that ultimately called the shots and developed the product line, retained an everpresent whiff of the disreputable. One could quickly sense that this company just wasn’t quite respectable, that in most decisions quick and dirty was likely to win out over responsible and ethical. Which is not, I need to carefully emphasize, to cast aspersions on the many fine engineers who worked for Commodore over the years, who often achieved heroic results in spite of management’s shortsightedness or, eventually, outright incompetence.

Tramiel and Commodore stumbled into a key role in both the PC revolution and the videogame revolution. In 1976 the company was, not for the first nor the last time, struggling mightily. Texas Instruments had virtually destroyed their calculator business by introducing machines priced cheaper than Commodore could possibly match. The reason: TI owned its own chip-fabrication plants rather than having to source its chips from other suppliers. It was a matter of vertical integration, as they say in the business world. Desperate for some vertical integration of his own, Tramiel bought a chip company, MOS Technologies. With MOS came a new microprocessor, one that had been causing quite a lot of excitement amongst homebrew microcomputer hackers like Steve Wozniak: the 6502. Commodore also ended up with the creator of the 6502, MOS’s erstwhile head of engineering Chuck Peddle. For his next trick, Peddle was keen to build a computer around his CPU. Tramiel wasn’t so sure about the idea, but reluctantly agreed to let Peddle have a shot. The Commodore PET became the first of the trinity of 1977 to be announced, but the last to actually ship. Tramiel, you see, was having cash-flow problems as usual, and Gould was as usual quite unforthcoming.

The PET wasn’t a bad little machine at all. It wasn’t quite as advanced in some areas as the Apple II, but it was also considerably cheaper. Still, it was hard to articulate just where it fit in the North American market. Hobbyists on a budget favored the TRS-80, easily available from Radio Shack stores all over the country, while those who wanted the very best settled on the more impressive Apple II. Business users, meanwhile, fixated early on the variety of CP/M machines from boutique manufacturers, and later, in the wake of VisiCalc, also started buying Apple IIs. The PET therefore became something of an also-ran in North America in spite of the stir of excitement its first announcement had generated.

Europe, however, was a different story. Neither Apple nor Radio Shack had any proper distribution network there in the beginning. The PET therefore became the first significant microcomputer in Europe. With effectively no competition, Commodore was free to hike its prices in Europe to Apple II levels and beyond. This meant that PETs were most commonly purchased by businesses and installed in offices. Only France, where Apple set up distribution quite early on, remained resistant, while West Germany became a particularly strong market, with the Commodore name accorded respect in business equivalent to what CP/M received in the U.S. And when a PET version of VisiCalc was introduced to Europe in 1980, it caused almost as big a sensation as the Apple II version had the year before in America. Within a year or two, Commodore stopped even seriously trying to sell PETs in North America, but rather shipped most of the output of their U.S. factory to Europe, where they could charge more and where the competition was virtually nonexistent.

In North America Commodore’s role in the early microcomputer and game-console industries was also huge, but mostly behind the scenes, and all centered around the Commodore Semiconductor Group — what had once been MOS Technologies. In an oft-repeated scenario that Dave Haynie has dubbed the “Commodore Curse,” most of the innovative engineers who had created the 6502 fled soon after the Commodore purchase, driven away by Tramiel’s instinct for degradation and his refusal to properly fund their research-and-development efforts. For this reason, MOS, poised at the top of the microcomputer industry for a time, would never even come close to developing a viable successor to the 6502. Nevertheless, Commodore inherited a very advanced chipmaking operation — one of the best in the country in fact. It would take some years for inertia and neglect to break down the house that Peddle and company had built. In the meantime, they delivered the 6502s and variants found not only in the PET but also in the Apple II, the Atari VCS, the Atari 400 and 800, and plenty of other more short-lived systems. They also built many or most of the cartridges on which Atari VCS games shipped. All of which put Commodore in the enviable position of making money every time many of their ostensible competitors built something. Thanks to MOS and Europe, Commodore went from near bankruptcy to multiple stock splits, while Tramiel himself was worth $50 million by 1980. That year he rewarded Peddle, the technical architect of virtually all of this success, with termination and a dubious lawsuit that managed to wrangle away the $3 million in Commodore stock he had earned.

Commodore’s transformation from a business-computer manufacturer and behind-the-scenes industry player to the king of home computing also began in 1980, when Tramiel visited London for a meeting. He saw there for the first time an odd little machine called the Sinclair ZX-80. Peddled by an eccentric English inventor named Clive Sinclair, the ZX-80 was something of a throwback to the earliest U.S.-made microcomputers. It was sold as a semi-assembled kit, and, with just 1 K of memory and a display system so primitive that the screen went blank every time you typed on the keyboard, pretty much the bare-minimum machine that could still meet some reasonable definition of “computer.” For British enthusiasts, however, it was revelatory. Previously the only microcomputers for sale in Britain had been the Commodore PET line and a few equally business-oriented competitors. These machines cost thousands of pounds, putting them well out of reach of most private individuals in a country where average personal income lagged considerably behind that of the U.S. The ZX-80, though, sold for just under £100. For a generation of would-be hackers who, like the ones who had birthed the microcomputer industry in the U.S. five years before, simply wanted to get their hands on a computer — any computer — it was a dream come true. Sinclair sold 50,000 ZX-80s before coming out with something more refined the next year.

We’ll talk more about Sinclair and his toys in later posts, but for now let’s focus on what the ZX-80 meant to Tramiel. He began to think about a similar low-cost computer for the U.S. consumer market — this idea of a “home computer” that had been frequently discussed but had yet to come to any sort of real fruition. To succeed in the U.S. mass market Commodore would obviously need to put together something more refined than the ZX-80. It would have to be a fully assembled computer that was friendly, easy to use, and that came equipped with all of the hardware needed to hook it right up to the family television. And it would need to be at least a little more capable than the Atari VCS in the games department (to please the kids) and to have BASIC built in (to please the parents, who imagined their children getting a hand up on their future by learning about computers and how to program them).

Luckily, Commodore already had most of the parts they needed just sort of lying around. All the way back in 1977 their own Al Charpentier had designed the Video Interface Chip (the VIC) for a potential game console or arcade machine. It could display 16-color graphics at resolutions of up to 176 X 184, and could also generate up to three simple sounds at one time. Commodore had peddled it around a bit, but it had ended up on the shelf. Now it was dusted off to become the heart of the new computer. Sure, it wasn’t a patch on the Atari 400 and 800’s capabilities, but it was good enough. Commodore joined it up with much of the PET architecture in its most cost-reduced form, including the BASIC they’d bought from Microsoft years before, added a cartridge port, and they had their home computer. After test marketing it in Japan as the VIC-1001, they brought it to North America as the VIC-20 in the spring of 1981, and soon after to Europe. (In the German-speaking countries it was called the VC-20 because of the unfortunate resemblance “VIC” had to the German verb “ficken” — to fuck.) In the U.S. the machine’s first list price was just under $300, in line with Tramiel’s new slogan: “Computers for the masses, not the classes.” Tramiel may have been about the last person in the world you’d expect to start advocating for the proletariat, but business sometimes makes strange bedfellows. Discounting construction kits and the like, the VIC-20 was easily the cheapest “real computer” yet sold in the U.S.

For the first time in the company’s history, Commodore created a major U.S. advertising campaign that was well-funded and smart, perhaps because it was largely the work of Kit Spencer, an import from Commodore’s much more PR-savvy British subsidiary. He hired as spokesman William Shatner, Captain Kirk himself. “Why buy just a videogame?” Shatner asked. “Invest in the wonder computer of the 1980s,” with “a real computer keyboard.” The messaging was masterful. The box copy announced that the VIC-20 was great for “household budgeting, personal improvement, student education, financial planning.” In reality, the VIC-20, with just 5 K of memory and an absurdly blocky 22-characters-per-line text display, was of limited (at best) utility for any of those things. But always Commodore snuck in a reference, seemingly as an afterthought, to the fact that the VIC-20 “plays great games too!” Commodore was effectively colluding with the kids they were really trying to reach, giving them lots of ways to convince Mom and Dad to buy them the cool new game machine they really wanted. Understanding that a good lineup of games was crucial to this strategy, they made sure that upon release a whole library of games, many of them unauthorized knockoffs of current arcade hits, was ready to go. For the more cerebral sorts, they also contracted with Scott Adams to make cartridge versions of his first five adventures available at launch.

Within a few months of the launch, Tramiel made a deal with K-Mart, one of the largest U.S. department-store chains of the time, to sell VIC-20s right from their shelves. This was an unprecedented move. Previously department stores had been the domain of the game consoles; the Atari VCS owed much of its early success to a distribution deal that Atari struck with Sears. Computers, meanwhile, were sold from specialized dealers whose trained employees could offer information, service, and support before and after the sale. Tramiel alienated and all but destroyed Commodore’s dealer network in the U.S., such as it was, by giving preferential treatment to retailers like K-Mart, even indulging in the dubiously legal practice of charging the latter lower prices per unit than he did the loyal dealers who had sometimes been with him for years. Caught up in his drive to make Commodore the home-computer company as well as his general everyday instinct to cause as much chaos and destruction as possible, Tramiel couldn’t have cared less when they complained and dropped their contracts in droves. Eventually this betrayal, like so many others, would come back to haunt Commodore. But for now they were suddenly riding higher than ever.

The VIC-20 resoundingly confirmed at last the mutterings about the potential for a low-cost home computer. It sold 1 million units in barely a year, the first computer of any type to do so. Apple, by comparison, had after five years of steadily building momentum managed to sell about 750,000 Apple IIs by that point, and Radio Shack’s numbers were similar. The VIC-20 would go on to sell 2.5 million units before crashing back to earth almost as quickly as it had ascended; Commodore officially discontinued it in January of 1985, by which time it was generally selling for well under $100. Attractive as its price was, it was ultimately just too limited a machine to have longer legs. Still, while the vast majority of VIC-20s were used almost exclusively for playing games (at least 98% of the software released for the machine was games), some who didn’t have access to a more advanced machine used it as their gateway to the wonders of computing. Most famously, Linus Torvalds, the Finnish creator of Linux, got his start exploring the innards of the VIC-20 installed in his bedroom. For European hackers like Torvalds, without as many options as the U.S. market afforded, the VIC-20 as well as the cheap Sinclair machines were godsends.

The immediate reaction to the VIC-20 from users of the Apple II and other more advanced machines was generally somewhere between a bemused shrug and a dismissive snort. With its minuscule memory and its software housed on cartridges or cassette tapes, the VIC-20 wasn’t capable of running most of the programs I’ve discussed on this blog, primitive as many of them have been. Even the Scott Adams games were possible only because they were housed on ROM cartridges rather than loaded into the VIC-20’s scant RAM. Games like Wizardry, Ultima, The Wizard and the Princess, or Zork — not to mention productivity game-changers like VisiCalc — were simply impossible here. The VIC-20’s software library, large and (briefly) profitable as it was, was built mostly of simple action games not all that far removed from the typical Atari VCS fare. Companies like On-Line Systems released a VIC-20 title here and there if someone stepped forward with something viable (why throw away easy money?), but mostly stayed with the machines that had brought them this far. To the extent that the VIC-20 was relevant to them at all, it was relevant as a stepping stone — or, if you will, a gateway drug to computing. Hopefully some of those VIC-20 buyers would get intrigued enough that they’d decide to buy a real system some day.

Yet in the long run the VIC-20 was only a proof of concept for the home computer. With the segment now shown to be viable and, indeed, firmly established, the next home computer to come from Commodore wouldn’t be so easy to ignore.

(By far the best, most unvarnished, and most complete history of Commodore is found in Brian Bagnall’s Commodore: A Company on the Edge and its predecessor On the Edge: The Spectacular Rise and Fall of Commodore. Both books are in desperate need of a copy editor, making them rather exhausting to read at times, and Bagnall’s insistence on slamming Apple and IBM constantly gets downright annoying. Still, the information and stories are there.

Michael Tomczyk’s much older The Home Computer Wars was previously the only real insider account of Commodore during this period, but it’s of dubious value at best in the wake of Bagnall’s books. Tomczyk inflates his own role in the creation and marketing of the VIC-20 enormously, and insists on painting Tramiel as a sort of social visionary. He’s amazed that Tramiel is willing to do business in Germany after spending time in Auschwitz, seeing this as a sign of the man’s essential nobility and forgiving nature. News flash: pragmatic businessmen seldom put principles, noble or otherwise, above the opportunity to make a buck.)


Of Game Consoles, Home Computers, and Personal Computers

When I first started writing the historical narrative that’s ended up consuming this blog, I should probably have stated clearly that I was writing about the history of computer games, not videogames or game consoles. The terms “computer game” and “videogame” have little or no separation today, but in the late 1970s and early 1980s the two were regarded as very distinct things. In Zap!, his history of Atari written just as that company was imploding in 1983, Scott Cohen takes the division as a given. He states, “Perhaps Atari’s most significant contribution is that it paved the way for the personal computer.” In predicting the future of the two categories, he is right about one and spectacularly wrong about the other. The PC, he says, will continue up a steadily inclining growth curve, becoming more and more an expected household fixture as the years go by. The game console, however, will be dismissed in future years as a “fad,” the early 1980s version of the Hula Hoop.

If we trace back far enough we can inevitably find some common origins, but the PC and game console were generally products of different folks with very different technical orientations and goals. Occasional collisions like Steve Jobs’s brief sojourn with Atari were more the exception than the rule. Certainly the scales of the two industries were completely out of proportion with one another. We’ve met plenty of folks on this blog who built businesses and careers and, yes, made lots of money from the first wave of PCs. Yet everything I’ve discussed is a drop in the bucket compared to the Atari-dominated videogame industry. A few figures should make this clear.

Apple, the star of the young PC industry, grew at an enviable rate in its early years. For example, sales more than doubled from 1979 to 1980, from 35,000 units to 78,000. Yet the Atari VCS console also doubled its sales over the same period: from 1 million in 1979 to 2 million in 1980. By the time the Apple II in 1983 crossed the magical threshold of 1 million total units sold, the VCS was knocking at the door of 20 million. Even the Intellivision, Mattel’s distant-second-place competitor to the VCS, sold 200,000 units in 1980 alone. In mid-1982, the height of the videogame craze, game consoles could already be found in an estimated 17% of U.S. households. Market penetration like that would be years in coming to the PC world.

In software the story is similar. In 1980, a PC publisher with a hit game might dream of moving 15,000 units. Atari at that time already had two cartridges, Space Invaders and Asteroids, that had sold over 1 million copies. Activision, an upstart VCS-game-maker formed by disgruntled Atari programmers, debuted in 1980 with sales of $67 million on its $25 game cartridges. By way of comparison, Apple managed sales of $200 million on its $1500 (or more) computer systems. The VCS version of Pac-Man, the big hit of 1981, sold over 2 million copies that year alone. Again, it would be a decade or more before PC publishers would begin to see numbers like that for their biggest titles.

So, we have two very different worlds here, that of the mass-market, inexpensive game consoles and that of the PC, the latter of which remained the province of the most affluent, technology-savvy consumers only. But then a new category began to emerge, to slot itself right in the middle of this divide: the “home computer.” The first company to dip a toe into these waters was Atari itself.

Steve Jobs during his brief association with Atari brought a proposal for what would become the Apple II to Atari’s then-head Nolan Bushnell. With Atari already heavily committed to both arcade machines and the project that would become the VCS, Bushnell declined. (Bushnell did, however, get Jobs a meeting with potential investor Don Valentine, who in turn connected him with Mike Markkula. Markkula became the third employee at Apple, put up most of the cash the company used to get started in earnest, and played a key role in early marketing efforts. Many regard him as the unsung hero of Apple’s unlikely rise.) Only later on, after the success of the Apple II and TRS-80 proved the PC a viable bet, did Atari begin to develop a full-fledged computer of its own.

The Atari 400 and 800, released in late 1979, were odd ducks in comparison to other microcomputers. The internals were largely the work of three brilliant engineers, Steven Mayer, Joe Decuir, and Jay Miner, all of whom had also worked on the Atari VCS. Their design was unprecedented. Although they had at their heart the same MOS 6502 found in the Atari VCS and the Apple II, the 400 and 800 were built around a set of semi-intelligent custom chips that relieved the CPU of many of its housekeeping burdens to increase its overall processing potential considerably. These chips also brought graphics capabilities that were nothing short of stunning. Up to 128 colors could be displayed at resolutions of up to 352 X 240 pixels, and the machines also included sprites, small graphics blocks that could be overlaid on the background and moved quickly about; think of the ghosts in Pac-Man for a classic example. By comparison, the Apple II’s hi-res mode, 280 X 160 pixels with 6 possible colors, no sprites, and the color-transition limitations that result in all that ugly color fringing, had represented the previous state of the art in PC graphics. In addition, the Atari machines featured four-voice sound-synthesis circuitry. Their competitors offered either no sound at all, or, as in the case of the Apple II, little more than beeps and squeaks. As an audiovisual experience, the new Atari line was almost revolutionary.

Still, the Apple II looked and was equipped (not to mention priced) like a machine of serious intent. The Ataris, by contrast, lacked the Apple’s flexible array of expansion slots as well as Steve Wozniak’s fast and reliable floppy-disk system. They shipped with just 8 K of memory. Their BASIC implementation, one of the few not sourced from Microsoft, was slow and generally kind of crummy. The low-end model, the 400, didn’t even have a proper keyboard, just an awkward membrane setup. And it wasn’t even all a story of missing features. When you inspected the machines more closely, you found something unexpected: a console-style port for game cartridges. The machines seemed like Frankensteins, stuck somewhere between the worlds of the game console and the PC. Enter the home computer — a full-fledged computer, but one plainly more interested in playing games and doing “fun” things than “serious” work. The Atari logo on the cases, of course, also contributed to the impression that, whatever else they were, these machines weren’t quite the same thing as, say, the Apple II.

Alas, Atari screwed the pooch with the 400 and 800 pretty badly. From the beginning the company priced them too high for their obvious market; the 800 was initially only slightly less expensive than the Apple II. And, caught up like the rest of the country in VCS fever, Atari put little effort into promotion. Many in management hardly seemed aware that the machines existed at all. In spite of this, their capabilities combined with the Atari name were enough to make them modest sales successes. They also attracted considerable software support. On-Line Systems, for instance, made them their second focus of software development, behind only the Apple II, during their first year or two in business. Still, the Ataris never quite lived up to their hardware’s potential, never became the mass-market success they might (should?) have been.

The next company to make a feint toward the emerging idea of a home computer was Radio Shack, who released the TRS-80 Color Computer in 1980. (By the end of that year Radio Shack had four separate machines on the market under the TRS-80 moniker, all semi- or completely incompatible with one another. I haven’t a clue why no one could come up with another name.) Like so much else from Radio Shack, the CoCo (as the Color Computer came to be known) didn’t seem to know quite what it wanted to be. Radio Shack did get the price about right for a home computer: $400. And they provided a cartridge port for instant access to games. Problem was, those games couldn’t be all that great, because the video hardware, while it did indeed allow color, wasn’t a patch on the Atari machines. Rather than spend money on such niceties, Tandy built the machine around a Motorola 6809, one of the most advanced 8-bit CPUs ever created. That attracted a small but devoted base of hardcore hackers who did things like install OS-9, the first microcomputer operating system capable of multitasking. Meanwhile the kids and families the machine was presumably meant to attract shrugged their shoulders at the unimpressive graphics and went back to their Atari VCSs. Another missed opportunity.

The company that finally hit the jackpot in the heretofore semi-mythical home-computer market was also the maker of the member of the trinity of 1977 that I’ve talked about the least: Commodore, creator of the PET. I’ll try to make up for some of that inattention next time.
